May 14, 2020 at 11:08 am
All,
Firstly, this is just a self-teaching exercise, so there is no urgency. Secondly, I suspect this is documented online and I've failed to find it. I am happy to be pointed towards links to read if that's the case.
Package 1 is running two instances of package 2 in parallel. Package 2 has the following three components (in processing order):
A script component reading an XML file
A lookup to see if the record already exists
A SQL DB destination inserting any new records
Package 2's transaction settings are 'isolation level: read committed' and 'transaction support: required'. I've also turned off caching on the lookup component.
As soon as the instances of package 2 start, I get a locking issue:
Session 67 is waiting on LCK_M_IS with blocking_session_id = -2
Session 68 is waiting on LCK_M_IS with blocking_session_id = 67
My questions are:
1a) Is my understanding correct that blocking_session_id = -2 indicates an orphaned transaction?
1b) Nothing else is happening in the database, so what is -2 representing? A system process?
2a) Is LCK_M_IS an intent shared lock?
2b) If 2a is correct, shouldn't the two processes be able to hold intent shared locks at the same time?
3) Why is SQL Server's deadlock monitor not intervening? I've left the package running for at least thirty minutes and neither process gets killed.
4) Why does the lock occur as soon as the package instances start? I thought it would work as follows:
Script component reads the XML file
Take an exclusive lock on the table
Lookup to see if the record already exists
SQL DB insert
Release the lock
With the caching switched off, each package only needs to lock the table for the lookup and the insert, and then it can release it? (I've sketched what I mean in T-SQL below.)
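In T-SQL terms, here is roughly the per-record sequence I expected each package instance to issue (just a sketch of my thinking; the column names are invented):

DECLARE @RoadId int = 1, @Name nvarchar(100) = N'Example Road';

BEGIN TRANSACTION; -- read committed; I assumed the table lock is taken here

-- Lookup: does the record already exist?
IF NOT EXISTS (SELECT 1 FROM dbo.Road WHERE RoadId = @RoadId)
BEGIN
    -- Insert only if it is a new record
    INSERT INTO dbo.Road (RoadId, Name) VALUES (@RoadId, @Name);
END

COMMIT TRANSACTION; -- and released here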
Thanks
May 14, 2020 at 1:27 pm
Is the SQL DB destination table the same for both packages?
Is the lock being applied on that table, or on other objects?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
May 14, 2020 at 2:22 pm
Thank you for your help.
Sorry for not being clear. The destination table is the same for both packages. It's the same package being used twice in parallel. Each is processing a different folder.
The lock is being applied on that table.
The query I'm using is as follows:
SELECT
    tl.request_mode, er.session_id, tl.resource_type, tl.request_type,
    tl.request_status,                       -- GRANT or WAIT
    st.text, er.percent_complete, er.wait_type, er.command,
    er.blocking_session_id, er.wait_resource
FROM sys.dm_exec_requests AS er
CROSS APPLY sys.dm_exec_sql_text(er.sql_handle) AS st  -- statement text for each request
INNER JOIN sys.dm_tran_locks AS tl
    ON er.session_id = tl.request_session_id;
The results are:
request_mode | session_id | resource_type | request_type | request_status | text | percent_complete | wait_type | command | blocking_session_id | wait_resource
S  | 59 | DATABASE | LOCK | GRANT | select * from [dbo].[Road] | 0 | LCK_M_IS | SELECT | 68 | OBJECT: 12:1314103722:0
IS | 59 | OBJECT   | LOCK | WAIT  | select * from [dbo].[Road] | 0 | LCK_M_IS | SELECT | 68 | OBJECT: 12:1314103722:0
S  | 68 | DATABASE | LOCK | GRANT | select * from [dbo].[Road] | 0 | LCK_M_IS | SELECT | -2 | OBJECT: 12:1314103722:0
IS | 68 | OBJECT   | LOCK | WAIT  | select * from [dbo].[Road] | 0 | LCK_M_IS | SELECT | -2 | OBJECT: 12:1314103722:0
The object ID result is as follows:
SELECT OBJECT_NAME(1314103722, 12)
Result: Road
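For completeness, the wait_resource format for an OBJECT lock is database_id:object_id:lock_partition, so the database ID can be resolved the same way:

SELECT DB_NAME(12); -- resolves database_id 12 from the wait_resource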
May 14, 2020 at 3:15 pm
SPID -2 is an orphaned DTC transaction:
https://www.sqlskills.com/blogs/paul/disaster-recovery-101-dealing-with-negative-spids-2-and-3/
Another good read on SPID -2:
http://www.eraofdata.com/sql-server/troubleshooting-sql-server/orphaned-msdtc-transactions-2-spids/
So something is starting a distributed transaction, and SQL Server is uncertain about the state of that distributed transaction.
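If it is an orphaned DTC transaction, you can usually find its unit-of-work GUID and kill it manually. Something like this should do it (just a sketch; the GUID in the KILL is a placeholder for whatever the first query returns):

-- Find the unit-of-work GUID for locks held by the orphaned session (-2)
SELECT DISTINCT request_owner_guid
FROM sys.dm_tran_locks
WHERE request_session_id = -2;

-- Then kill it by UOW GUID (uncomment and substitute the GUID from above):
-- KILL 'D5499C66-E398-45CA-BF7F-DC9F62B1A2FD';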
Also, how long do you expect these parallel packages to run for? If you run them in series, do they complete without error in a reasonable time (i.e. under the 30 minutes you waited before cancelling)? If you are pulling TONS of data from the XML and comparing it to TONS of data in the table, it may just take time.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
May 14, 2020 at 4:12 pm
Thank you for your help and the links.
I just tried them in series and the first instance gets blocked by spid -2.
If I put them in parallel and disable one of them, it completes OK and takes about 5 minutes per file. I can also run a select on the table and see the number of records increasing.
Strange.
I'll do some more investigation and see if I can work out what's causing the orphaned DTC transaction.
May 14, 2020 at 4:48 pm
That is weird that running them in series causes the block as well...
even stranger that having package 1 call one copy of package 2 makes things run successfully.
May 14, 2020 at 6:07 pm
I think I may have partly found the issue here: https://support.microsoft.com/en-gb/help/2253391/the-execution-of-a-sql-server-integration-services-ssis-package-stop. I'm running SSIS 2017, but I suspect the issue still applies.
There is a more detailed explanation of ValidateExternalMetadata here: https://thesqlgirl.com/2016/09/17/ssis-package-validation/
If I set ValidateExternalMetadata to false, package 2 will start and process some of the files before failing with 'The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D00A "Unable to enlist in the transaction"'.
I'm still working on this new error. I'm not sure whether it's related to the same issue or not; I think it might just be a timeout issue. If I work it out, I'll post an update.
May 14, 2020 at 6:58 pm
Is there a reason you changed the transaction support value to 'required'? This is not the default value, and it forces SSIS to create a distributed transaction, which is probably the root of your problem.
Essentially - you are asking SSIS to start a transaction in the master package and maintain that transaction.
What happens if you set that back to the default value of 'supported' and don't start a transaction in the master package?
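If you want to confirm whether anything is still opening a distributed transaction while the package runs, something like this might help (just a sketch, not tested against your setup):

-- transaction_type = 4 means a distributed (DTC) transaction
SELECT transaction_id, name, transaction_begin_time, transaction_uow
FROM sys.dm_tran_active_transactions
WHERE transaction_type = 4;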
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
May 15, 2020 at 9:38 am
Thank you for your help. I mistakenly understood it to mean that it would create separate transactions in each sub-package. I set it to see if it would solve the original issue.
I've set it back to 'supported' and I still get the new error. I'll keep testing.
Am I fundamentally approaching this the wrong way? Is it best practice with SSIS to use a separate staging table for each parallel import and then merge them at the end?
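By 'a separate staging table for each parallel import' I mean something like this (a rough sketch; the staging table names and columns are invented):

-- Each parallel import writes to its own staging table, then a final
-- single-threaded step merges the new rows into the real table.
INSERT INTO dbo.Road (RoadId, Name)
SELECT s.RoadId, s.Name
FROM dbo.Road_Staging1 AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Road AS r WHERE r.RoadId = s.RoadId);

INSERT INTO dbo.Road (RoadId, Name)
SELECT s.RoadId, s.Name
FROM dbo.Road_Staging2 AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Road AS r WHERE r.RoadId = s.RoadId);

-- Empty the staging tables ready for the next run
TRUNCATE TABLE dbo.Road_Staging1;
TRUNCATE TABLE dbo.Road_Staging2;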
May 15, 2020 at 4:03 pm
Did you change this in both the master and child package? Or just the child package? If only the child package - the master package is still creating a transaction.
Not sure about the approach... I probably would not set it up this way; instead, I'd have a separate package for each group of files. I have found that files sent from different vendors/groups are different enough that having a package (or set of packages) for each group is much easier to set up and manage than a single generic package for all groups.
If the file format is the same across multiple groups, and I can distinguish between the groups based on either the file name or data in the files, then I just use a single package. It really depends on what is different about the files and how we want to process them...
One thing I have found is that using Project Deployment to the Integration Services Catalog has made it much easier to create new projects from existing projects. By doing this - I can then easily make project parameter changes and simple file changes in the data flow(s) and have the project ready in just a couple of hours (at most...often even less time).
The downside to this is if I find a 'bug' in the packages, especially in any script tasks. If that happens, I then have to fix it in every project and every package... but that doesn't happen often and shouldn't happen once the script has been fully tested.
May 16, 2020 at 8:57 pm
Thank you for your help.
I will check the transaction settings when I go back to this issue next week. I'll post any results that might help anyone else who hits the same issue.
In this case, the reasons for running two sub-packages in parallel are speed and an opportunity to learn more about SSIS. The main package splits the files into two folders, then runs two instances of the same sub-package, passing a parameter for which folder to process.
May 17, 2020 at 2:29 pm
Yeah, sounds like the transaction settings are off somewhere.
If this is more of a learning exercise and you have the option to run the package in the IDE, consider this option...
Create a new package containing... I don't know... three or four data flow tasks that can run in parallel. Have each task read from a different file (hard-code the file reference). Leave all settings at their defaults. Run the package.
Granted, this package does not support the parent-child design pattern you're after (and it isn't good for typical production use), but experimentally speaking, it might shed some light on the possible issue with the transaction settings.
When running the package, if blocking still occurs, it might be easier to research and understand what is going on. From there, you can apply what you learn to the parent-child design.
If the blocking does not occur, then try to determine why.
But yeah, when I come across problems like this, I find that building less ideal alternative solutions can help in finding the real one.
May 19, 2020 at 11:03 am
All,
Thank you for your help.
It was the transaction settings in the master package. I think I accidentally changed it when I was experimenting with the isolation level settings.
Thanks