June 4, 2012 at 6:46 am
Hi!
I have a package with two connection managers pointing to different databases (DB1 and DB2), and multiple tasks running across these databases.
Now I want all of these tasks to run under one transaction. For example:
Task1 => runs at DB1 => success
Task2 => runs at DB2 => fails
then it should roll back all the changes.
Please suggest.
June 4, 2012 at 7:19 am
Put both tasks in one Sequence Container and set its TransactionOption to Required.
Make sure MSDTC (Microsoft Distributed Transaction Coordinator) is enabled and configured.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 4, 2012 at 7:44 am
Thanks Koen for such a quick reply.
I tried this (TransactionOption of the Sequence Container => Required), but the package got stuck with wait type LCK_M_SCH_S.
June 4, 2012 at 7:46 am
* Right-click on the control flow page ==> Properties
set TransactionOption = "Required"
* Right-click on the first task ==> Properties
set TransactionOption = "Supported"
* Right-click on the second task ==> Properties
set TransactionOption = "Supported"
Save all & run your package.
June 4, 2012 at 7:51 am
Already tried that, but no success.
Here are the settings with which I am trying:
Control Flow: IsolationLevel => Serializable, TransactionOption => Required
Sequence Container: IsolationLevel => Serializable, TransactionOption => Required
All the tasks under the Sequence Container: IsolationLevel => Serializable, TransactionOption => Supported
June 4, 2012 at 7:59 am
Check whether the Distributed Transaction Coordinator service is enabled. If it is not started,
go to Run ==> services.msc and start the service.
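A quick sanity check from T-SQL (just a sketch; it only proves that MSDTC on that instance can start a distributed transaction, not that network access between both servers is configured):

-- Minimal MSDTC smoke test: if MSDTC is not running or cannot enlist,
-- this fails immediately instead of the package hanging later.
BEGIN DISTRIBUTED TRANSACTION;
SELECT 1 AS dtc_test;   -- trivial work inside the distributed transaction
COMMIT TRANSACTION;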
June 4, 2012 at 8:01 am
DTC is enabled.
June 4, 2012 at 11:32 pm
Can I use BEGIN DISTRIBUTED TRAN inside "Execute SQL Task"?
June 4, 2012 at 11:34 pm
Tarun Jaggi (6/4/2012)
Can I use BEGIN DISTRIBUTED TRAN inside "Execute SQL Task"?
If you're working with linked servers and every T-SQL statement is in one Execute SQL Task, then yes, you can.
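Something along these lines inside a single Execute SQL Task (just a sketch; LinkedServerB and the table names are placeholders, not your actual objects):

-- Hypothetical single Execute SQL Task body: one distributed transaction
-- spanning the local database and a table reached through a linked server.
SET XACT_ABORT ON;                       -- any error rolls the whole thing back

BEGIN DISTRIBUTED TRANSACTION;

    UPDATE dbo.LocalTable
    SET    SomeColumn = 1;               -- work against DB1 (local)

    UPDATE LinkedServerB.DB2.dbo.RemoteTable
    SET    SomeColumn = 1;               -- work against DB2 via the linked server

COMMIT TRANSACTION;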
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 4, 2012 at 11:43 pm
I have multiple Execute SQL Tasks (running on different databases) and Data Flows.
June 4, 2012 at 11:48 pm
Tarun Jaggi (6/4/2012)
I have multiple Execute SQL Tasks (running on different databases) and Data Flows
You'll need to use the SSIS transactions.
Is MSDTC running on every node and is it properly configured?
Regarding the lock, check whether there is a deadlock somewhere.
Are you using Fast Load with tablock in the dataflow?
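Something like this, run while the package is hanging, will show what it is waiting on and who is blocking it (just a rough sketch using the standard DMVs):

-- Show waiting sessions, their wait types and the blocking session,
-- so a schema lock (LCK_M_SCH_S / LCK_M_SCH_M) held by the transaction is visible.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.command,
       t.text AS running_sql
FROM   sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE  r.session_id <> @@SPID
  AND  r.wait_type IS NOT NULL;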
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
June 5, 2012 at 12:26 am
Koen Verbeeck (6/4/2012)
You'll need to use the SSIS transactions.
Is MSDTC running on every node and is it properly configured?
Regarding the lock, check if you don't have a deadlock somewhere.
Are you using Fast Load with tablock in the dataflow?
You may be interested in the following information. These are my notes from personal experience with MSDTC.
I apologise for the format 😉
Let me know if you find the information useful.
1. Data destinations by default verify themselves against the database tables by acquiring a schema lock, which isn't released until the transaction is ended. This can cause problems with the simplest of packages when run in transactions with other packages. You can, however, prevent it by setting the ValidateExternalMetadata property to false. If the package stalls on validation the first time it is executed within a group, this may be the cause.
2. There is a bug in SQL Server that prevents committing a buffer of more than about 8,000 rows when the package is in a transaction, the destination table is empty, the database uses simple logging, and bulk inserts (fast load) are used. The issue can be avoided by inserting a single row prior to the DFT (see the sketch after these notes) or by applying a patch (comes in CU5?).
3. It is possible to achieve a situation where a package will inexplicably fail with a cryptic message about transactions. This was found to occur in 2 fact packages but may possibly occur in dimensions. The problem arises in the update architecture (as opposed to truncate-reload). The SQL statement that SELECTS the pre-existing data from the fact table does not complete before the bulk insert (fast load) statement attempts to execute. In this situation it appears the transaction can deadlock or be aborted (causing the error), depending on configuration. This has now also been seen in aggregation packages. To avoid this issue you can put in place a blocking transformation either just before the insert or just after the read. Alternatively you can turn off the fast load. Depending on the size of your dataset, one or the other may be a better option. Test and see. I suspect the blocking Sort transformation will be faster in most if not all cases for facts; dimensions, which are generally smaller, may be faster without the overhead of fast load.
4. Recent addition: you may find that, similar to note 1, the package hangs in the validation phase the second time it is executed within a group (it completed successfully the first time). This has occurred on some dimensions where the dimension source component before the Merge Join could not validate external metadata the second time around. Set ValidateExternalMetadata to false to resolve this.
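Regarding note 2, the seed-row workaround is just a statement like this in an Execute SQL Task before the data flow (table and column names are made up for the example; the point is only that the destination is no longer empty when the fast load starts):

-- Rough illustration of the seed-row workaround from note 2.
IF NOT EXISTS (SELECT 1 FROM dbo.FactSales)
    INSERT INTO dbo.FactSales (FactKey, Amount)
    VALUES (-1, 0);      -- placeholder row, removed again after the load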
June 5, 2012 at 5:57 am
The transaction started working after changing the TransactionOption to Required and setting ValidateExternalMetadata to FALSE for all components in the Data Flow.
However, I am getting the following error when I try to execute more than one "Execute SQL Task" in parallel:
[Connection manager "SourceConnection"] Error: The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D00A "Unable to enlist in the transaction.".
MSDTC is enabled and working, and it works for a single "Execute SQL Task". But my requirement is to run more than one "Execute SQL Task" in parallel.
Please help!!
June 5, 2012 at 5:27 pm
EDIT: Never mind, posted in haste. RetainSameConnection is relevant to manual transactions.
Does it work when all your Execute SQL Tasks are against the same server, and only fail when you include one against your other server? (Assuming your DBs are on separate servers.)
June 5, 2012 at 6:02 pm
This link may be helpful:
http://richardlees.blogspot.com.au/2010/01/transaction-manager-is-not-available.html