March 6, 2008 at 7:24 am
I have a data flow task, and I am using BEGIN TRAN and COMMIT/ROLLBACK (via Execute SQL tasks) before and after this task respectively.
I have made the following changes:
1) The TransactionOption of the SQL tasks is set to Supported.
2) The RetainSameConnection property of the destination connection manager is set to true.
In the data flow task, there is first an insertion into a table, and then a select query is run on that table.
Now, what's happening is that because the transaction has not been committed, the select query cannot read data from the table.
If I change the RetainSameConnection property of the destination connection manager to false, my package runs successfully.
I have tried setting the IsolationLevel property of the data flow task to ReadUncommitted, but that does not work either.
Can anyone suggest a solution here?
March 6, 2008 at 1:29 pm
If you're starting the transaction using a SQL task, try:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
BEGIN TRANSACTION
Peter
March 7, 2008 at 12:05 am
I tried doing this, but it doesn't work.
I was just wondering: does it have anything to do with the MSDTC service?
Will enabling or disabling it affect my solution?
March 7, 2008 at 1:41 am
Confusing stuff! I don't think the isolation level really matters, because a transaction can read its own data. Maybe you should set RetainSameConnection on the source connection as well. I did the following test:
CREATE TABLE Test1(Name VARCHAR(50))
CREATE TABLE Test2(Name VARCHAR(50))
Then I created a control flow with 4 consecutive SQL tasks, all with the same connection manager.
SQL Task 1
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
BEGIN TRANSACTION
SQL Task 2
INSERT INTO Test1(Name) VALUES ('John')
SQL Task 3
INSERT INTO Test2(Name) SELECT Name FROM Test1
SQL Task 4
COMMIT TRANSACTION
If I set RetainSameConnection to true, both tables hold the value 'John'. However, if I set RetainSameConnection to false, the final COMMIT TRANSACTION fails with the error "The COMMIT TRANSACTION request has no corresponding BEGIN TRANSACTION." Yet both tables still contain the value 'John'. Use the Profiler and watch the SPID and the Audit Login/Logout events to see what's going on.
March 7, 2008 at 5:05 am
Hi,
I think what's happened there is that in task 1 you've opened a transaction and then closed the connection (I think this effectively rolls back the transaction). In task 2 you've inserted data and committed it (an implicit transaction), then closed the connection; the same with task 3. Finally, you've tried to commit the transaction, but because the initial transaction didn't remain open, you get that error message. I wonder what would happen if you put an insert into task 1 as well. Presumably your table would only have one record in it, as the first statement would be rolled back on losing the connection.
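A minimal T-SQL sketch of that reasoning, using the Test1 table from the post above and treating each SQL task as its own connection (an assumption for illustration; this is what RetainSameConnection = false amounts to):

```sql
-- Connection 1 (SQL Task 1): open a transaction, then close the connection.
BEGIN TRANSACTION;
-- ...the connection closes here without a COMMIT; SQL Server rolls the
-- transaction back, so nothing done on this connection persists.

-- Connection 2 (SQL Task 2): a brand-new connection, no open transaction.
INSERT INTO Test1(Name) VALUES ('John');  -- autocommitted immediately

-- Connection 4 (SQL Task 4): again a new connection.
SELECT @@TRANCOUNT;   -- 0: this connection has no open transaction
COMMIT TRANSACTION;   -- error: no corresponding BEGIN TRANSACTION
```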
From a more general stance, though, I don't understand why you'd want to do a dirty read in the first place. Surely if you're inserting data and then selecting it, the best way to do it is through the pipeline: select the data for the initial insert first, then branch off the pipeline, using these rows to join onto whatever tables are needed, and output to the required destination.
Kindest Regards,
Frank Bazan
March 10, 2008 at 7:07 am
I think I have found the source of the problem.
I am actually using a script component to run the select query, and it opens and closes its own connection.
I will now try to run the insert and the select over the same connection in the script.
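For what it's worth, the blocking that would come from that second connection can be reproduced in plain T-SQL with two connections (a sketch, assuming the default READ COMMITTED isolation level and a table like the Test1 example above):

```sql
-- Connection A (the data flow's retained connection):
BEGIN TRANSACTION;
INSERT INTO Test1(Name) VALUES ('John');  -- row locked, not yet committed

-- Connection B (the script component's own, separate connection):
SELECT Name FROM Test1;  -- blocks, waiting on connection A's locks,
                         -- until A commits or rolls back
```

Running both statements on the same connection, as suggested, avoids the wait because a transaction can always see its own uncommitted changes.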