December 14, 2012 at 1:37 am
Friends,
My requirement is to import data from a remote server; if an error occurs, capture the row numbers and roll back, otherwise commit.
I am creating a test package to check this.
Here, I need to import the Employee table from server S1 to S2. The difference is that the Employee table on S2 won't allow NULL values in the EmpName column (there are other constraints as well). So I first move S1.Employee to S2.EmployeeStaging, which has the same structure as S1.Employee.
I created an Execute SQL Task to begin the transaction.
Next I have the DFT that moves rows from the staging table to the Employee table and writes the errors for all failing rows to a flat file, and then I come out of the DFT.
I have a Boolean package variable, bFlag (its value is set in a script task inside the DFT), to identify whether I need to commit or roll back the transaction.
Then I have commit and rollback Execute SQL Tasks.
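For reference, the statements in the three Execute SQL Tasks would be something along these lines (a minimal sketch of the pattern described above; the precedence constraint expressions on the commit/rollback paths are my assumption of how bFlag is wired up, and all three tasks plus the DFT destination must share one connection manager):

```sql
-- Execute SQL Task before the DFT: open the transaction
BEGIN TRANSACTION;

-- "Commit" Execute SQL Task (precedence constraint expression: @[User::bFlag] == true)
COMMIT TRANSACTION;

-- "Rollback" Execute SQL Task (precedence constraint expression: @[User::bFlag] == false)
ROLLBACK TRANSACTION;
```

Note that this only works if every task runs on the very same connection; with connection pooling each task otherwise gets its own session, and the BEGIN TRANSACTION never reaches the later statements.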
When I run the package and there is an error, it should roll back, but it doesn't.
Are there any restrictions on rolling back DFT transactions using an Execute SQL Task?
Because if I test with just Execute SQL Tasks (begin tran -> insert a record -> rollback/commit), it works fine.
As you would have guessed by now, I am an SSIS beginner. Any help/alternatives would be much appreciated, since I have read almost all the articles available online and none of them worked.
Thanks.
December 14, 2012 at 2:05 am
Did you set the RetainSameConnection property to True on the connection manager?
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
December 14, 2012 at 2:18 am
Yes.
December 14, 2012 at 3:15 am
The two Execute SQL Tasks and the destination in the data flow use the same connection manager, right?
December 14, 2012 at 3:36 am
Yes, right.
December 14, 2012 at 5:16 am
So you have a transaction around the data flow, data is transferred inside the data flow, the data flow errors out, and the data is still inserted?
That's very strange, also because in most cases the data flow will roll back the data itself (even without a transaction defined), unless you change the default properties.
What destination component did you use?
December 17, 2012 at 2:55 am
Here, I need to import the Employee table from server S1 to S2. The difference is that the Employee table on S2 won't allow NULL values in the EmpName column (there are other constraints as well). So I first move S1.Employee to S2.EmployeeStaging, which has the same structure as S1.Employee.
I created an Execute SQL Task to begin the transaction.
Next I have the DFT that moves rows from the staging table to the Employee table and writes the errors for all failing rows to a flat file, and then I come out of the DFT.
If I understand correctly:
Task 1: move rows from the S1.Employee table to the S2.EmployeeStaging table
Task 2: move rows from the S2.EmployeeStaging table to the S2.Employee table
The DFT is used for Task 2, but what is used for Task 1?
Also, can you attach images of the control flow and data flow of your test package?
BI Developer
SSRS, SSIS, SSAS, IBM Cognos, IBM Infosphere Cubing services, Crystal reports, IBM DB2, SQL Server, T-SQL
Please visit... ApplyBI
December 17, 2012 at 6:15 am
You can ignore Step 1 without affecting anything.
In short, I need to trap errors at line numbers x, y, z and, if an error happens, roll back the changes, else commit.
The only difference from the default SSIS settings is that I need to capture the errors, keep the transaction running so I can list out all the errors, and then roll back.
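The intended behaviour — keep loading inside one open transaction, collect every error, then roll everything back — can be sketched in plain T-SQL as well. This is only an illustration of the pattern, not the SSIS package itself; the column names (EmpID, EmpName) and the @Errors table variable are assumptions based on the thread:

```sql
BEGIN TRANSACTION;

-- Hypothetical holder for the error list we want to report before rolling back
DECLARE @Errors TABLE (EmpID INT, Reason NVARCHAR(200));

-- Collect the rows that would violate the NOT NULL constraint on EmpName
INSERT INTO @Errors (EmpID, Reason)
SELECT EmpID, N'EmpName is NULL'
FROM dbo.EmployeeStaging
WHERE EmpName IS NULL;

-- Load the valid rows while the transaction is still open
INSERT INTO dbo.Employee (EmpID, EmpName)
SELECT EmpID, EmpName
FROM dbo.EmployeeStaging
WHERE EmpName IS NOT NULL;

-- List all errors, then decide: roll back everything on any error, else commit
IF EXISTS (SELECT 1 FROM @Errors)
BEGIN
    SELECT EmpID, Reason FROM @Errors;
    ROLLBACK TRANSACTION;
END
ELSE
    COMMIT TRANSACTION;
```

The table variable survives the rollback, which is what lets you report the full error list even though the load itself is undone.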