September 5, 2014 at 2:31 am
Hi All,
I am copying data from a source that has 500 rows.
The error occurs at row 98, and I have set MaxInsertCommitSize to 100 in the destination's fast load options.
When I run the package it fails, but 97 rows still get copied, which I don't think should happen.
It should have rolled back the entire batch, since the commit size was 100.
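To show what I expected, here is a rough Python sketch of my mental model (just an illustration; rows_persisted is a hypothetical helper of mine, not anything SSIS actually exposes):

# Rough sketch of how I expect MaxInsertCommitSize to behave:
# rows are committed in full batches of commit_size, and a failure
# rolls back only the current, uncommitted batch.
def rows_persisted(commit_size, fail_at_row):
    full_batches = (fail_at_row - 1) // commit_size  # batches committed before the error
    return full_batches * commit_size

# My scenario: commit size 100, error at row 98 of 500.
print(rows_persisted(100, 98))  # prints 0 -- yet 97 rows were kept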
Help please!!!
September 5, 2014 at 2:51 am
What is the batch size and what is the buffer size of the data flow?
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
September 5, 2014 at 5:11 am
The buffer size is the default, i.e. 10485760 bytes (10 MB).
And the batch size, I assume, is the MaxInsertCommitSize, viz. 100.
September 5, 2014 at 5:16 am
er.mayankshukla (9/5/2014)
The buffer size is the default, i.e. 10485760 bytes (10 MB). And the batch size, I assume, is the MaxInsertCommitSize, viz. 100.
There is also a commit at the end of a buffer, if I'm not mistaken.
Batch Sizes, Fast Load, Commit Size And The OLE DB Destination
But with only 97 rows committed, your rows would need to be pretty large for a single buffer to hold fewer than 97 of them.
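A quick back-of-the-envelope check in Python (a hypothetical calculation; the 10485760-byte default and the 97-row figure come from this thread, the rest is plain arithmetic):

# How wide would rows have to be for a default 10 MB buffer
# to hold only 97 of them?
default_buffer_size = 10485760   # bytes, the DefaultBufferSize quoted above
rows_in_buffer = 97
row_width_kb = default_buffer_size / rows_in_buffer / 1024.0
print("%.0f KB per row" % row_width_kb)   # roughly 106 KB per row

That is why I doubt the buffer boundary alone explains the 97 committed rows, unless your rows really are that wide.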
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
September 5, 2014 at 8:26 am
Well, this turns out to be a bug in BIDS 2008.
I have tested the same package in Data Tools 2010 and it works fine.
I am using 2008 R2; I think I am missing a patch for this.