October 12, 2004 at 10:19 am
Hi,
I've written about 10 DTS Packages and I have a few questions.
DTS Fetch Buffer Size and Insert Batch Size ... what do they mean to me as far as commits and performance?
DTS Hung Jobs ... I have jobs that sit forever in Start mode. Is there a setting in the DTS package that would make it slow to start extracting data (filling the first buffer, I guess)?
Thanks, Anna
October 12, 2004 at 10:36 am
Not sure about the buffer size; I haven't worked with it. The batch size, however, can prevent the logs from filling. It tells the package how many rows to insert before committing the transaction. If it errors out after 1100 rows and you have a batch size of 1000, the first 1000 rows stay inserted and the last 100 are rolled back.
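A rough sketch of that commit behaviour (this is just an illustration of the batching logic, not a DTS API; the function and parameter names are made up):

```python
def load_rows(rows, batch_size, fail_at=None):
    """Insert rows committing every batch_size rows; an error rolls
    back only the current uncommitted batch, keeping earlier commits."""
    committed = 0  # rows safely committed so far
    batch = 0      # rows in the current, not-yet-committed batch
    for i, _ in enumerate(rows, start=1):
        if fail_at is not None and i == fail_at:
            # error mid-batch: the current batch is rolled back
            return committed
        batch += 1
        if batch == batch_size:
            committed += batch  # commit the full batch
            batch = 0
    committed += batch          # commit any final partial batch
    return committed

# 1100 rows, batch size 1000, error while on row 1100:
# the first 1000 rows are committed, the trailing 100 roll back.
print(load_rows(range(1100), 1000, fail_at=1100))  # 1000
```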
Not sure about the starting issue. What does the package do?
October 13, 2004 at 1:49 am
Fetch Buffer Size sets the number of rows of data being fetched from the source.
If you're continually loading hundreds of thousands of rows every time you run the package, you can adjust this upwards to get better throughput.
Naturally you'll need to test, test, test in your own environment.
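The trade-off is easy to see on paper: each fetch is a round trip to the source, so a bigger buffer means fewer round trips (at the cost of more memory per fetch). A hypothetical sketch, assuming a default buffer of one row (the function name is illustrative, not anything in DTS):

```python
import math

def fetch_round_trips(total_rows, fetch_buffer_size):
    """Number of fetches needed to pull total_rows from the source
    when each fetch returns at most fetch_buffer_size rows."""
    return math.ceil(total_rows / fetch_buffer_size)

# 100,000 rows one at a time is 100,000 round trips;
# a buffer of 1,000 rows cuts that to 100.
print(fetch_round_trips(100_000, 1))      # 100000
print(fetch_round_trips(100_000, 1_000))  # 100
```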
--------------------
Colt 45 - the original point and click interface
October 13, 2004 at 7:27 am
If we continually load 2.6 million rows, what might be a good set of settings for the DTS package?