Fast data load with Error Handling (DTS)?

  • Hi,

    I am loading data from large fixed-field flat files into a SQL Server database using DTS packages. Some records error out because of constraints on the table. If I do the load with the default DTS options, the load is fast, but if it encounters even a single error it fails completely without inserting a single row.

    When I capture errors to a file with "Insert batch size" = 1 and "Always commit final batch" checked, the data is loaded as desired, but it is very slow, about 4-5 times slower.

    Does someone have a better idea on how to improve the load time while still handling errors?

    Thanks

  • When I looked at a similar issue I tried two approaches: 1 - using an ActiveX script in the transformation; 2 - loading the file into a staging table with no constraints and running several SQL task checks against the important fields. That way I knew exactly which records were duds and which were OK. I could then load the good records and e-mail the flat-file data manager with the duds. Yes, it takes a little longer, but it does a thorough job, and it takes less time than loading the data, having it fail, and loading it again... and again. A sketch of the staging approach follows below.
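
    A minimal T-SQL sketch of that staging approach, assuming SQL Server 2000-era syntax; the table and column names (StageOrders, DudOrders, Orders, OrderID, Amount) and the file paths are hypothetical stand-ins:

        -- 1. Load the flat file into a constraint-free staging table.
        --    A format file describing the fixed-field layout is assumed to exist.
        BULK INSERT dbo.StageOrders
        FROM 'C:\loads\orders.txt'
        WITH (FORMATFILE = 'C:\loads\orders.fmt')

        -- 2. Copy the dud records out so they can be e-mailed to the data manager.
        --    Each predicate mirrors a constraint on the real table.
        SELECT *
        INTO dbo.DudOrders
        FROM dbo.StageOrders
        WHERE OrderID IS NULL      -- would violate NOT NULL / the primary key
           OR Amount < 0           -- would violate a CHECK constraint

        -- 3. Fast-load only the rows that pass every check.
        INSERT INTO dbo.Orders (OrderID, Amount)
        SELECT OrderID, Amount
        FROM dbo.StageOrders
        WHERE OrderID IS NOT NULL
          AND Amount >= 0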

  • Thanks for your help, Wildh.

    Checking the data against the constraints using SQL tasks is a good suggestion, but I have lots of tables involved, and that would mean writing checks for every table and every constraint on them (a metadata query could generate those checks instead of hand-writing them; see the sketch at the end of this post).

    I was wondering whether there is something I am missing in the DTS configuration that could still reject the bad records while loading the good ones at full speed.

    Do you have any suggestions?

    Thanks
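
    A hypothetical sketch of such a metadata query, assuming the standard INFORMATION_SCHEMA views (present in SQL Server 2000) and made-up target table names; CHECK_CLAUSE gives the predicate that the staged rows can be tested against:

        SELECT tc.TABLE_NAME,
               tc.CONSTRAINT_NAME,
               tc.CONSTRAINT_TYPE,
               cc.CHECK_CLAUSE     -- the predicate to test staged rows against
        FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc
        LEFT JOIN INFORMATION_SCHEMA.CHECK_CONSTRAINTS cc
               ON cc.CONSTRAINT_NAME = tc.CONSTRAINT_NAME
        WHERE tc.TABLE_NAME IN ('Orders', 'OrderDetails')   -- hypothetical targets
        ORDER BY tc.TABLE_NAME, tc.CONSTRAINT_TYPE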

  • Increase the Max Errors setting to an "acceptable" number

    AND

    bulk insert the file into a staging table, then

    compare the two to know what's missing! (See the sketch below.)

    If you want the source error rows logged (to avoid the bulk insert/compare), then Fast Load HAS TO BE TURNED OFF.

    HTH

    * Noel
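
    A minimal sketch of the staging-and-compare step, again with hypothetical names (StageOrders, Orders, OrderID); MAXERRORS lets the bulk load skip past bad rows instead of aborting:

        -- Load the whole file into staging, tolerating up to 1000 bad rows.
        BULK INSERT dbo.StageOrders
        FROM 'C:\loads\orders.txt'
        WITH (MAXERRORS = 1000)

        -- Rows that are in the file but never made it into the target:
        SELECT s.*
        FROM dbo.StageOrders s
        LEFT JOIN dbo.Orders o
               ON o.OrderID = s.OrderID
        WHERE o.OrderID IS NULL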
