How to skip bad import data

  • I import text (CSV) files into a SQL 2000 database. These text files can have hundreds of thousands of lines. Maybe 0.001% of the data is bad, for example,

    .... "joe", "blow", "233 N "E" Street", ....

     

    and that causes DTS to error out. I am not interested in correcting the bad data, just in skipping over it during the import. So how can I tell DTS to ignore those bad records and continue importing the remainder of the file?

     

    TIA,

    Bill
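Editor's note: outside of DTS itself, the "skip the bad rows" idea can be sketched as a pre-filter pass over the file before import. The sketch below is Python, and the function name `clean_rows` and the three-column layout are illustrative assumptions, not anything from the thread; with `strict=True`, Python's CSV parser raises an error on an unescaped embedded quote like the `"233 N "E" Street"` example above, so those rows can be dropped.

```python
import csv

def clean_rows(lines, expected_cols=3):
    """Return (kept, skipped), where skipped lists the 1-based line
    numbers of rows that fail strict CSV parsing or have the wrong
    number of fields."""
    kept, skipped = [], []
    for lineno, line in enumerate(lines, 1):
        try:
            # strict=True makes the parser raise csv.Error on a stray
            # quote inside a quoted field instead of guessing
            row = next(csv.reader([line], skipinitialspace=True, strict=True))
        except csv.Error:
            skipped.append(lineno)
            continue
        if len(row) != expected_cols:
            skipped.append(lineno)
            continue
        kept.append(row)
    return kept, skipped

lines = [
    '"joe","blow","123 Main St"',
    '"joe", "blow", "233 N "E" Street"',  # bad: unescaped inner quotes
]
kept, skipped = clean_rows(lines)
```

The good rows in `kept` can then be bulk-loaded normally, and `skipped` gives an audit trail of what was dropped.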


  • On the Options tab for the transform you can set a maximum error count. The default is 0 (no errors allowed); set this very high. I don't know if there is a maximum limit; BOL doesn't say.

     

    Julie

  • Julie,

    I saw that DTS option, but when I set it to 1000 I noticed the DTS error log started reporting errors on lines 3456, 3457, 3458, 3459 ... 4456, which was not the case, so I set it back to zero. But since you mentioned it, I'll take another look at this option.

    Thanks,

    Bill

     

  • Julie,

    I've always dumped my data into a dummy or temp table first, with no restrictions on the data (every field is a varchar), and then copied the good data into the "final" table.

    This has allowed me to fix or ignore bad data.

    Regards,
    Matt
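Editor's note: Matt's staging-table pattern can be sketched with in-memory SQLite standing in for SQL 2000 (the table and column names here are invented for illustration, and the all-digits `GLOB` check is SQLite's idiom; in T-SQL you would reach for something like `ISNUMERIC` instead):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table: every field is plain text with no constraints,
# so the bulk load itself can never fail on bad data.
cur.execute("CREATE TABLE staging (first TEXT, last TEXT, qty TEXT)")
cur.execute("CREATE TABLE final   (first TEXT, last TEXT, qty INTEGER)")

rows = [
    ("joe", "blow", "12"),
    ("jane", "doe", "not-a-number"),  # bad row: qty won't convert
]
cur.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)

# Copy only the good data: keep rows whose qty is all digits.
cur.execute("""
    INSERT INTO final
    SELECT first, last, CAST(qty AS INTEGER)
    FROM staging
    WHERE qty NOT GLOB '*[^0-9]*' AND qty != ''
""")
final_rows = cur.execute("SELECT * FROM final").fetchall()
```

The bad row simply never makes it into `final`, and it stays in `staging` if you later want to inspect or fix it.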

  • Another thought would be to use an ActiveX transformation with a Replace statement. If all you're dealing with is a stray embedded quote, you can use Replace(fieldx, """", "") to strip it. (In VBScript a literal double quote inside a string is written as two double quotes, so the four-quote argument is a one-character string containing a single ".)

    Regards,
    Matt
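Editor's note: the VBScript Replace above strips every embedded double quote from a field. For comparison, the same one-liner in Python, using the field value from the example at the top of the thread:

```python
# Strip every embedded double quote: the Python analog of
# VBScript's Replace(fieldx, """", "")
field = '233 N "E" Street'
cleaned = field.replace('"', '')  # → '233 N E Street'
```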

  • Check out the multi-phase feature of the Transform Data task, or you could also use the Data Driven Query task. Both are very handy for cases such as yours.
