Importing into a table with a unique constraint

  • First, thanks to you guys for answering my previous question about multiple-column unique constraints.

    Now for my next (related) question: I'm trying to import a large file (35,000 rows, ASCII text, delimited) into my table. I tried using the DTS Import/Export Wizard, and it bombs when it finds a duplicate row in my file (one that violates the unique constraint).

    How can I get the import to ignore/skip the duplicate rows and load everything else without canceling the import altogether? Is this possible with the DTS import (I suspect it isn't), or should I use one of the other tools, such as bcp or BULK INSERT?

    In short, what's the best way to attack this issue?

    Thanks!

You can up the max error count on the DTS task to get past these.
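
    If you would rather script the load, here is a minimal sketch of the same raise-the-error-threshold idea using BULK INSERT, which the question already mentions as an option. The table name, file path, and delimiters below are placeholders, and whether a duplicate-key row is counted as a skippable error can depend on your version and index settings, so test against a copy of the table first:

        -- dbo.MyTable and C:\import\data.txt are made-up names; swap in your own.
        -- MAXERRORS is set above the 35000-row file size, so the error
        -- count alone can never cancel the load.
        BULK INSERT dbo.MyTable
        FROM 'C:\import\data.txt'
        WITH (
            FIELDTERMINATOR = ',',   -- delimiter used in the source file
            ROWTERMINATOR   = '\n',  -- one row per line
            MAXERRORS       = 40000  -- allow up to 40000 failing rows
        );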

    Steve Jones

    steve@dkranch.net

bcp can load it; it also lets you set a max error count (the -m switch), and if you set that to 40000 it will skip past every failing row, since your file is smaller than that. However, if you are not used to bcp, just import the file into a new staging table first, then re-import from that table into the table you want it in. That way you can specify DISTINCT in the query against the source table, e.g. INSERT INTO TABLEWANT (FLDLIST) SELECT DISTINCT FLDLIST FROM NEWTABLEWILLDROP (see the sketch below). Once your data is in, drop the extra table we used to get the data into the database. Note, however, that this will only work right if there are no columns other than the ones that have to be unique: DISTINCT compares every selected column, so two rows with the same key but different values elsewhere would still collide. Otherwise bcp will be your best option.
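
    A minimal sketch of that staging-table route (every table and column name below is a placeholder; swap in your own and adjust the delimiters to match your file):

        -- 1. Throwaway staging table with the same columns as the target
        --    but no unique constraint, so duplicate rows load cleanly.
        CREATE TABLE dbo.StagingTable (Col1 int, Col2 varchar(50));

        BULK INSERT dbo.StagingTable
        FROM 'C:\import\data.txt'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

        -- 2. Copy only the distinct rows into the real table.
        INSERT INTO dbo.TargetTable (Col1, Col2)
        SELECT DISTINCT Col1, Col2
        FROM dbo.StagingTable;

        -- 3. Drop the staging table now that the data is in.
        DROP TABLE dbo.StagingTable;

    The staging table has no unique constraint, so the duplicate rows load without complaint; the DISTINCT then filters them out on the way into the real table.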

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)
