Bulk insert from inconsistent file

  • Hi,

    I am trying to do a BULK insert using a .csv file. The issue is that only those text fields that have commas embedded in them are surrounded by double-quotes.

    Here is an example of three records

    1,This is fine, 2000,Ok

    2, "This is, not fine", 3000,Not OK

    3, Next field is an issue, 4000, "Oops, has a comma"

Is there any way I can import this programmatically using BULK INSERT or bcp? I know I can do this using DTS, but I would like to do it programmatically, as there are multiple files in a folder and I have to loop through each file. Doing this in DTS would be tough, as it does not support loops (SQL 2000).
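For what it's worth, this is the sort of loop I have in mind. It is only a sketch: it assumes xp_cmdshell is available, and the folder C:\import\ and table MyTable are made-up names.

```sql
-- Collect the file names in the folder (assumed path C:\import\).
CREATE TABLE #files (fname VARCHAR(260))

INSERT INTO #files
EXEC master..xp_cmdshell 'dir /b C:\import\*.csv'

DECLARE @fname VARCHAR(260), @sql VARCHAR(1000)

-- dir /b appends a NULL row; the LIKE filter skips it.
DECLARE file_cur CURSOR FOR
    SELECT fname FROM #files WHERE fname LIKE '%.csv'
OPEN file_cur
FETCH NEXT FROM file_cur INTO @fname
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build and run one BULK INSERT per file (MyTable is a placeholder).
    SET @sql = 'BULK INSERT MyTable FROM ''C:\import\' + @fname
             + ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')'
    EXEC (@sql)
    FETCH NEXT FROM file_cur INTO @fname
END
CLOSE file_cur
DEALLOCATE file_cur
DROP TABLE #files
```

Of course, this loop still trips over the embedded commas, which is the part I cannot solve.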

    Any help would be appreciated.

    Thanks,

    Amir


I've not used bcp very much, so I can't answer your question intelligently, though I do believe it's possible.

    However, you can indeed loop in DTS.  See here:

    http://www.sqldts.com/default.aspx?246

I've got several DTS packages in production that do this. One word of note: you have to set ExecutionStatus = DTSStepExecStat_Waiting for *all* DTS steps you want to re-run.

Thanks a lot, that would help. I would also like to know whether it is possible to load this kind of data using BULK INSERT or bcp with a format file, and if so, how.

    Thanks,

    Amir


bcp cannot detect quotes optionally; it can only process the data if the quotes are always present for a given column, so it will not work for your data.

As already stated, use DTS, or BULK INSERT into a staging table and write SQL to parse the data.
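A rough sketch of that staging approach follows. The table and function names, the file path, and the '|' placeholder are all my own inventions; the idea is just to load each raw line into one wide column, then mask the commas that sit inside quotes so an ordinary comma split works afterwards.

```sql
-- Staging table: each raw line of the file lands in a single column.
CREATE TABLE Staging (rawline VARCHAR(8000))

-- Load whole lines (path and the '\0' field terminator are assumptions;
-- any terminator that never appears in the data will do):
-- BULK INSERT Staging FROM 'C:\import\file1.csv'
--     WITH (FIELDTERMINATOR = '\0', ROWTERMINATOR = '\n')

-- Scalar UDF: walk the line, track whether we are inside double quotes,
-- replace embedded commas with '|' and drop the quote characters.
CREATE FUNCTION dbo.fn_MaskQuotedCommas (@line VARCHAR(8000))
RETURNS VARCHAR(8000)
AS
BEGIN
    DECLARE @i INT, @c CHAR(1), @inq BIT, @out VARCHAR(8000)
    SELECT @i = 1, @inq = 0, @out = ''
    WHILE @i <= LEN(@line)
    BEGIN
        SET @c = SUBSTRING(@line, @i, 1)
        IF @c = '"'
            SET @inq = 1 - @inq          -- toggle in/out of quotes
        ELSE IF @c = ',' AND @inq = 1
            SET @out = @out + '|'        -- mask the embedded comma
        ELSE
            SET @out = @out + @c
        SET @i = @i + 1
    END
    RETURN @out
END
```

After running the lines through dbo.fn_MaskQuotedCommas, you can split on commas with CHARINDEX/SUBSTRING into the four real columns, then REPLACE the '|' back to ',' in the text columns.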

    Far away is close at hand in the images of elsewhere.
    Anon.
