February 13, 2002 at 4:59 am
I have a CSV file of 105k records that does not import correctly when I use DTS, BCP, or BULK INSERT. The file is formatted like ,,,"text",,"text" and so on. When building a DTS package I select LF as the row delimiter and " as the text qualifier, and I immediately get a message telling me that the usage of the row delimiter is not correct. There are a few fields that look like ,"text,text,text", so my only thought is that it is tripping over the commas inside the text. Isn't the whole point of text qualifiers to resolve that?
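For example, a row with embedded commas looks something like this (made-up values):

,,,"SMITH","ACME CORP",,"123 MAIN ST, SUITE 4, BLDG 2","NY"

Every comma inside the quoted field should be data, not a field break.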
February 13, 2002 at 6:48 am
You are correct about the text qualifiers; they should make the import ignore the embedded commas. Have you tried "{CR}{LF}" as the row delimiter? If that does not work, can you open the file in Excel or some other viewer to determine whether the rows actually have a delimiter? It may be fixed width, with no delimiter at all.
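For a quick test outside DTS, you could try a BULK INSERT along these lines (the database, table, and file path here are placeholders) and experiment with both row terminators:

BULK INSERT MyDB.dbo.MyTable
FROM 'C:\data\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\r\n')

If '\r\n' errors out, try ROWTERMINATOR = '\n'. Keep in mind that BULK INSERT has no text-qualifier option of its own, so the quotes will come through as data unless you use a format file.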
February 13, 2002 at 10:44 am
Funny thing is that the file is too large for Excel. From what I can see it is fine. We have processed the file in Perl and it works. My initial thought is that the file is too large for SQL, which is why it errors out. Although I have used the file to do some database development... the columns don't line up because I am missing the text qualifier... but it imports.
Do you know how to pass BCP the correct qualifiers? The table has 163 columns... so I would rather not build a fmt file "just to test" only to find out that it does not work.
This is an EDI file... if that helps anyone.
Thanks
February 13, 2002 at 11:07 am
You can run BCP without the -n or -c options and it will ask you questions about each column. At the end, it will offer to save a format file. That's the easiest way to build one; writing a format file by hand is silly. Once you save the file, you can cancel (Ctrl-C) the bcp.
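For example (the server, table, and file names here are placeholders):

bcp MyDB.dbo.MyTable in C:\data\file.csv -S myserver -T

With no -n or -c, bcp prompts for the storage type, prefix length, length, and terminator of every column. For a file like yours you fold the quotes into the terminators, e.g. enter \",\" as the terminator between quoted fields. When it asks whether you want to save the format information in a file, say yes. Just as a rough sketch, assuming a fully-quoted two-column table, the saved file would come out looking something like:

8.0
3
1   SQLCHAR   0   0     "\""       0   FIRST_QUOTE   ""
2   SQLCHAR   0   100   "\",\""    1   col1          SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   100   "\"\r\n"   2   col2          SQL_Latin1_General_CP1_CI_AS

The zero-length first field just swallows the opening quote of each row, and the closing quotes ride along in the terminators; that is how bcp deals with text qualifiers.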
Steve Jones