May 26, 2004 at 7:24 am
I have a flat text file, actually in CSV format, that I'm trying to import into a table using DTS. I define the text file source and the database connection, then go to define the Transform Data task. I point to the file and accept the defaults (double quote as the text qualifier, CR/LF as the row delimiter). The first couple of records appear in the preview window. When I click Next to go to the page where you specify the column delimiter, I get the message "Invalid delimiter data: text qualifier must be followed by a column delimiter (except the last column)."
I imported the file into Excel and it came in with no problems. Obviously I can use that as a manual workaround, but I'd like to figure out why the error message is being generated and which row(s) in the input text file are causing it. I've tried removing all the single quotes that appear within the text fields; that didn't help. There are empty text fields (""), but that shouldn't be a problem; I have other files with empty fields and they work fine.
Any ideas on how to track this one down?
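For reference, here is a minimal checker sketch (Python; the sample rows are made up for illustration) that flags lines where a closing text qualifier is not followed by a column delimiter, assuming double-quote qualifiers, comma delimiters, and "" as an escaped quote inside a field:

```python
# Sketch: flag CSV lines where a closing text qualifier is not
# followed by a column delimiter or end of line. Assumes double-quote
# qualifiers, comma delimiters, and "" as an escaped inner quote.
def bad_qualifier_lines(lines, delim=","):
    bad = []
    for num, line in enumerate(lines, start=1):
        i, in_quotes, ok = 0, False, True
        while i < len(line):
            ch = line[i]
            if in_quotes:
                if ch == '"':
                    if i + 1 < len(line) and line[i + 1] == '"':
                        i += 1          # escaped quote ("") stays in the field
                    elif i + 1 == len(line) or line[i + 1] == delim:
                        in_quotes = False
                    else:
                        ok = False      # qualifier followed by plain text
                        break
            elif ch == '"' and (i == 0 or line[i - 1] == delim):
                in_quotes = True        # qualifier opens a new field
            i += 1
        if not ok or in_quotes:
            bad.append(num)
    return bad

rows = ['1,"Smith","10 Elm St."',
        '2,"Jones","150 Main St. Apt "B""']
print(bad_qualifier_lines(rows))   # → [2]
```

Running that over the file would at least give you the offending row numbers to inspect by hand.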
May 26, 2004 at 12:06 pm
This MS Knowledge Base article addresses the error:
http://support.microsoft.com/default.aspx?scid=kb;en-us;241761&Product=sql2k
As for finding the input rows that cause the error, you could try enabling error logging on the "options" tab of the Transform Data Task.
Greg
May 26, 2004 at 12:41 pm
It wasn't the MS reported problem.
I finally broke down and bulled my way through it by changing all the "," to !!! and then all the "<crlf>" to !!. Then I searched for " and found the following address:
150 Main St. Apt "B"
Duh... Apparently the dimbulbs in our HR IT division who maintain our personnel system are incapable of editing addresses for reasonably valid characters. But then again if they captured everyone's ZIP+4 up front I wouldn't be running this exercise every month.
I'll have to take a bad copy of the file and try error logging on the Transform Data Task to see if it would have pointed me to the bad data.
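Once stray qualifiers like that are located, they can also be stripped programmatically instead of by hand. A rough sketch (Python, assuming comma delimiters and double-quote qualifiers; note it would also eat legitimately escaped "" pairs) that drops any quote that is neither opening nor closing a field:

```python
# Sketch: drop quotes that appear inside a qualified field, i.e. any
# quote that neither opens a field (preceded by a delimiter or start
# of line) nor closes one (followed by a delimiter or end of line).
# Caution: this also strips legitimately escaped "" pairs.
def strip_stray_quotes(line, delim=","):
    out = []
    in_quotes = False
    for i, ch in enumerate(line):
        if ch == '"':
            opens = not in_quotes and (i == 0 or line[i - 1] == delim)
            closes = in_quotes and (i + 1 == len(line) or line[i + 1] == delim)
            if opens:
                in_quotes = True
            elif closes:
                in_quotes = False
            else:
                continue            # stray qualifier inside a field: drop it
        out.append(ch)
    return "".join(out)

print(strip_stray_quotes('1,"150 Main St. Apt "B"",x'))
# → 1,"150 Main St. Apt B",x
```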
May 27, 2004 at 6:07 am
LOL... been there and done that!
Sometimes it takes just examining the data file in fine detail to identify potential gotchas. A good text editor always seems to be your best friend in these situations. I use a product called Textpad (www.textpad.com) to analyze my data. There are others out there that are just as effective.
One more thing to watch for on flat files: a Unix-based database system that I get flat data files from will sometimes use a bare LF for the EOR instead of the CR/LF that is the de facto standard EOR for PC text files. This is a simple problem to identify: look at the file in a hex editor view and check whether the "0D 0A" combination (decimal 13 10) appears at the end of each data line. The fix is just as easy using the same VB app solution I mention above.
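A quick way to run that check without eyeballing hex is to tally the terminators programmatically. A small sketch (Python; the sample bytes are made up) that counts CR/LF versus bare LF or bare CR endings:

```python
# Sketch: tally line terminators in raw bytes so a Unix-style file
# (bare LF) can be spotted before an import that expects CR/LF.
def ending_counts(data):
    crlf = data.count(b"\r\n")
    lf_only = data.count(b"\n") - crlf     # LFs not preceded by CR
    cr_only = data.count(b"\r") - crlf     # CRs not followed by LF
    return {"crlf": crlf, "lf_only": lf_only, "cr_only": cr_only}

sample = b"row one\r\nrow two\nrow three\r\n"   # made-up sample data
print(ending_counts(sample))
# → {'crlf': 2, 'lf_only': 1, 'cr_only': 0}
```

Any nonzero lf_only or cr_only count means the file needs its line endings normalized before import.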
I prefer flat data files for my EDI processing, but the legacy systems I deal with constantly require me to come up with ways to clean up the flat files before I even attempt to import them. This is not a shortcoming of the DTS product, it is a shortcoming of most of the legacy systems.
"I will not be taken alive!" - S. Hussein