June 30, 2005 at 12:28 am
Hi,
Client needs to constantly update a table with data that comes to him as a CSV file. I tried using DTS to do this, with the 'text file' driver. The import fails at a seemingly random row, giving the error:
Error at Source for Row number 27087. Errors encountered so far in this task: 1.
I tried importing to a staging table that I created, and also to a table that DTS itself would create. Same issue. Then I converted the CSV file to a .xls file and used the xls driver for the import. Now it runs successfully without errors.
What could be the reason for the CSV to fail? I checked the offending row and found nothing apparently wrong.
Please help me.
Thanks.
Vis.
July 1, 2005 at 2:29 am
July 1, 2005 at 5:13 am
Hi Niki,
Did as suggested. Now I get a little further on the import, only to run into the problem below:
Error at Source for Row number 37632. Errors encountered so far in this task: 1. Too many columns found in the current row; non-whitespace characters were found after the last defined column's data.
As usual, I went through the row manually and couldn't find anything wrong. I tried changing the FILE TYPE setting from ANSI to OEM, and ran into the original problem from my first post again.
Any clues? Thanks for helping me out.
Vis.
July 1, 2005 at 6:27 am
Hi,
Do you have a service pack installed on the SQL Server? Check http://support.microsoft.com/kb/q300042/
Also, is it possible that you have a variable number of columns in each row?
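A quick way to check is to count the delimiters per line. Here is a rough VBScript sketch (run it with cscript; the path and expected column count are placeholders, and it assumes no quoted fields, so rows with quoted commas will be flagged too):

Const ForReading = 1
Const ExpectedColumns = 10   ' placeholder: set to your table's column count

Dim fso, ts, rowdata, lineNo, cols
Set fso = CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile("C:\import\data.csv", ForReading)   ' placeholder path

lineNo = 0
Do Until ts.AtEndOfStream
    rowdata = ts.ReadLine
    lineNo = lineNo + 1
    ' count the comma-separated fields on this line
    cols = UBound(Split(rowdata, ",")) + 1
    If cols <> ExpectedColumns Then
        WScript.Echo "Row " & lineNo & " has " & cols & " columns"
    End If
Loop
ts.Close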
July 1, 2005 at 7:10 am
Hi Shrinivas,
Your quote:
"Error at Source for Row number 37632. Errors encountered so far in this task: 1. Too many columns found in the current row; non-whitespace characters were found after the last defined column's data."
---------------------
Your CSV file may have an unwanted comma (or commas) in the said record, i.e. row no. 37632, because of which DTS finds more columns than required.
Please check it!
Raj...
July 1, 2005 at 8:53 am
Recently I had a similar problem while transferring data from dBase to SQL Server 2000. I downloaded a DBF viewer from http://www.dbf2002.com, opened the .dbf file, saved it as a .sql file, and ran the script in Query Analyzer as INSERTs.
Don't run it all in one go, but in batches of 1000 if it is huge! You will be able to tell if any of the records has an extra quote or a missing quote.
July 1, 2005 at 9:08 am
Your problem is that your CSV file is saved using a text qualifier, and you are missing a closing double quote in a column whose data contains a comma. The first error is caused by the text parser treating everything from the opening double quote to the end-of-line character (vbCrLf, 0x0D0A) as one column. The second error appears because the parser is now ignoring the double quotes and treating the extra comma as a column separator.
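For illustration, two made-up rows (names and values invented):

1001,"Smith, John",NY     <- well-formed; the qualifier protects the embedded comma
1002,"Smith, John,NY      <- missing its closing quote; everything from the opening quote to the end of the line is read as one column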
1- Open the file in Excel, scroll down to the erroring row, shift the data into the correct columns, then save as a CSV file. This will correct the formatting issue.
-OR-
2- Another option would be to create an ActiveX script that reads the file in line by line, uses Split(rowdata, Chr(34), 3) to break each line on the double quotes, fixes the quoted element by replacing its commas with an odd string like "_#comma#_", reassembles the line with Join, and writes out the new file.
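Roughly, that pre-processing script might look like this (a sketch only: the file paths are placeholders, it assumes at most one quoted column per line, and it only reports a line whose quote is unpaired, since that row still needs fixing by hand):

Const ForReading = 1
Const MyFunkyString = "_#comma#_"

Dim fso, inFile, outFile, rowdata, parts
Set fso = CreateObject("Scripting.FileSystemObject")
Set inFile = fso.OpenTextFile("C:\import\data.csv", ForReading)     ' placeholder
Set outFile = fso.CreateTextFile("C:\import\data_fixed.csv", True)  ' placeholder

Do Until inFile.AtEndOfStream
    rowdata = inFile.ReadLine
    ' with a limit of 3, the middle element is the quoted column's data
    parts = Split(rowdata, Chr(34), 3)
    If UBound(parts) = 2 Then
        ' mask the embedded commas, then rebuild the line without the quotes
        parts(1) = Replace(parts(1), ",", MyFunkyString)
        rowdata = Join(parts, "")
    ElseIf UBound(parts) = 1 Then
        ' an unpaired quote: this is the kind of row that breaks the import
        WScript.Echo "Unbalanced quote at: " & rowdata
    End If
    outFile.WriteLine rowdata
Loop
inFile.Close
outFile.Close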
Your transform would need to replace the odd string with the comma...
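' Swap the placeholder back to a real comma during the transform.
' (Col001 and Col002 are the default names DTS assigns to text-file columns.)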
Const MyFunkyString = "_#comma#_"
DTSDestination("Col001") = Replace( DTSSource("Col001"), MyFunkyString, ",")
DTSDestination("Col002") = Replace( DTSSource("Col002"), MyFunkyString, ",")
...I recommend the first option, to promote better use of your limited time on this planet.
-Mike Gercevich