September 11, 2006 at 2:34 am
I'm trying to import a file with the following data:
19402,200608,899430014, 120.000, 122.040, 165.9
19402,200608,899499003, 28.000, 154.890, 210.6
19402,200608,899511006, 10.000, 51.750, 70.3
21334,200509,099001001, 3.000, 0.000, 65.79,
21334,200509,099002001, 2.000, 0.000, 8.78,
21334,200509,099011001, 2.000, 0.000, 8.77,
21334,200509,406942001, 10.000, 1.740, 3.25,
As you can see above, some of the rows end with a trailing comma and some don't. Is it possible to handle this effectively in DTS/SSIS using BULK INSERT or similar, or should I write a VB app and read through the files using the FileSystemObject?
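For reference, this is roughly the kind of statement I'd want to use if BULK INSERT can cope with it (the table name and file path are just placeholders):

BULK INSERT dbo.SalesImport
FROM 'C:\import\datafile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')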
Thanks for any help.
September 12, 2006 at 5:40 am
I'm not sure whether DTS/SSIS will ignore trailing commas; however, I can think of two ways to remove them from the file first :-
1) If the file is not too big, then I am fairly sure the regex search and replace in the TextPad text editor should be able to handle the removal - search for ,$ with regular expressions enabled and replace it with nothing
2) If you have Perl installed, then a one-liner such as :-
perl -pe 's/,$//' yourfile.txt > cleaned.txt
should remove them (the s/,$// substitution strips a comma at the end of each line)
September 12, 2006 at 6:22 am
When defining the import text file, choose the transformation button and tell it to ignore the last column (select ignore against that source column).
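Alternatively, if you want to stay with BULK INSERT, another option is to load the file into an all-varchar staging table and strip the stray comma in T-SQL afterwards. This is only a rough sketch - the table, column and file names below are placeholders, so adjust them to your own schema:

-- every column is varchar, so the trailing comma cannot break a type conversion
CREATE TABLE dbo.ImportStaging
(
    Col1 VARCHAR(20),
    Col2 VARCHAR(20),
    Col3 VARCHAR(20),   -- keeps the leading zeros in values like 099001001
    Col4 VARCHAR(20),
    Col5 VARCHAR(20),
    Col6 VARCHAR(20)    -- arrives as '165.9' or '65.79,' depending on the row
);

BULK INSERT dbo.ImportStaging
FROM 'C:\import\datafile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- strip any trailing comma and convert when moving the data on
SELECT CAST(Col1 AS INT)                              AS Col1,
       CAST(Col2 AS INT)                              AS Col2,
       Col3,
       CAST(Col4 AS DECIMAL(12, 3))                   AS Col4,
       CAST(Col5 AS DECIMAL(12, 3))                   AS Col5,
       CAST(REPLACE(Col6, ',', '') AS DECIMAL(12, 2)) AS Col6
FROM dbo.ImportStaging;

Keeping the staging columns as varchar means the rows with the extra comma load without error; the comma is then removed with REPLACE before the values are converted.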