March 19, 2009 at 1:18 pm
My mistake, the data source IS using Tab as the delimiter, but I am still failing to import; the wizard always stops with an error message like:
The data conversion for column "Column 2" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
Why can't SQL handle the exception? This is really stupid; it should at least let the import carry on instead of just stopping there.
March 19, 2009 at 1:46 pm
OK, I am getting there. Now I am able to import the data, except for two issues:
1. There is an EOF line in my data source file, something like: EOF Final Record Count: 4654. How can I tell SQL to ignore it?
2. I am in the dtsx package design, and the imported data contains extra double quotes which I don't want; for example, the data should be 001, but it comes in with two extra double quotes as "001"
Thanks for your patience.
March 19, 2009 at 2:20 pm
Do you use " as a text qualifier?
March 19, 2009 at 2:32 pm
Hi
This isn't a perfect solution but it might get you a step closer.
Try a bulk insert. You will need to define your table structure and you will also need to do a bit of cleaning up in two of the columns afterwards.
CREATE TABLE #csvtest (
    col1Name varchar(128),
    col2Name varchar(128),
    col3Name varchar(128),
    col4Name varchar(128),
    col5Name varchar(128),
    col6Name varchar(128),
    col7Name varchar(128)
)

BULK INSERT #csvtest FROM 'c:\temp\imp.txt'
WITH (
    FIELDTERMINATOR = '" "',  -- the quote-space-quote sequence between the quoted fields
    ROWTERMINATOR = '\n'      -- newline at the end of each record
)
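The clean-up in those two columns might look something like this - a minimal sketch, assuming the stray leading quote lands in col1Name and the stray trailing quote in col7Name:
-- Strip the leftover double quotes from the first and last columns
UPDATE #csvtest
SET col1Name = REPLACE(col1Name, '"', ''),
    col7Name = REPLACE(col7Name, '"', '');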
Bevan
March 19, 2009 at 7:22 pm
dmoldovan (3/19/2009)
Do you use " as a text qualifier?
Thanks. That's exactly what I just found out. How about removing the last EOF line?
March 19, 2009 at 11:35 pm
halifaxdal (3/19/2009)
dmoldovan (3/19/2009)
Do you use " as a text qualifier?
Thanks. That's exactly what I just found out. How about removing the last EOF line?
Not to mention the leading quote in the first column 😉 You'll need to update the table to get rid of the quotes.
--Jeff Moden
Change is inevitable... Change for the better is not.
March 20, 2009 at 1:31 am
halifaxdal (3/19/2009)
dmoldovan (3/19/2009)
Do you use " as a text qualifier?
Thanks. That's exactly what I just found out. How about removing the last EOF line?
Unfortunately the import/export wizard will not help you with this... You'll have to set up your own routine - for example, importing the whole file and then deleting the last row...
An advantage of using custom programming or maybe SSIS is that you can use the "record count" to check the import one more time...
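A rough sketch of such a routine, assuming the data was bulk-loaded into a staging table like the #csvtest above and that the trailer line lands in the first column as something like EOF Final Record Count: 4654:
-- Read the expected row count out of the trailer line, e.g. "EOF Final Record Count: 4654"
DECLARE @expected int;

SELECT @expected = CAST(LTRIM(RTRIM(SUBSTRING(col1Name, CHARINDEX(':', col1Name) + 1, 20))) AS int)
FROM #csvtest
WHERE col1Name LIKE 'EOF%';

-- Delete the trailer line itself
DELETE FROM #csvtest WHERE col1Name LIKE 'EOF%';

-- Use the trailer's record count to double-check the import
IF @expected IS NULL OR (SELECT COUNT(*) FROM #csvtest) <> @expected
    RAISERROR('Imported row count does not match the trailer record count.', 16, 1);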
March 20, 2009 at 4:18 am
Looking at the sample data, the column separator should be a space, with the text delimiter set as double quotes - this should then handle any embedded spaces in the data properly
March 20, 2009 at 6:12 am
You can also use the BULK INSERT command with a bcp format file.
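For example, something along these lines - a sketch only, since the real file layout isn't posted; the seven-column layout, terminators, and paths below are assumptions carried over from the earlier #csvtest example:
/*  c:\temp\imp.fmt -- a non-XML bcp format file (version 9.0 = SQL Server 2005).
    Each line lists: field number, host data type, prefix length, data length,
    terminator, target column number, target column name, collation.

9.0
7
1  SQLCHAR  0  128  "\" \""   1  col1Name  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  128  "\" \""   2  col2Name  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  128  "\" \""   3  col3Name  SQL_Latin1_General_CP1_CI_AS
4  SQLCHAR  0  128  "\" \""   4  col4Name  SQL_Latin1_General_CP1_CI_AS
5  SQLCHAR  0  128  "\" \""   5  col5Name  SQL_Latin1_General_CP1_CI_AS
6  SQLCHAR  0  128  "\" \""   6  col6Name  SQL_Latin1_General_CP1_CI_AS
7  SQLCHAR  0  128  "\"\r\n"  7  col7Name  SQL_Latin1_General_CP1_CI_AS
*/

BULK INSERT #csvtest
FROM 'c:\temp\imp.txt'
WITH (FORMATFILE = 'c:\temp\imp.fmt');

-- The leading quote on the first column still comes through with these terminators,
-- so it needs the same REPLACE clean-up as before, and the EOF trailer line still
-- has to be removed afterwards.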
March 20, 2009 at 8:46 am
Thank you all for the replies. Like I mentioned earlier, the problem has almost been resolved except for one thing: how to ignore the last line in the source file, so please keep the discussion on this.
March 20, 2009 at 10:57 am
halifaxdal (3/20/2009)
Thank you all for the replies. Like I mentioned earlier, the problem has almost been resolved except for one thing: how to ignore the last line in the source file, so please keep the discussion on this.
Fine... the best way to fix that is to get the people who are providing the data to fix it.
--Jeff Moden
Change is inevitable... Change for the better is not.
March 20, 2009 at 12:38 pm
Try using double-quote as the text qualifier and space as the delimiter.
March 23, 2009 at 7:20 am
Jeff Moden (3/20/2009)
halifaxdal (3/20/2009)
Thank you all for the replies. Like I mentioned earlier, the problem has almost been resolved except for one thing: how to ignore the last line in the source file, so please keep the discussion on this.
Fine... the best way to fix that is to get the people who are providing the data to fix it.
Unfortunately, the data provider said the extra line must be inserted for their own purposes, which means I have to work around it.
March 23, 2009 at 2:29 pm
halifaxdal (3/23/2009)
Jeff Moden (3/20/2009)
halifaxdal (3/20/2009)
Thank you all for the replies. Like I mentioned earlier, the problem has almost been resolved except for one thing: how to ignore the last line in the source file, so please keep the discussion on this.
Fine... the best way to fix that is to get the people who are providing the data to fix it.
Unfortunately, the data provider said the extra line must be inserted for their own purposes, which means I have to work around it.
You told folks to keep the discussion centered around getting rid of the last line... everything they've mentioned so far is pertinent to possibly developing a method for getting rid of the last line, and I'm not sure why you felt you had to say "so please keep the discussion on this". The ironic part is that your request came right after the post with the probable best solution... a BCP format file.
If you'd like to attach a file with at least the first 10 rows and the last 10 rows, we can probably work this out in one form or another. Heh... hopefully, that's keeping the discussion on this. 😛
Warning... if the file contains any private information for anyone, do not post it. Write back and we'll do something else.
--Jeff Moden
Change is inevitable... Change for the better is not.
March 24, 2009 at 7:20 am
If