December 14, 2007 at 10:13 am
I am doing a bulk insert of a large file into a table, and I have written a stored procedure that does this.
The question that I have is as follows:
The file is 1.4 GB, and this particular time one of the rows has bad data that is causing the import to fail, meaning one of the columns in that row is longer than the specified length.
What is the easiest way of finding which row is causing this problem?
Thanks.
Ravi
December 14, 2007 at 11:35 am
You have a few options here:
1. You can create an SSIS package that dumps all the bad data into a file, then you can fix the data.
2. Import the data via the Import Wizard and specify the column widths as needed, and the error handling will tell you which column is at fault. Continue the import of that data into a temp table and validate which rows are correct and which are not by using the LEN(column) function.
There may be an easier way, but this is all I can think of at lunch time 😀
December 14, 2007 at 11:45 am
What do you mean by an Import Wizard? Do you mean an SSIS package that will run like an import wizard?
Pretty much, if the row is not in the correct format, it will insert that row into another table instead of the main table? So we are going to be doing row-level checking? Let me see if I can do that.
There is no easy way of doing it in that case.
Thanks.
December 14, 2007 at 1:26 pm
Yeah, right-click on the database --> Tasks --> Import Data. You are correct, this will execute as an SSIS package, but it does show the problem columns. Once you have the column you can do a check like:
SELECT * FROM temptable WHERE LEN(problem_col) > <the length it's supposed to be>
Like I said, your other option is to develop a package in BIDS and have the errors dumped to a file.
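In case it helps, here is a rough sketch of the temp-table check. Everything in it is a placeholder (staging table name, column names, file path, terminators), and it assumes the real target column is a varchar(50):

-- Hypothetical staging table: same columns as the target, but oversized so no row is rejected on load
CREATE TABLE dbo.StagingTable
(
    Col1 varchar(8000),
    Col2 varchar(8000),
    Col3 varchar(8000)
);

-- Bulk load the whole file into the staging table (path and terminators are placeholders)
BULK INSERT dbo.StagingTable
FROM 'C:\data\bigfile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Find the offending rows: anything longer than the real column allows (50 here)
SELECT *
FROM dbo.StagingTable
WHERE LEN(Col2) > 50;

Once the bad rows are identified and fixed, the clean data can be inserted from the staging table into the real table.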
December 14, 2007 at 6:04 pm
Or, use BCP with the -m (Max errors) parameter set to some ridiculously high number and the -e (err_file) set to a file name in the same directory. This will do 2 things...
1. Allow all the rest of the rows to be imported successfully even though 1 or more are "broken".
2. Will write the bad rows to a file where you can try to fix them.
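A rough sketch of what that BCP call might look like (server, database, table, and file names are placeholders, and -c assumes a character-format, comma-delimited file):

bcp MyDatabase.dbo.MyTable in "C:\data\bigfile.txt" -c -t"," -S MyServer -T -m 100000 -e "C:\data\bigfile.err"

The -m 100000 lets BCP keep going past the bad rows instead of aborting, and -e writes the rejected rows to bigfile.err so they can be fixed and reloaded. -T uses a trusted (Windows) connection.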
--Jeff Moden
Change is inevitable... Change for the better is not.