February 25, 2011 at 11:44 pm
I need to dump some rows into a staging table.
However, I want SQL Server to continue with the rest of the rows when one or more rows raise an error.
Is there any way I can do this without using any outside tools?
February 26, 2011 at 2:35 am
Yes, it's possible:
Option 1:
Find the logic that will cause an error and exclude those rows before you perform the insert.
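A minimal sketch of Option 1, assuming a source table dbo.SourceData with a text column Amount being loaded into dbo.Staging (all table and column names here are hypothetical):

```sql
-- Hypothetical names: dbo.SourceData, dbo.Staging, Id, Amount.
-- Filter out rows whose Amount would fail a numeric conversion,
-- so the INSERT never hits a conversion error.
INSERT INTO dbo.Staging (Id, Amount)
SELECT Id, CONVERT(decimal(18, 2), Amount)
FROM   dbo.SourceData
WHERE  Amount NOT LIKE '%[^0-9.]%'  -- digits and decimal points only
       AND Amount <> '';
```

On SQL Server 2012 and later, `WHERE TRY_CONVERT(decimal(18, 2), Amount) IS NOT NULL` would be a more reliable filter than the pattern match.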
Option 2:
Since it's a staging table, you could remove the reasons for getting the error (e.g. change suspicious columns to varchar/nvarchar, even if those columns should only hold numeric data).
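Option 2 might look like this sketch: a deliberately loose staging table (names are assumptions) that accepts anything, with validation deferred to a later step:

```sql
-- Hypothetical "wide open" staging table: every column is nvarchar,
-- so no incoming row can fail on a data-type conversion at load time.
CREATE TABLE dbo.StagingRaw
(
    Id       nvarchar(50) NULL,
    Amount   nvarchar(50) NULL,  -- really numeric; validated after load
    LoadDate nvarchar(50) NULL   -- really a date; validated after load
);
```

You then validate and convert from dbo.StagingRaw into the properly typed table in a second, set-based step.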
As a side note: I recommend changing the way you think about this insert process. Don't think about what can go wrong with a single row; think about which columns can cause an error and react accordingly. Working a row at a time is known as RBAR (Row-By-Agonizing-Row) and will usually lead to poor performance due to inefficient code.
February 26, 2011 at 11:14 pm
Khades (2/25/2011)
I need to dump some rows into a staging table. However, I want SQL Server to continue with the rest of the rows when one or more rows raise an error. Is there any way I can do this without using any outside tools?
Depends what you mean by 'outside tools'. SSIS is part of SQL Server and has the feature you need built-in.
One other pure TSQL method not mentioned would be to bulk copy the rows out to a file and then bulk copy them back in, specifying the maximum number of errors you can tolerate, and where to route the error rows. I would use SSIS.
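That round trip might look like the following sketch (server, database, table, and file names are all placeholders); bcp's -m switch sets the number of errors tolerated, and -e names the file that receives the rejected rows:

```
rem Export the rows to a character-mode data file (placeholder names).
bcp MyDb.dbo.SourceTable out rows.dat -S MyServer -T -c

rem Load them back, tolerating up to 100000 errors;
rem rejected rows are written to bad_rows.err for inspection.
bcp MyDb.dbo.Staging in rows.dat -S MyServer -T -c -m 100000 -e bad_rows.err
```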
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
February 27, 2011 at 1:43 am
SQLkiwi (2/26/2011)
Khades (2/25/2011)
I need to dump some rows into a staging table. However, I want SQL Server to continue with the rest of the rows when one or more rows raise an error. Is there any way I can do this without using any outside tools?
Depends what you mean by 'outside tools'. SSIS is part of SQL Server and has the feature you need built-in.
One other pure TSQL method not mentioned would be to bulk copy the rows out to a file and then bulk copy them back in, specifying the maximum number of errors you can tolerate, and where to route the error rows. I would use SSIS.
I would use SSIS too. But it was not an option for this project (I don't know why). By outside tools I meant SSIS, but I didn't want to say SSIS specifically and then have someone suggest Informatica or another ETL tool. 😀
February 27, 2011 at 2:05 pm
As of 2005 (IIRC), Bulk Insert (and BCP since way back when) will let you identify an errata file to save errored rows in. You simply tell Bulk Insert to allow an impossibly large number of error rows, and the Bulk Insert will load the "good" rows into staging and send the bad rows (and some collaterally damaged rows) to a bad file for "rework".
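That approach might look like this sketch (the file path, table name, and delimiters are assumptions; ERRORFILE is available from SQL Server 2005 onward):

```sql
-- Assumed path and table names. Good rows load into dbo.Staging;
-- rejected rows are written to the error file for rework.
BULK INSERT dbo.Staging
FROM 'C:\loads\rows.dat'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    MAXERRORS       = 100000,          -- "impossibly large" tolerance
    ERRORFILE       = 'C:\loads\rows_bad.dat'
);
```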
--Jeff Moden
Change is inevitable... Change for the better is not.