August 14, 2009 at 6:41 am
Dear All,
I have a requirement where my SP will have some n number of INSERT/UPDATE statements. Some of the statements may throw the error "Cannot insert duplicate rows" due to unique constraints...
My objective is to ignore these errors and continue with the next statements. Please let me know whether I can use TRY/CATCH for this scenario and how.
Thanks in advance....
August 14, 2009 at 6:53 am
Of course you can!
Try/Catch was introduced to allow the developer to handle errors in code. One legitimate way to handle an error is to simply ignore it.
Be careful: a CATCH block cannot be empty, so you will have to do something inside the block:
BEGIN TRY
INSERT INTO MyTable VALUES (1,2,3)
INSERT INTO MyTable VALUES (1,2,3)
END TRY
BEGIN CATCH
PRINT ''
END CATCH
Another thing to keep in mind is that execution jumps to the CATCH block immediately when an error is encountered: this means that a CATCH block that ignores errors is not the same as Visual Basic's "On Error Resume Next". Once an error is thrown in the TRY block, the statements after the one that raised the error are not executed, as the sketch below shows.
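Here is a self-contained sketch of that behaviour (using a table variable, so the names are made up): the third INSERT never runs, because control has already moved to the CATCH block when the second one fails.
DECLARE @MyTable TABLE (a INT PRIMARY KEY, b INT, c INT)
BEGIN TRY
    INSERT INTO @MyTable VALUES (1,2,3)   -- succeeds
    INSERT INTO @MyTable VALUES (1,2,3)   -- duplicate key: control jumps to CATCH
    INSERT INTO @MyTable VALUES (2,2,3)   -- never executed
END TRY
BEGIN CATCH
    PRINT ERROR_MESSAGE()
END CATCH
SELECT * FROM @MyTable   -- returns only the first row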
If you want to ignore errors for each individual statement, you will have to wrap each one in its own TRY block:
-- Insert and ignore errors 1
BEGIN TRY
INSERT INTO MyTable VALUES (1,2,3)
END TRY
BEGIN CATCH
PRINT ''
END CATCH
-- Insert and ignore errors 2
BEGIN TRY
INSERT INTO MyTable VALUES (1,2,3)
END TRY
BEGIN CATCH
PRINT ''
END CATCH
Hope this helps
Gianluca
-- Gianluca Sartori
August 14, 2009 at 7:13 am
Does TRY/CATCH work in all cases? For instance, if the error is a deadlock, doesn't it just fail the batch completely and never reach the CATCH block? Just asking out of curiosity.
-Roy
August 14, 2009 at 7:17 am
Hi,
Thanks a lot for your quick response!
August 14, 2009 at 7:27 am
From BOL:
Errors Unaffected by a TRY…CATCH Construct
TRY…CATCH constructs do not trap the following conditions:
* Warnings or informational messages that have a severity of 10 or lower.
* Errors that have a severity of 20 or higher that stop the SQL Server Database Engine task processing for the session. If an error occurs that has severity of 20 or higher and the database connection is not disrupted, TRY…CATCH will handle the error.
* Attentions, such as client-interrupt requests or broken client connections.
* When the session is ended by a system administrator by using the KILL statement.
The following types of errors are not handled by a CATCH block when they occur at the same level of execution as the TRY…CATCH construct:
* Compile errors, such as syntax errors, that prevent a batch from running.
* Errors that occur during statement-level recompilation, such as object name resolution errors that occur after compilation because of deferred name resolution.
These errors are returned to the level that ran the batch, stored procedure, or trigger.
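To illustrate that last point with a sketch of my own (the table name is deliberately one that does not exist): a name resolution error is not caught by a TRY…CATCH at the same level, but it is caught when the failing statement runs at a lower execution level, e.g. inside sp_executesql:
BEGIN TRY
    -- At the same level this would NOT be caught:
    -- SELECT * FROM dbo.TableThatDoesNotExist
    -- Run at a lower execution level, the error becomes catchable:
    EXEC sp_executesql N'SELECT * FROM dbo.TableThatDoesNotExist'
END TRY
BEGIN CATCH
    PRINT ERROR_MESSAGE()   -- Invalid object name 'dbo.TableThatDoesNotExist'.
END CATCH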
And another entry from BOL specific on deadlock:
Handling Deadlocks
TRY…CATCH can be used to handle deadlocks. The 1205 deadlock victim error can be caught by the CATCH block and the transaction can be rolled back until the threads become unlocked. For more information about deadlocking, see Deadlocking.
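As a rough sketch of that pattern (the table and column names are just placeholders), the CATCH block can check ERROR_NUMBER() for 1205, roll back and try the transaction again:
DECLARE @retry INT
SET @retry = 3
WHILE @retry > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION
        -- placeholder work that might be picked as a deadlock victim
        UPDATE dbo.MyTable SET SomeColumn = SomeColumn + 1 WHERE KeyColumn = 1
        COMMIT TRANSACTION
        SET @retry = 0                     -- success: leave the loop
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION
        IF ERROR_NUMBER() = 1205           -- deadlock victim
            SET @retry = @retry - 1        -- try again
        ELSE
        BEGIN
            SET @retry = 0                 -- any other error: stop retrying
            PRINT ERROR_MESSAGE()
        END
    END CATCH
END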
-- Gianluca Sartori
August 14, 2009 at 7:29 am
I think TRY/CATCH was one of the biggest enhancements introduced in T-SQL with SQL2K5.
I wonder how on earth I could live without it.
-- Gianluca Sartori
August 14, 2009 at 7:33 am
Thanks Gianluca for clarifying it. 🙂
-Roy
August 14, 2009 at 7:47 am
... oops! I found it: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=296760
I think I will vote for it.
-- Gianluca Sartori
August 14, 2009 at 11:59 am
haichells (8/14/2009)
Dear All, I have a requirement where my SP will have some n number of INSERT/UPDATE statements. Some of the statements may throw the error "Cannot insert duplicate rows" due to unique constraints...
My objective is to ignore these errors and continue with the next statements. Please let me know whether I can use TRY/CATCH for this scenario and how.
Thanks in advance....
Gianluca is right - you CAN implement this using TRY/CATCH. That said, his first example isn't right for your scenario: bunching all of the INSERTs into a single TRY/CATCH block will execute the inserts only until it runs into the first duplicate. That insert fails, the CATCH kicks in, and the procedure then carries on with whatever comes AFTER the TRY/CATCH block - meaning you just skipped all of the inserts after the first duplicate.
If you want to insert one row at a time AND get as many rows in as you can, EACH insert has to be in its own TRY/CATCH block. This is just one of several reasons why Joe's concept is along the right lines (or using the older equivalent of the MERGE he refers to - see the sketch below).
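For what it's worth, here is a rough sketch of the MERGE-style approach (all names are made up, and MERGE needs SQL 2008): only the rows whose keys are not already in the target get inserted, no TRY/CATCH required.
DECLARE @Target TABLE (KeyColumn INT PRIMARY KEY, SomeValue INT)
DECLARE @NewRows TABLE (KeyColumn INT, SomeValue INT)

INSERT INTO @Target VALUES (1, 10)
INSERT INTO @NewRows VALUES (1, 10)   -- key already present: skipped
INSERT INTO @NewRows VALUES (2, 20)   -- new key: inserted

MERGE @Target AS t
USING @NewRows AS s
    ON t.KeyColumn = s.KeyColumn
WHEN NOT MATCHED BY TARGET THEN
    INSERT (KeyColumn, SomeValue)
    VALUES (s.KeyColumn, s.SomeValue);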
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
August 15, 2009 at 6:50 am
Matt Miller (8/14/2009)
If you want to insert one row at a time AND get as many in as you can - EACH has to be in its own TRY/CATCH block.
That's indeed what I suggested. I agree with Joe, though, that checking each row one at a time is not efficient, and there are many strategies to do it set based.
That said, you could also take advantage of branching statements (GOTO) to ignore the error for each row in a loop:
DECLARE @test TABLE (
    columnA INT PRIMARY KEY
)

DECLARE cur CURSOR STATIC FORWARD_ONLY
FOR
SELECT TOP 100 id
FROM master.dbo.syscolumns   -- id repeats for every column, so duplicates are guaranteed

OPEN cur

DECLARE @id int

nextRow:
FETCH NEXT FROM cur INTO @id

WHILE @@FETCH_STATUS = 0
BEGIN

    BEGIN TRY
        INSERT INTO @test VALUES (@id)
    END TRY
    BEGIN CATCH
        -- duplicate key: skip this row and jump back to fetch the next one
        GOTO nextRow
    END CATCH

    FETCH NEXT FROM cur INTO @id

END

CLOSE cur
DEALLOCATE cur
I would not recommend it anyway when a set based solution is available (and there is always at least one) - see the sketch below.
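For reference, one possible set based version of the same load (just a sketch, reusing the names from the cursor example): filter the duplicates out before the INSERT instead of trapping the errors.
DECLARE @test TABLE (
    columnA INT PRIMARY KEY
)

-- DISTINCT removes the duplicates inside the source,
-- NOT EXISTS skips keys that are already in the target
INSERT INTO @test (columnA)
SELECT DISTINCT s.id
FROM master.dbo.syscolumns AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM @test AS t
    WHERE t.columnA = s.id
)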
-- Gianluca Sartori