April 15, 2015 at 1:41 am
Hi Experts,
I am running the following query, which deletes around 800 million rows from the table.
I would like to introduce transactions within this code: if the entire deletion process succeeds, it should be committed, and the number of rows deleted, the table name, and the success status should be inserted into a log table.
If there is a failure, the transaction should be rolled back, and the table name and error message should be inserted into the same log table.
SELECT 1;
WHILE @@ROWCOUNT > 0
BEGIN
    DELETE TOP (100000) FROM [dbo].[Table1]
    FROM [dbo].[Table2]
    INNER JOIN [dbo].[Table3] ON [Table2].[PracticeID] = [Table3].[PracticeID]
    INNER JOIN [dbo].[Table1] ON [Table3].[InputDevicePracticeID] = [Table1].[InputDevicePracticeID]
    WHERE [Table2].[PracticeID] = 55;
END
Please let me know how to achieve this.
Thanks,
Naveen
April 15, 2015 at 2:48 am
If you wrap that in a transaction, then you're removing the point of running batched deletes in a loop; you may as well remove the WHILE and the TOP if you're going to introduce transactions.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
April 15, 2015 at 2:54 am
You need to wrap everything in a try/catch and handle the transaction accordingly.
You could use something similar to the stored procedure code template you can find here: http://spaghettidba.com/2011/07/08/my-stored-procedure-code-template/
This is how I would do it:
SET NOCOUNT ON;
SET XACT_ABORT,
QUOTED_IDENTIFIER,
ANSI_NULLS,
ANSI_PADDING,
ANSI_WARNINGS,
ARITHABORT,
CONCAT_NULL_YIELDS_NULL ON;
SET NUMERIC_ROUNDABORT OFF;
BEGIN TRANSACTION
DECLARE @LogTable TABLE (
Outcome bit,
RowsAffected int,
MessageText nvarchar(max)
);
BEGIN TRY
    DECLARE @RowCnt int = 1;
    DECLARE @TotalRowCnt int = 0;

    WHILE @RowCnt > 0
    BEGIN
        DELETE TOP (100000) FROM [dbo].[Table1]
        FROM [dbo].[Table2]
        INNER JOIN [dbo].[Table3] ON [Table2].[PracticeID] = [Table3].[PracticeID]
        INNER JOIN [dbo].[Table1] ON [Table3].[InputDevicePracticeID] = [Table1].[InputDevicePracticeID]
        WHERE [Table2].[PracticeID] = 55;

        SET @RowCnt = @@ROWCOUNT;
        SET @TotalRowCnt += @RowCnt;
    END
INSERT INTO @LogTable (Outcome, RowsAffected, MessageText)
VALUES (0, @TotalRowCnt, N'Operation completed successfully');
IF XACT_STATE() = 1
COMMIT TRANSACTION
END TRY
BEGIN CATCH
DECLARE @ErrorMessage NVARCHAR(4000)
DECLARE @ErrorSeverity INT
DECLARE @ErrorState INT
SELECT @ErrorMessage = ERROR_MESSAGE(),
@ErrorSeverity = ERROR_SEVERITY(),
@ErrorState = ERROR_STATE()
IF XACT_STATE() <> 0
ROLLBACK TRANSACTION
INSERT INTO @LogTable (Outcome, RowsAffected, MessageText)
VALUES (1, @TotalRowCnt, N'Operation completed with errors: ' + @ErrorMessage);
RAISERROR ( @ErrorMessage, @ErrorSeverity, @ErrorState)
END CATCH
-- Gianluca Sartori
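One detail worth spelling out about the template above: @LogTable is a table variable, and table variables are not undone by ROLLBACK, so the error row inserted in the CATCH block survives the rollback. To keep the log after the batch ends, it could be copied into a permanent table. A minimal sketch, assuming a hypothetical permanent log table named dbo.DeleteLog (not from the original post):

```sql
-- Persist the in-memory log; table variables survive ROLLBACK,
-- so this captures both the success and the failure rows.
INSERT INTO dbo.DeleteLog (TableName, Outcome, RowsAffected, MessageText)
SELECT N'dbo.Table1', Outcome, RowsAffected, MessageText
FROM @LogTable;
```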
April 15, 2015 at 2:54 am
Then how do we track the errors?
Since the table is very large, I cannot remove the batch delete process.
I wanted some mechanism to track which tables were deleted successfully and which failed.
Thanks,
Naveen J V
April 15, 2015 at 2:55 am
GilaMonster (4/15/2015)
If you wrap that in a transaction, then you're removing the point of running batched deletes in a loop, you may as well remove the while and the top if you're going to introduce transactions
Extremely good point.
-- Gianluca Sartori
April 15, 2015 at 3:09 am
Naveen J V (4/15/2015)
since table is very large, I cannot remove batch delete process.
But by adding a transaction you'll have that effect. Deletes are done in batches to keep the log from growing too large and to avoid lock escalation. By wrapping the loop in a transaction you're negating both, even if you leave the looping code in.
Gail Shaw
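A possible middle ground between the two approaches discussed above is to commit each batch in its own transaction: the log can still truncate between batches and lock escalation stays limited, and a failure rolls back only the current batch while still being captured. This does not give the all-or-nothing behaviour Naveen asked for, since earlier batches stay deleted. A sketch using the table names from the original post (dbo.DeleteLog is a hypothetical log table, an assumption):

```sql
DECLARE @RowCnt int = 1;

WHILE @RowCnt > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;

        DELETE TOP (100000) FROM [dbo].[Table1]
        FROM [dbo].[Table2]
        INNER JOIN [dbo].[Table3] ON [Table2].[PracticeID] = [Table3].[PracticeID]
        INNER JOIN [dbo].[Table1] ON [Table3].[InputDevicePracticeID] = [Table1].[InputDevicePracticeID]
        WHERE [Table2].[PracticeID] = 55;

        SET @RowCnt = @@ROWCOUNT;
        COMMIT TRANSACTION;  -- each committed batch lets the log truncate
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0
            ROLLBACK TRANSACTION;  -- only the failing batch is undone

        INSERT INTO dbo.DeleteLog (TableName, MessageText)  -- hypothetical log table
        VALUES (N'dbo.Table1', ERROR_MESSAGE());

        SET @RowCnt = 0;  -- stop the loop after logging the error
    END CATCH
END;
```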
April 15, 2015 at 3:19 am
Gail, you are absolutely right.
I am doing a batch delete due to the large size of the table. But if I get an error during the delete, how do I capture it?
If I get an error, I need to abort the operation and log the error.
April 15, 2015 at 3:24 am
Then you're either going to have to accept that the delete will have a more severe impact (because it's not batched), or do something complex like inserting the rows into another table as you delete them and putting them back if something fails.
Catching the error is easy; undoing the delete is the harder part.
Gail Shaw
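The "insert the rows into another table as you delete" idea Gail describes maps naturally onto the OUTPUT clause, which can capture the deleted rows in the same statement. A sketch under the assumption that dbo.Table1_Deleted is a hypothetical holding table with the same structure as dbo.Table1:

```sql
-- Each batch copies its deleted rows into the holding table as it deletes them.
DELETE TOP (100000) FROM [dbo].[Table1]
OUTPUT DELETED.* INTO dbo.Table1_Deleted
FROM [dbo].[Table2]
INNER JOIN [dbo].[Table3] ON [Table2].[PracticeID] = [Table3].[PracticeID]
INNER JOIN [dbo].[Table1] ON [Table3].[InputDevicePracticeID] = [Table1].[InputDevicePracticeID]
WHERE [Table2].[PracticeID] = 55;

-- On failure, restore everything deleted so far, then truncate the holding table.
-- (Identity columns would need SET IDENTITY_INSERT handling, omitted here.)
INSERT INTO [dbo].[Table1]
SELECT * FROM dbo.Table1_Deleted;
```

As Gail says, this is the complex part: the restore path has to be tested as carefully as the delete itself.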
April 15, 2015 at 3:54 am
Thanks Gail.
April 15, 2015 at 12:17 pm
Well, is your goal just to get rid of the data in the table, or do you have a reason you need to capture errors when deleting row by row?