April 29, 2015 at 9:51 am
I'm using the SQL below to delete about 110 million rows in batches of 100,000.
It's working fine in that if I stop the script, everything up to that point stays deleted.
The problem I have is that twice now, the transaction log has filled up, killed the script, and SQL has rolled back further than just the last 100,000 rows.
I thought that by issuing a CHECKPOINT, all dirty pages were written to disk and the trans log was truncated, so I'm confused why the log is still filling up. The DB is in the SIMPLE recovery model.
Does anyone have any advice?
SET NOCOUNT ON;

DECLARE @r INT;
SET @r = 1;

WHILE @r > 0
BEGIN
    BEGIN TRANSACTION;

    DELETE TOP (100000)
    FROM tablename
    WHERE datefield < '2008-06-21 00:00:00';

    SET @r = @@ROWCOUNT;

    COMMIT TRANSACTION;
    CHECKPOINT;
END
April 29, 2015 at 9:55 am
planetmatt (4/29/2015)
I'm using the SQL below to delete about 110 million rows in batches of 100,000. It's working fine in that if I stop the script, everything up to that point stays deleted.
The problem I have is that twice now, the transaction log has filled up, killed the script, and SQL has rolled back further than just the last 100,000 rows.
I thought that by issuing a CHECKPOINT, all dirty pages were written to disk and the trans log was truncated, so I'm confused why the log is still filling up. The DB is in the SIMPLE recovery model.
Does anyone have any advice?
Not sure why your transaction log is filling up, but I have a question: you say you are trying to delete 110,000,000 rows of data; how many rows are you keeping?
April 29, 2015 at 9:57 am
And can you have some downtime to complete the deletion of the data?
April 29, 2015 at 10:02 am
Lynn Pettis (4/29/2015)
And can you have some downtime to complete the deletion of the data?
There are 737 million rows in total. I need to delete all records related to master records created before a cut-off date, before the data migration team migrates the data into the new system.
Yes, I will have a downtime window to delete the data, and right now I'm running scripts against a test system to figure out timings for the actual live clear-down. I estimate this table and the deletion of 111 million rows will take about 8 hours in batches of 100,000.
I need to be able to leave this script running overnight, but I can't if the log fills up and it rolls back millions of rows.
April 29, 2015 at 10:03 am
I suspect you had a nested transaction. If you tried to test the script by stopping it, you may have created a nested transaction the second time you ran it.
SELECT @@trancount
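If that comes back greater than zero, a stopped run left a transaction open. A quick sketch of why that matters (illustrative only, not your actual table): in SQL Server an inner COMMIT only decrements @@TRANCOUNT, nothing is hardened until the outermost transaction commits, and while that outer transaction stays open the log space it pins can't be truncated even in SIMPLE recovery, so CHECKPOINT doesn't help. A rollback then goes all the way back to the outermost BEGIN TRANSACTION, which would match the behaviour you saw.
-- Illustrative sketch: a leftover outer transaction defeats the batching
BEGIN TRANSACTION;      -- left open by a stopped run: @@TRANCOUNT = 1

BEGIN TRANSACTION;      -- the batch script's own transaction: @@TRANCOUNT = 2
-- DELETE TOP (100000) ... batch work happens here
COMMIT TRANSACTION;     -- only decrements the count to 1; nothing is actually committed yet
CHECKPOINT;             -- cannot truncate log space still held by the open outer transaction

SELECT @@TRANCOUNT;     -- returns 1: the giveaway

ROLLBACK TRANSACTION;   -- rolls back to the outermost BEGIN TRAN, undoing every batch since it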
April 29, 2015 at 10:07 am
Bill Talada (4/29/2015)
I suspect you had a nested transaction. If you tried to test the script by stopping it, you may have created a nested transaction the second time you ran it.
SELECT @@trancount
Doh, yeah, 3 open transactions. How stupid am I? Thanks. I've committed all the open transactions and I'm going to rerun it now and leave it overnight. I fully expect the log not to fill up now. Total schoolboy error.
Thanks so much.
April 29, 2015 at 11:50 am
This usually does the trick, placed at the top of a test script:
-- OPEN trans?
WHILE @@trancount > 0 COMMIT TRAN
GO
April 29, 2015 at 11:55 am
2 things:
If you only have the DELETE and the SET inside your transaction, you don't need an explicit transaction.
If you're going to use explicit transactions, you should use TRY...CATCH blocks for error handling, and if you stop the batch, be sure to commit or roll back the transaction.
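Something along these lines would do it. This is only a rough sketch reusing the tablename/datefield placeholders from the script above, and THROW needs SQL Server 2012 or later (use RAISERROR on older builds):
SET NOCOUNT ON;
SET XACT_ABORT ON;  -- make sure a severe error (e.g. a full log) doesn't leave a transaction half open

DECLARE @r INT;
SET @r = 1;

WHILE @r > 0
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;

        DELETE TOP (100000)
        FROM tablename
        WHERE datefield < '2008-06-21 00:00:00';

        SET @r = @@ROWCOUNT;

        COMMIT TRANSACTION;
        CHECKPOINT;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;  -- undo the failed batch only; earlier committed batches stay deleted
        THROW;                     -- surface the error instead of silently looping
    END CATCH
END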