Somebody explain Checkpoint to me

  • I'm using the SQL below to delete about 110 million rows in batches of 100,000.

    It's working fine in that if I stop the script, everything up to that point stays deleted.

    The problem I have is that twice now the transaction log has filled up, killed the script, and SQL has rolled back further than just the last 100,000 rows.

    I thought that by issuing a CHECKPOINT, all dirty pages were written to disk and the transaction log was truncated, so I'm confused why the log is still filling up. The DB is in the simple recovery model.

    Does anyone have any advice?

    SET NOCOUNT ON;

    DECLARE @r INT;
    SET @r = 1;

    WHILE @r > 0
    BEGIN
        BEGIN TRANSACTION;

        DELETE TOP (100000)
        FROM tablename
        WHERE datefield < '2008-06-21 00:00:00';

        SET @r = @@ROWCOUNT;

        COMMIT TRANSACTION;
        CHECKPOINT;
    END
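
    In case it's useful, a quick way to see what is actually stopping the log from truncating under the simple recovery model. This is just a sketch using standard commands (log_reuse_wait_desc is available on SQL Server 2005 and later):

    -- Why can't the log be truncated right now?
    -- ACTIVE_TRANSACTION here usually means there is an open (possibly forgotten) transaction.
    SELECT name, log_reuse_wait_desc
    FROM sys.databases
    WHERE name = DB_NAME();

    -- How full is each database's log?
    DBCC SQLPERF(LOGSPACE);

    -- Details of the oldest active transaction in the current database, if there is one.
    DBCC OPENTRAN;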

  • planetmatt (4/29/2015)



    Not sure why your transaction log is filling up, but I have a question. You say you are trying to delete 110,000,000 rows of data; how many rows of data are you keeping?

  • And can you have some down time to complete the deletion of data?

  • Lynn Pettis (4/29/2015)


    And can you have some down time to complete the deletion of data?

    There are 737 million rows in total. I need to delete all records related to a master record created before a cut-off date, before a data migration team migrates the data into a new system.

    Yes, I will have a downtime window to delete the data, and right now I'm running scripts against a test system to figure out timings for the actual live clear-down. I estimate this table and the deletion of 111 million rows will take about 8 hours in batches of 100,000.

    I need to be able to leave this script running overnight, but I can't if the log fills up and it rolls back millions of rows.

  • I suspect you had a nested transaction. If you tried to test the script by stopping it, you may have created a nested transaction the second time you ran it.

    SELECT @@trancount
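
    For anyone else who hits this: BEGIN TRANSACTION doesn't really nest in SQL Server. An inner COMMIT only decrements @@TRANCOUNT, and the log can never be truncated past the oldest open transaction, so one forgotten outer transaction keeps every later batch "active" in the log. A minimal sketch of the behaviour:

    BEGIN TRANSACTION;     -- forgotten outer transaction (e.g. from a stopped script)
    BEGIN TRANSACTION;     -- the loop's transaction
    SELECT @@TRANCOUNT;    -- 2
    COMMIT TRANSACTION;    -- only decrements the counter
    SELECT @@TRANCOUNT;    -- still 1, nothing has actually been committed
    CHECKPOINT;            -- cannot free log space past the oldest active transaction
    ROLLBACK TRANSACTION;  -- rolls back everything since the outer BEGIN TRANSACTION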

  • Bill Talada (4/29/2015)


    I suspect you had a nested transaction. If you tried to test the script by stopping it, you may have created a nested transaction the second time you ran it.

    SELECT @@trancount

    Doh, yeah, 3 open trans. How stupid? Thanks. I've committed all the open trans and am going to rerun it now and leave it overnight. I fully expect the log not to fill up this time. Total schoolboy error.

    Thanks so much.

  • This usually does the trick, placed at the top of a test script:

    -- OPEN trans?
    WHILE @@trancount > 0 COMMIT TRAN
    GO

  • Two things:

    If the only statements inside your transaction are the DELETE and the SET, you don't need an explicit transaction at all.

    If you're going to use explicit transactions, you should use TRY...CATCH blocks for error handling, and if you stop the batch, be sure to commit or roll back the transaction. A rough sketch of the loop with that error handling is below.
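
    Not the original poster's exact code, just a sketch of the same loop with TRY...CATCH wrapped around the explicit transaction (tablename and datefield are the placeholders from the script above; THROW needs SQL Server 2012+, use RAISERROR on older versions):

    SET NOCOUNT ON;

    DECLARE @r INT;
    SET @r = 1;

    WHILE @r > 0
    BEGIN
        BEGIN TRY
            BEGIN TRANSACTION;

            DELETE TOP (100000)
            FROM tablename
            WHERE datefield < '2008-06-21 00:00:00';

            SET @r = @@ROWCOUNT;

            COMMIT TRANSACTION;
            CHECKPOINT;
        END TRY
        BEGIN CATCH
            -- Roll back the failed batch so no transaction is left open,
            -- then stop and report the error.
            IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
            THROW;
        END CATCH
    END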

    Luis C.
