FOR ATTACH_REBUILD_LOG

  • We're testing some obfuscation (home-grown encryption) scripts against copies of our production database. The scripts need work because they run very long and tend to fill up the log, but in the meantime the developers are still testing them. Killing one of these after it has run for over 36 hours results in a very, very long rollback, and restarting SQL Server just makes database recovery take a very long time. If we don't care at all about any data changes that occurred during the run of these scripts, I'm wondering if we can detach the database, move the log file, and re-attach, possibly with this command: FOR ATTACH_REBUILD_LOG
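
    For reference, rebuilding the log on attach is a documented option of CREATE DATABASE. A minimal sketch, with the database name and file path as placeholders (note this only succeeds if the database was shut down cleanly, so it will not bypass a pending rollback):

    -- Detach first (assumes no active connections)
    EXEC sp_detach_db @dbname = N'ProdCopy';

    -- Attach the data file only; SQL Server creates a new, empty log
    CREATE DATABASE ProdCopy
        ON (FILENAME = N'D:\Data\ProdCopy.mdf')
        FOR ATTACH_REBUILD_LOG;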

  • If you don't mind the database being transactionally inconsistent at best and completely inaccessible at worst, sure, go right ahead. Just make sure you have backups, because you will run into a case where the DB won't reattach.

    If SQL has to do a rollback and you delete the log, the database will not reattach. Even with that command.

    If you don't want the changes, take a backup before you start then restore the backup once you're done.
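
    In T-SQL that is just a full backup before the run and a restore afterwards; the database name and backup path below are placeholders:

    -- Before the obfuscation run
    BACKUP DATABASE ProdCopy
        TO DISK = N'D:\Backup\ProdCopy_pre.bak'
        WITH INIT;

    -- After testing, discard all changes made by the scripts
    RESTORE DATABASE ProdCopy
        FROM DISK = N'D:\Backup\ProdCopy_pre.bak'
        WITH REPLACE;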

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Thanks for the reply. Sounds like we just need to break these update statements down into chunks. No records are being deleted or inserted; we're just changing things like a Name from Jones to Yones, or a Street Address from 333 Whiterose to 999 Smith. Since some of the tables have 50-100 million rows, I'm wondering if something like this block of code, which we use for deletions, could be adapted for update statements.

    BEGIN
        -- Number of rows to delete per batch
        DECLARE @N INT
        -- Total count of rows matching the specified criterion
        DECLARE @cnt DECIMAL(18,2)
        -- Number of times the DELETE statement should loop to delete all relevant records
        DECLARE @loops INT

        -- ten thousand per loop
        SET @N = 10000

        SELECT @cnt = COUNT(*) FROM user_history
        WHERE created_date_time < '2009-01-01'

        SET @loops = CEILING(@cnt/@N)
        PRINT @loops

        WHILE @loops > 0
        BEGIN
            DELETE TOP (@N) FROM user_history
            WHERE created_date_time < '2009-01-01'

            SET @loops = @loops - 1
            -- checkpoint ?
        END
    END

  • or this:

    WHILE (1 = 1)
    BEGIN
        BEGIN TRANSACTION

        UPDATE TOP (1000) [Order Details]
        SET Discount = 0.1
        WHERE Discount != 0.1 -- predicate for filtering

        IF @@ROWCOUNT = 0 -- terminating condition
        BEGIN
            COMMIT TRANSACTION
            BREAK
        END

        COMMIT TRANSACTION
        -- WAITFOR DELAY '00:00:01';
    END

  • Indianrock (7/31/2012)

    or this:

    WHILE (1 = 1)
    BEGIN
        BEGIN TRANSACTION

        UPDATE TOP (1000) [Order Details]
        SET Discount = 0.1
        WHERE Discount != 0.1 -- predicate for filtering

        IF @@ROWCOUNT = 0 -- terminating condition
        BEGIN
            COMMIT TRANSACTION
            BREAK
        END

        COMMIT TRANSACTION
        -- WAITFOR DELAY '00:00:01';
    END

    It's in a transaction, it's no different in terms of logging than a single update that affects all rows.

    If you want to do it in chunks, each chunk must be a separate transaction or all you've done is increase the amount of code written for the same result.

    p.s. 1000 rows is small. Start with 50000 and tweak from there.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
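
    Putting those suggestions together for the obfuscation updates (each chunk as its own autocommit transaction, batch size starting at 50000), a sketch might look like the following; the table, column, and values are placeholders based on the Jones/Yones example above, and the WHERE clause must exclude rows that have already been changed or the loop never terminates:

    DECLARE @batch INT;
    SET @batch = 50000;

    WHILE 1 = 1
    BEGIN
        -- No outer transaction: each UPDATE commits on its own,
        -- so killing the process only rolls back the current chunk
        UPDATE TOP (@batch) dbo.Customers
        SET LastName = 'Yones'
        WHERE LastName = 'Jones';

        IF @@ROWCOUNT = 0
            BREAK;

        CHECKPOINT; -- simple recovery: lets committed log space be reused
    END;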
  • OK, what we're after is: a) the ability to kill the process and have it come to rest fairly quickly (i.e. all prior transactions committed, so only the most recent needs to roll back), and b) not filling up the log on dev servers that can barely hold a copy of our 2TB prod database (it is in simple recovery when in DEV).
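
    To watch the log while such a loop runs, something like this (both standard commands; the database name is a placeholder) can be run from another session:

    -- Percentage of each transaction log currently in use
    DBCC SQLPERF(LOGSPACE);

    -- What, if anything, is preventing log reuse
    SELECT name, log_reuse_wait_desc
    FROM sys.databases
    WHERE name = N'ProdCopy';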
