Possible deferred transaction

  • We ran a test yesterday: SQL Server 2005 Enterprise. We started an insert of all the records in one large table into an empty copy of that table in another filegroup/file.

    Despite log backups every 15 minutes, the log was autogrowing and nearing the limit of space in its folder, so I cancelled the transaction. Log backups continued automatically every 15 minutes. We then switched to simple recovery mode and ran again. Less logging, but still log growth, so we aborted a second time.

    To finish up, I resumed full recovery mode, ran a differential backup, and re-enabled the automatic log backups, which are occurring. Today I see the log is pretty full and has done one autogrow this morning. The log reuse info in sys.databases indicates "active transaction" as the reason preventing log truncation.
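    The log reuse wait mentioned above can be checked directly from sys.databases; a minimal sketch (the database name is a placeholder):

    ```sql
    -- Show what is currently preventing log truncation for a database
    SELECT name, log_reuse_wait_desc
    FROM sys.databases
    WHERE name = 'YourDatabase';
    -- ACTIVE_TRANSACTION means an open transaction is pinning the log
    ```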

    Should I restart SQL Server, or wait until this evening and do a full backup followed by a restart?

  • The log can only be truncated up to the beginning of the oldest active transaction. If you're inserting an entire (large) table in a single statement, that runs as one transaction, and hence the log won't be able to truncate past it.

    Maybe do the insert in chunks, not in an explicit transaction; that will allow the log space to be reused.
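    A chunked insert might look something like this (table and column names are illustrative); each batch commits on its own, so intervening log backups can reclaim the space:

    ```sql
    -- Sketch: copy a large table in batches instead of one giant transaction.
    -- dbo.BigTable / dbo.BigTable_Copy and the key column ID are placeholders.
    DECLARE @rows INT;
    SET @rows = 1;
    WHILE @rows > 0
    BEGIN
        INSERT INTO dbo.BigTable_Copy (ID, Col1)
        SELECT TOP (50000) s.ID, s.Col1
        FROM dbo.BigTable s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.BigTable_Copy c WHERE c.ID = s.ID);
        -- Each iteration is its own implicit transaction
        SET @rows = @@ROWCOUNT;
    END;
    ```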

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • We're not trying to continue that insert now; we're just trying to get the log truncated, as would normally happen with a log backup. Any suggestions?

  • SOLVED! Thanks for your input, Gail; we will have to consider smaller inserts. I looked in our Spotlight tool and noticed the SPID of the oldest transaction in our log file. It was an inactive connection from before our file insert test. I killed that SPID, ran a log backup, and all was good. The log is nearly empty of "actual data" now.

    Moral of the story? Single-user mode before such tests???
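    For the record, the session holding the oldest open transaction can also be found with the SQL Server 2005 DMVs; a sketch (the SPID, database name, and backup path are illustrative):

    ```sql
    -- List sessions with open transactions, oldest first
    SELECT st.session_id, at.transaction_begin_time
    FROM sys.dm_tran_session_transactions st
    JOIN sys.dm_tran_active_transactions at
      ON st.transaction_id = at.transaction_id
    ORDER BY at.transaction_begin_time;

    KILL 53;  -- SPID 53 is an example; use the session_id reported above
    BACKUP LOG YourDatabase TO DISK = 'X:\Backups\YourDatabase_log.trn';
    ```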

  • Indianrock (10/26/2009)


    Moral to the story? Single user mode before such tests???

    I would say .... DBCC OPENTRAN
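    A minimal example of that check (the database name is a placeholder):

    ```sql
    -- Reports the oldest active transaction in the database,
    -- including its SPID and start time
    DBCC OPENTRAN ('YourDatabase');
    ```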

    Johan

    Learn to play, play to learn !

    Dont drive faster than your guardian angel can fly ...
    but keeping both feet on the ground wont get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution πŸ˜€

    Need a bit of Powershell? How about this

    Who am I ? Sometimes this is me but most of the time this is me

  • ALZDBA (10/26/2009)



    I would say .... DBCC Opentran

    I would say .... Use the test system, that's what it is for...;-)

    Jeffrey Williams
    β€œWe are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Jeffrey Williams-493691 (10/26/2009)



    I would say .... Use the test system, that's what it is for...;-)

    Thanks for the info about DBCC OPENTRAN, ALZDBA. Jeffrey, as far as the test system goes, yes, the whole project was run there. However, that is a standalone SQL Server, not clustered, with no SAN (NetApp in our case), no regular backups running, etc. There isn't going to be a test environment any time soon that closely resembles our production environment, and that's probably the case for more firms than not, especially these days.

    Anyway, it was a learning experience. In the test area, in simple recovery, we had no log growth, so I'm guessing the database connections that existed prior to the test (and which restricted-user mode would have gotten rid of) created the open transaction in the log that log backups wouldn't clear until I killed that session.

  • I hear you - it is the same here also. I wasn't really trying to ruffle any feathers - I hope it didn't come across that way. πŸ™‚

    Jeffrey Williams
    β€œWe are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Not at all. In fact, if I learn something, ruffle away. I was trying to minimize impact on our production environment yesterday (by not setting simple recovery initially and not setting single_user or restricted_user), but now I think that was a mistake.

    It probably should be: set RESTRICTED_USER, take a log backup, set SIMPLE recovery, do the test, set FULL recovery, set MULTI_USER, take a differential backup, resume log backups.
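    That sequence could be sketched like this (database name and backup paths are placeholders):

    ```sql
    -- Fence off other connections before the test
    ALTER DATABASE YourDatabase SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;
    BACKUP LOG YourDatabase TO DISK = 'X:\Backups\YourDatabase_log.trn';
    ALTER DATABASE YourDatabase SET RECOVERY SIMPLE;

    -- ... run the test here ...

    ALTER DATABASE YourDatabase SET RECOVERY FULL;
    ALTER DATABASE YourDatabase SET MULTI_USER;
    -- A differential (or full) backup restarts the log backup chain
    -- after the switch back from SIMPLE to FULL recovery
    BACKUP DATABASE YourDatabase
        TO DISK = 'X:\Backups\YourDatabase_diff.bak' WITH DIFFERENTIAL;
    -- then resume the scheduled log backups
    ```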

  • I forgot to mention that the test we just did was in preparation for a large-scale project where we will split the database out into numerous new files for better IO.

    I suppose instead of a big INSERT INTO ... SELECT * FROM ... we could have created a copy of the table and used the new ALTER TABLE with "MOVE TO" syntax (which is what we'll actually use for our project). But "MOVE TO" is probably doing something similar to the INSERT ... SELECT anyway.

    I'm not sure there is any way to break these sometimes-large transactions up into chunks. Simple recovery will be the key to keeping our log from exceeding available disk space.

    In SQL Server 2005, a new MOVE TO clause was added to the ALTER TABLE command to allow moving a table to a different filegroup. The MOVE TO clause is used along with the DROP CONSTRAINT clause in the ALTER TABLE syntax: dropping the clustered primary key WITH (MOVE TO ...) rebuilds the underlying table on the target filegroup.

    ALTER TABLE TAB1 DROP CONSTRAINT PK_TAB1 WITH (MOVE TO TEST_DATA_2)
    GO
    -- Recreate the primary key ON the target filegroup; without the ON clause,
    -- the new clustered index (and with it the table) would be rebuilt on the
    -- default filegroup, undoing the move
    ALTER TABLE TAB1 ADD CONSTRAINT PK_TAB1 PRIMARY KEY(TAB1_ID) ON TEST_DATA_2
    GO
