Stopping Transaction Log Records

  • I have added a new column to a table Order_Facts and need to update the previous history records using data from another table.  The target table contains about 14,000,000 records.  But when I run the following SQL script, I get a message that the log file is full, the script ends, and I have to shrink the database log.

    Can I run the SQL script so that no transaction log records are written for this one-time update?

     

    Here is my SQL script:

    Update Order_Facts
    SET Order_Facts.No_Charge_Flag = PORTTAB.ortcha
    FROM PORTTAB
    WHERE Order_Facts.Order_Type = PORTTAB.ortcod

  • Richard

    Try setting the database recovery model to Simple before you do the update.  Make sure you change it back to Full afterwards and take a full backup immediately.

    John
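
    For reference, the recovery-model switch John describes can be scripted.  This is a sketch: the database name and backup path are placeholders, not values from the thread.

    ```sql
    -- Switch to Simple recovery so the log is truncated at each checkpoint.
    -- "YourDB" and the backup path are placeholders.
    ALTER DATABASE YourDB SET RECOVERY SIMPLE;

    -- ... run the large update here ...

    -- Switch back to Full and immediately take a full backup,
    -- which restarts the log chain for point-in-time recovery.
    ALTER DATABASE YourDB SET RECOVERY FULL;
    BACKUP DATABASE YourDB TO DISK = 'C:\Backups\YourDB.bak';
    ```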

  • Your best bet at this time would be to look at your data and query and see how you can break the update down into smaller batches rather than updating all 14,000,000 records at once -- even if you change the database recovery model to Simple.  If you leave the database in Full recovery while doing small batches, be sure to run a transaction log backup between batches so the transaction log gets truncated.

    hth,

    Lynn
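
    The per-batch log backup Lynn mentions is a one-liner; the database name and file path below are placeholders.

    ```sql
    -- Run between batches while the database is in Full recovery,
    -- so the log space used by the previous batch can be reused.
    BACKUP LOG YourDB TO DISK = 'C:\Backups\YourDB_log.trn';
    ```

    In Simple recovery there is nothing to back up; a CHECKPOINT between batches serves the same purpose of freeing log space.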

  • It is actually easy to increase the transaction log size: open the database, then open the Taskpad, and you should be able to increase it there.  The Taskpad is context sensitive.  If the current size is 200 MB you can increase it by, say, another 100 MB to 300 MB.  The size is limited only by your hard drive, and you can even move the log to a separate drive.  Remember to schedule the transaction-log-related DBCC statements to run as needed.  Hope this helps.

    Kind regards,

    Gift Peddie

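
    The same growth can be scripted instead of using the Taskpad.  This is a sketch: the database name and logical log-file name are placeholders (check sys.database_files or sp_helpfile for the real logical name).

    ```sql
    -- Grow the log file to 300 MB. NAME must match the log file's
    -- logical name, which is often, but not always, <dbname>_log.
    ALTER DATABASE YourDB
    MODIFY FILE (NAME = YourDB_log, SIZE = 300MB);
    ```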

  • The database is in Simple mode already, and I do not want to increase the log size due to limited disk space.  I will try to break the update into smaller batches and back up the log file in between.

     

    How can I break the update into smaller batches?  Is there SQL code to do it by number of records?  e.g. update records 1 through 50,000, then 50,001 through 100,000, and so on.

     

    Thanks again for the help!!

  • -- Perform 500K updates at a time. Adjust as necessary
    Set RowCount 500000

    Select GetDate() -- Seed @@RowCount so the loop starts

    While @@RowCount > 0
    Begin
      -- Update only rows that haven't yet been updated
      Update O
      Set No_Charge_Flag = P.ortcha
      From Order_Facts As O
      Inner Join PORTTAB As P
        On ( O.Order_Type = P.ortcod )
      Where ( O.No_Charge_Flag <> P.ortcha
           Or ( O.No_Charge_Flag Is Null And P.ortcha Is Not Null ) )

      -- Explicitly truncate your log here (BACKUP LOG in Full
      -- recovery, CHECKPOINT in Simple)
    End

    Set RowCount 0 -- Reset the row limit when finished
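
    One note on the loop above: on SQL Server 2005 and later, SET ROWCOUNT is deprecated for INSERT/UPDATE/DELETE statements.  A sketch of the same batching idea using UPDATE TOP (n) -- the batch size and the log-handling comment are suggestions, not part of the original post:

    ```sql
    -- Same batching loop using the supported TOP (n) form.
    WHILE 1 = 1
    BEGIN
        UPDATE TOP (500000) O
        SET    No_Charge_Flag = P.ortcha
        FROM   Order_Facts AS O
        INNER JOIN PORTTAB AS P
            ON O.Order_Type = P.ortcod
        WHERE  O.No_Charge_Flag <> P.ortcha
            OR (O.No_Charge_Flag IS NULL AND P.ortcha IS NOT NULL);

        IF @@ROWCOUNT = 0 BREAK;  -- nothing left to update

        -- Back up (Full recovery) or checkpoint (Simple) the log
        -- here between batches.
    END
    ```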

  • What I've done in the past is to use the modulo operator on a suitable numeric column.  Perhaps an identity column.

    Like so:

    ---

    Update SomeTable Set SomeColumn = SomeNewValue

    where IdentityColumn % 10 = 0

    ---

    then repeat, changing the where clause to

    where IdentityColumn % 10 = 1

    and so on, through 9.  It is not a very efficient way in terms of resource cost, but it takes only a few seconds to write and gets the job done.

    Now I have a big disk so I don't have to do it the hard way anymore.

    jg
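
    jg's repeat-by-hand steps can also be driven by a loop so the WHERE clause doesn't need editing between runs.  This is a sketch using jg's illustrative names (SomeTable, SomeColumn, IdentityColumn are placeholders); the inline DECLARE initializer needs SQL Server 2008 or later.

    ```sql
    -- Walk the ten modulo buckets (remainders 0 through 9) in turn.
    DECLARE @bucket int = 0;

    WHILE @bucket <= 9
    BEGIN
        UPDATE SomeTable
        SET    SomeColumn = SomeNewValue
        WHERE  IdentityColumn % 10 = @bucket;

        -- Back up or checkpoint the log here between buckets.
        SET @bucket = @bucket + 1;
    END
    ```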

