Is there a way to control log file size in Log Shipping configuration

  • Hi,

    We have SQL Server 2005 EE x64 with SP3, and we have configured log shipping; the backup, copy, and restore jobs run every 30 minutes.

    Sometimes 500 MB of log is generated within 30 minutes, and sometimes only 10 MB. When the log backup is 500 MB, copying it to our secondary location over the network is really slow and impacts the users.

    So I'm looking for an option in log shipping to control the log backup size, something like: if 50 MB of log has been generated, then perform a log backup. But how can we tell that 50 MB of log has been generated in the .ldf file and trigger the log backup, instead of running the backup every 30 minutes?

    Thanks for your inputs

  • How about creating a SQL Agent Alert that fires when the log file is bigger than 50 MB? Get it to start a job that does the log backup.

    The alert would have the following properties:

    type: SQL Server performance alert condition

    object: SQLServer:Databases

    counter: Log File(s) Size (KB)

    instance: the database you are log shipping

    alert if counter "rises above" value "50000"

    And the response would be to start the log backup job
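
    If you prefer to script the alert rather than use the GUI, something along these lines should be close (a rough sketch, not tested against your setup; the alert name, database name, and job name are placeholders, and on a named instance the performance object is MSSQL$InstanceName:Databases rather than SQLServer:Databases):

    -- sketch only: alert, database, and job names are placeholders
    EXEC msdb.dbo.sp_add_alert
        @name = N'Log size over 50 MB',
        @performance_condition = N'SQLServer:Databases|Log File(s) Size (KB)|YourDatabase|>|50000',
        @delay_between_responses = 300,   -- wait 5 minutes before responding again
        @job_name = N'LSBackup_YourDatabase';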

  • Thank you. I have tried this option and did the following:

    1. Removed the existing schedule for the log shipping backup job

    2. Created the alert with the options above, but with a threshold of 1024 KB for test purposes

    3. Configured the alert to notify the DBA group

    4. The .ldf file of the database is 13 GB, so the DBA group is getting emails several times per minute and the log backup job keeps firing, but the backed-up log is less than 450 KB

    That means it's not a working solution. Then I selected the counter 'Log Growths', but even when more than 1024 KB of log is generated, the log shipping backup job is NOT firing (I made sure by running the rebuild index task for that database; after it finished, I manually ran the LS backup job and could see the backed-up log size was 4 GB). So this option is also not working.

    Please advise.

    Thanks

  • Hi,

    Did anybody configure log shipping to ship the log to the secondary based on a fixed size instead of every 5 or 15 minutes? For example, if 50 MB of log has been generated, then start the log shipping log backup.

    Thanks

  • Any more thoughts on how to implement this?

  • Occam's razor says that the obvious answer is usually the correct one.

    It's a simple fix. You simply need to schedule the process to occur more often. That's it. You don't need any complex algorithms to fire it off. Just increase the frequency to every 5, 10, or 15 minutes.
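
    For example, you could point the LS backup job's schedule at a shorter interval, either through the log shipping configuration dialog or with something like this (a sketch; the schedule name below is a placeholder for whatever your log shipping backup schedule is actually called):

    -- sketch: change an existing Agent schedule to run every 5 minutes
    -- the schedule name is a placeholder
    EXEC msdb.dbo.sp_update_schedule
        @name = N'LSBackupSchedule_YourPrimaryServer',
        @freq_type = 4,               -- daily
        @freq_interval = 1,
        @freq_subday_type = 4,        -- interval unit: minutes
        @freq_subday_interval = 5;    -- every 5 minutes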


    My blog: SQL Soldier
    SQL Server Best Practices
    Twitter: @SQLSoldier
    My book: Pro SQL Server 2008 Mirroring
    Microsoft Certified Master: SQL Server, Data Platform MVP
    Database Engineer at BlueMountain Capital Management

  • Robert is correct on this. To control log file size in log shipping, back up the log more frequently.

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • What about using the following query to check the log size?

    -- 'size' is in 8-KB pages, so this returns the allocated log file size in MB
    -- (this is the allocated size, not how much log has been generated since the last backup)
    select sum(size*8/1024) as LogSizeMB from sys.database_files where type = 1

    http://msdn.microsoft.com/en-us/library/ms174397.aspx

  • rambilla4 (2/10/2010)


    Hi,

    Did anybody configure log shipping to ship the log to the secondary based on a fixed size instead of every 5 or 15 minutes? For example, if 50 MB of log has been generated, then start the log shipping log backup.

    Thanks

    The theory behind this seems like it may work.

    You would need a stored proc that could check the used size of the log file, perform your predefined condition checks, and then execute the Tlog backup job if conditions are met. The stored proc that performs all of this would need to run frequently on your server from a SQL Agent job.
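
    A minimal sketch of that idea, assuming SQL Server 2005 (untested; the database name, threshold, and job name are placeholders, and it reads used log space from DBCC SQLPERF(LOGSPACE)):

    -- sketch: fire the LS backup job once used log space exceeds a threshold
    CREATE PROCEDURE dbo.usp_ConditionalLogBackup
        @DatabaseName sysname = N'YourDatabase',   -- placeholder
        @ThresholdMB  int     = 50
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @LogSpace TABLE
        (
            DatabaseName    sysname,
            LogSizeMB       float,
            LogSpaceUsedPct float,
            Status          int
        );

        -- DBCC SQLPERF(LOGSPACE) reports log size (MB) and % used per database
        INSERT INTO @LogSpace
        EXEC ('DBCC SQLPERF(LOGSPACE)');

        IF EXISTS (SELECT 1
                   FROM @LogSpace
                   WHERE DatabaseName = @DatabaseName
                     AND LogSizeMB * LogSpaceUsedPct / 100.0 > @ThresholdMB)
        BEGIN
            -- placeholder job name: the log shipping backup job for this database
            EXEC msdb.dbo.sp_start_job @job_name = N'LSBackup_YourDatabase';
        END
    END

    You would then schedule a SQL Agent job to run this proc every minute or two, rather than scheduling the log backup itself.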

    The alternative of course would be to just have the Log Backup occur more frequently and then log ship more frequently.

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events

  • Given the stated problem, i.e. that copying 500 MB every 30 minutes is causing serious user impact, I really don't see how doing some sort of trickery to log ship according to log size would help anyway; the bandwidth usage is going to be pretty much the same because the same amount of change is being written to the transaction log! All that would happen if you set it to log ship every 50 MB is that you'd get 10 separate copies happening over that 30 minutes rather than the single 500 MB one. The same applies if you set a shorter log shipping interval. The only advantage is that you'd get slowdowns every 5 minutes instead of every half an hour.

    Wouldn't you be better off looking at some means of throttling the log shipping traffic if it's hogging all your bandwidth?

  • paul.knibbs (2/17/2010)


    Wouldn't you be better off looking at some means of throttling the log shipping traffic if it's hogging all your bandwidth?

    You can also upgrade to SQL Server 2008 and compress the log backups.
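
    For what it's worth, in 2008 the log shipping backup settings include a compression option, and a plain log backup can do the same thing. A quick illustration (SQL Server 2008 or later only; the database name and path are placeholders):

    -- SQL Server 2008+ only; database name and path are placeholders
    BACKUP LOG YourDatabase
    TO DISK = N'\\secondary\logship\YourDatabase.trn'
    WITH COMPRESSION;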


    My blog: SQL Soldier
    SQL Server Best Practices
    Twitter: @SQLSoldier
    My book: Pro SQL Server 2008 Mirroring
    Microsoft Certified Master: SQL Server, Data Platform MVP
    Database Engineer at BlueMountain Capital Management

  • Another alternative is to determine whether 500 MB every 30 minutes represents legitimate transactions.

    I ran across a scenario where nearly 1 TB of transaction log was generated per day due to some of the methods employed. We were not doing that volume of real transactions. The root cause was the creation and dropping of tables (not temp tables) in the database throughout the day for various queries and processes. Changing the code significantly improved this behavior and significantly reduced our logs. Check your processes and verify that nothing is creating transaction log volume that shouldn't be.
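
    If you want to see what is actually generating the log volume, one rough way (using fn_dblog, which is undocumented and unsupported, so treat it as a diagnostic only) is to group the active log records by operation:

    -- fn_dblog is undocumented; for investigation only
    SELECT Operation, Context,
           COUNT(*) AS LogRecords,
           SUM([Log Record Length]) AS TotalBytes
    FROM fn_dblog(NULL, NULL)
    GROUP BY Operation, Context
    ORDER BY TotalBytes DESC;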

    Jason...AKA CirqueDeSQLeil
    _______________________________________________
    I have given a name to my pain...MCM SQL Server, MVP
    SQL RNNR
    Posting Performance Based Questions - Gail Shaw
    Learn Extended Events
