Ola Hallengren's Maintenance Solution - Calling DatabaseBackup from another stored proc

  • I'm testing Ola Hallengren's maintenance solution in my environment (SQL Server 2008 R2), and one thing I want to improve is the automatic distinction between small and large databases. I want to set a database-size threshold, and any database larger than that threshold gets backed up to 20 files instead of 1 to speed things up.

    Now, I could of course do this manually (gather a list of large databases, then add steps to Ola's jobs that are identical to the regular steps but use the @NumberOfFiles = 20 argument), but I'd rather automate it and make it universal.

    So I wrote a stored procedure to handle the distinction between large databases and small databases, and then call DatabaseBackup twice. Here is the code.

    CREATE PROCEDURE [dbo].[MultiStageFullBackup]
    (
        @Databases nvarchar(max),
        @Directory nvarchar(max),
        @CleanupTime int,
        @NumberOfFiles int,
        @LargeDatabaseSizeThresholdMB int
    )
    /*
    This procedure builds a list of databases larger than @LargeDatabaseSizeThresholdMB, then calls
    Ola Hallengren's DatabaseBackup, passing the @Databases, @Directory and @CleanupTime parameters
    through to it. Other DatabaseBackup parameters are fixed as follows:

    @BackupType = 'FULL', @Verify = 'Y', @CheckSum = 'Y', @LogToTable = 'Y'

    Databases whose size exceeds @LargeDatabaseSizeThresholdMB are backed up to multiple files
    (as specified by the @NumberOfFiles parameter). All other databases are backed up to 1 file each.

    The most common @Databases parameter is 'USER_DATABASES', for example:

    EXECUTE DBA.dbo.MultiStageFullBackup @Databases = 'USER_DATABASES', @Directory = 'Z:\',
        @CleanupTime = 732, @NumberOfFiles = 20, @LargeDatabaseSizeThresholdMB = 20000
    */
    AS
    BEGIN
        DECLARE @LargeDBList nvarchar(max);

        -- mf.size is in 8 KB pages, so SUM(size) * 8 / 1024 converts to MB
        WITH Large_Databases_CTE AS
        (
            SELECT TOP 1000 d.name AS DatabaseName
            FROM sys.master_files mf
            INNER JOIN sys.databases d ON d.database_id = mf.database_id
            WHERE d.database_id > 4 -- exclude system databases
            GROUP BY d.name
            HAVING ROUND(SUM(mf.size) * 8 / 1024, 0) > @LargeDatabaseSizeThresholdMB
            ORDER BY ROUND(SUM(mf.size) * 8 / 1024, 0) DESC
        )
        SELECT @LargeDBList = COALESCE(@LargeDBList + ',', '') + DatabaseName
        FROM Large_Databases_CTE;

        -- Exclude the large databases from the first pass using Ola's minus syntax:
        -- a list like 'DB2,DB5' becomes the suffix ', -DB2,-DB5'
        SET @Databases = @Databases + ISNULL((', -' + REPLACE(@LargeDBList, ',', ',-')), '');

        -- Back up small databases
        EXECUTE DBA.[dbo].[DatabaseBackup]
            @Databases = @Databases, @Directory = @Directory, @BackupType = 'FULL',
            @Verify = 'Y', @CleanupTime = @CleanupTime, @CheckSum = 'Y', @LogToTable = 'Y';

        -- Back up large databases, if any, to multiple files each
        IF @LargeDBList IS NOT NULL
        BEGIN
            SET @Databases = @LargeDBList;

            EXECUTE DBA.[dbo].[DatabaseBackup]
                @Databases = @Databases, @Directory = @Directory, @BackupType = 'FULL',
                @NumberOfFiles = @NumberOfFiles, @Verify = 'Y', @CleanupTime = @CleanupTime,
                @CheckSum = 'Y', @LogToTable = 'Y';
        END
    END
    GO
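    The trickiest part of the procedure is the SET line that appends the exclusions: REPLACE turns the comma-separated list of large databases into Ola's exclusion syntax (a leading minus per name), and ISNULL collapses the whole suffix to an empty string when there are no large databases. Here is the same transformation sketched in Python, purely to illustrate the intent (not something you would deploy):

    ```python
    def build_exclusion(databases, large_list):
        # Mirror of:
        # SET @Databases = @Databases + ISNULL((', -' + REPLACE(@LargeDBList, ',', ',-')), '')
        if large_list is None:
            # In T-SQL, concatenating NULL yields NULL, which ISNULL turns into ''
            return databases
        return databases + ', -' + large_list.replace(',', ',-')

    # 'DB2,DB5' becomes an exclusion suffix appended to the original list
    print(build_exclusion('USER_DATABASES', 'DB2,DB5'))  # → USER_DATABASES, -DB2,-DB5
    print(build_exclusion('USER_DATABASES', None))       # → USER_DATABASES
    ```

    The first DatabaseBackup call then sees 'USER_DATABASES, -DB2,-DB5' and skips the large databases, which the second call picks up with @NumberOfFiles applied.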

    For the most part, it works fine. However, when I simulate network or storage failures, the backup fails with a Semaphore Timeout error. Since I use Red Gate SQL Backup, it can deal with this situation: it retries the backup, and normally it succeeds after one retry. But after the failed backup is retried (and succeeds), the backup job stops.

    To illustrate. Let's say we have databases DB1, DB2 and DB3, backed up by the backup job in that order.

    DB1 backs up successfully.

    DB2 experiences a Semaphore Timeout, Red Gate retries and successfully finishes the backup (it validates and can be restored).

    The job then stops and DB3 is never backed up.

    This only happens if I call Ola's DatabaseBackup from my stored procedure. If I call it directly from a job step using a CmdExec step, the backups proceed normally after Red Gate retries.

    Would anyone have a clue as to why I'm having this issue and what I could do to amend the stored procedure to avoid it?

  • Have you actually tested whether backing up to 20 files improves performance? I've seen backups to just 3 files absolutely kill performance because they didn't go to 3 different physical disks, and the motion of the RW heads made the backups take nearly twice as long.

    --Jeff Moden



  • Yes, I have indeed tested it, and it works for me (though your mileage may vary). In my environment, backing up larger databases to multiple files shaves a fairly significant amount of time off the maintenance window.

  • Ola himself graciously gave me the solution to this issue, and it worked.

    Instead of calling the stored procedure from a T-SQL job step, it should be called from a CmdExec step, using sqlcmd with the -b option.

    Example:

    sqlcmd -E -S $(ESCAPE_SQUOTE(SRVR)) -d DBA -Q "EXECUTE DBA.dbo.MultiStageFullBackup @databases = 'USER_DATABASES', @Directory = 'Z:\', @CleanupTime = 732, @NumberOfFiles = 20, @LargeDatabaseSizeThresholdMB = 20000;" -b

    When performed in this fashion, the backups behave correctly.
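    For anyone wondering why this helps: the -b switch makes sqlcmd exit with a nonzero ERRORLEVEL when a T-SQL error occurs, and a CmdExec job step decides success or failure from that exit code rather than from mid-batch error state. The same pattern, sketched in Python with a stand-in child process (since a live sqlcmd/SQL Server isn't assumed here):

    ```python
    import subprocess
    import sys

    # Stand-in for "sqlcmd ... -b": a child process that exits nonzero on error,
    # the way sqlcmd -b does when a T-SQL error is raised during the batch.
    result = subprocess.run([sys.executable, "-c", "raise SystemExit(1)"])

    # A CmdExec-style caller treats any nonzero exit code as step failure.
    step_failed = result.returncode != 0
    print(step_failed)  # → True
    ```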
