Automation of TLOG Restoration

  • I have a folder location on a SQL Server that contains daily TLOG backups; new files get added to it every day.

    I need a script that I can point at the folder location and that will restore the latest TRN file to a database. I don't want to specify the name of the TLOG I am restoring, as there will be multiple TRN files inside with different names; I just want to restore the latest one.


    • This topic was modified 5 years, 4 months ago by Wecks.

  • Hi, Log shipping does these exact steps.  If you have already considered Log Shipping, what different behavior are you looking for?

  • Are you keeping a database in norecovery mode (or standby)? If so, log shipping would be easier.

    Are you removing old log files from here? There is no good way to just know which file to restore. A script has to walk through them all, or has to know what has already been restored to a database.

  • If you want to be able to list the latest file in a directory - you can use Powershell:

    PS> dir | Sort LastWriteTime | Select -Last 1

    If there are other file types - you can filter them using -Exclude or through the pipe-line with a where:

    PS> dir -Exclude *.bak | Sort LastWriteTime | Select -Last 1
    PS> dir | ? {$_.Extension -ne '.bak'} | Sort LastWriteTime | Select -Last 1

    You can expand on this to use Invoke-SqlCmd to query the destination to check if that file has already been restored and even perform the restore.  Lots of possibilities...

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
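    Tying those pieces together, a minimal sketch of a "restore the newest TRN" script. This assumes the SqlServer PowerShell module (for Invoke-Sqlcmd) and a reachable instance; the folder, server, database name, and undo-file path are all placeholders:

    ```powershell
    # Pick the newest .trn file by write time (not by name sort)
    $folder = '\\SERVER\SHARE\TLogs'    # placeholder path
    $latest = Get-ChildItem $folder -Filter *.trn |
              Sort-Object LastWriteTime |
              Select-Object -Last 1

    # Restore it WITH STANDBY so the database stays readable between restores
    $sql = @"
    RESTORE LOG [DBNAME]
    FROM DISK = N'$($latest.FullName)'
    WITH STANDBY = N'\\SERVER\SHARE\TLogs\undo_DBNAME.dat';
    "@
    Invoke-Sqlcmd -ServerInstance 'SERVER\INSTANCE' -Query $sql
    ```

    Before the restore step, you would query the destination (as Jeffrey suggests) to confirm that the file has not already been applied.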

  • Hi Steve,


    So the background is that the log files are being TLOG-shipped from an external company once a day to a file share. The external company has provided us with the latest full backup and the logs to bring the DB up to date, and will continue to provide TLOG files on a once-a-day schedule. These need to be restored in standby mode so that the data can still be accessed in read-only mode. The issue is that we don't want to have to restore the logs manually.

    The script needs to check a local location which will contain lots of logs and will restore the latest one.


  • OK, the process I'd follow is to treat these like ETL. I want to "load" the files and then move them.

    My script would need to:

    • Find the next file to load. This is likely a RESTORE HEADERONLY against each candidate file to determine the next LSN I need.
    • Load this file
    • move this file to another folder
    • repeat

    You can check restorehistory, which should have the last file restored. If the files are named consistently, scripting is easier, but you still need to scan them to find the next file to restore.
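    A sketch of that restorehistory check (DBNAME is a placeholder): it returns the most recently restored log backup for the database, including the file that was applied and the LSN the next log file must start from.

    ```sql
    -- Most recent log restore recorded for the database ('L' = log restore)
    SELECT TOP (1)
           rh.restore_date,
           bmf.physical_device_name,   -- the .trn file that was restored
           bs.last_lsn                 -- the next log file must start at this LSN
    FROM msdb.dbo.restorehistory AS rh
    JOIN msdb.dbo.backupset AS bs
        ON bs.backup_set_id = rh.backup_set_id
    JOIN msdb.dbo.backupmediafamily AS bmf
        ON bmf.media_set_id = bs.media_set_id
    WHERE rh.destination_database_name = N'DBNAME'
      AND rh.restore_type = 'L'
    ORDER BY rh.restore_date DESC;
    ```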

  • The following script can be used to set up standard log shipping as well as disconnected log shipping; for the purpose of disconnected log shipping, I have modified it slightly.

    I recommend that you first restore a full copy of the database to the secondary server. The job lets you define where the full backup is located and will create a copy job, but for best results, manually take a full backup of the database you want to implement log shipping for and restore it in Standby mode (this is a SQL Server Enterprise feature), which is required if you want to access the data in read-only mode. As the copy job is not required, I have modified the script to note that the job can be deleted.

    Once the database has been restored in standby mode, you can run this script on the secondary server. After copying the script into SQL, adjust the variables; I have added comments on what you don't need to concern yourself with. The script creates a restore job that utilises sqllogship.exe, located in C:\Program Files\Microsoft SQL Server\110\Tools\Binn. It works like a charm without having to specify the order in which to apply the logs. The script will also create two other jobs: a copy job that can be deleted, and a job that will alert if log shipping fails. Schedules will be created in line with my requirements, so feel free to modify them in the script, or after they have been created if you want to use the SSMS UI.

    DECLARE @LS_Secondary__CopyJobId AS uniqueidentifier
    DECLARE @LS_Secondary__RestoreJobId AS uniqueidentifier
    DECLARE @LS_Secondary__SecondaryId AS uniqueidentifier
    DECLARE @LS_Add_RetCode AS int

    EXEC @LS_Add_RetCode = master.dbo.sp_add_log_shipping_secondary_primary
        @primary_server = N'server\instance' -- If implementing disconnected TLOG shipping, leave as server\instance
        ,@primary_database = N'DBNAME' -- Recommend the full backup is restored in Standby mode before running this script
        ,@backup_source_directory = N'\\NOT_REQUIRED_COPYJOB_CAN_BE_DELETED\na' -- Not required
        ,@backup_destination_directory = N'\\SERVER\SHARE' -- Create a share that points to where the TRN files will be deposited
        ,@copy_job_name = N'NOT_REQUIRED-COPYJOB_CAN_BE_DELETED' -- This creates a copy job, but the job can be deleted afterwards
        ,@restore_job_name = N'TLOG_RESTORE_JOB_DBNAME' -- This creates the job that restores the TLOG files
        ,@file_retention_period = 4320
        ,@overwrite = 1 -- Overwrite any existing log shipping settings for this database
        ,@copy_job_id = @LS_Secondary__CopyJobId OUTPUT
        ,@restore_job_id = @LS_Secondary__RestoreJobId OUTPUT
        ,@secondary_id = @LS_Secondary__SecondaryId OUTPUT

    IF (@@ERROR = 0 AND @LS_Add_RetCode = 0)
    BEGIN
        DECLARE @LS_SecondaryCopyJobScheduleUID AS uniqueidentifier
        DECLARE @LS_SecondaryCopyJobScheduleID AS int

        EXEC msdb.dbo.sp_add_schedule
            @schedule_name = N'DefaultCopyJobSchedule'
            ,@enabled = 1
            ,@freq_type = 4
            ,@freq_interval = 1
            ,@freq_subday_type = 4
            ,@freq_subday_interval = 15
            ,@freq_recurrence_factor = 0
            ,@active_start_date = 20191014
            ,@active_end_date = 99991231
            ,@active_start_time = 0
            ,@active_end_time = 235900
            ,@schedule_uid = @LS_SecondaryCopyJobScheduleUID OUTPUT
            ,@schedule_id = @LS_SecondaryCopyJobScheduleID OUTPUT

        EXEC msdb.dbo.sp_attach_schedule
            @job_id = @LS_Secondary__CopyJobId
            ,@schedule_id = @LS_SecondaryCopyJobScheduleID

        DECLARE @LS_SecondaryRestoreJobScheduleUID AS uniqueidentifier
        DECLARE @LS_SecondaryRestoreJobScheduleID AS int

        EXEC msdb.dbo.sp_add_schedule
            @schedule_name = N'DefaultRestoreJobSchedule'
            ,@enabled = 1
            ,@freq_type = 4
            ,@freq_interval = 1
            ,@freq_subday_type = 4
            ,@freq_subday_interval = 15
            ,@freq_recurrence_factor = 0
            ,@active_start_date = 20191014
            ,@active_end_date = 99991231
            ,@active_start_time = 0
            ,@active_end_time = 235900
            ,@schedule_uid = @LS_SecondaryRestoreJobScheduleUID OUTPUT
            ,@schedule_id = @LS_SecondaryRestoreJobScheduleID OUTPUT

        EXEC msdb.dbo.sp_attach_schedule
            @job_id = @LS_Secondary__RestoreJobId
            ,@schedule_id = @LS_SecondaryRestoreJobScheduleID
    END

    DECLARE @LS_Add_RetCode2 AS int

    IF (@@ERROR = 0 AND @LS_Add_RetCode = 0)
    BEGIN
        EXEC @LS_Add_RetCode2 = master.dbo.sp_add_log_shipping_secondary_database
            @secondary_database = N'DBNAME'
            ,@primary_server = N'server\instance' -- Leave as default if configuring disconnected TLOG shipping
            ,@primary_database = N'DBNAME'
            ,@restore_delay = 0
            ,@restore_mode = 1 -- 1 = restore in Standby (read-only) mode
            ,@disconnect_users = 1
            ,@restore_threshold = 45
            ,@threshold_alert_enabled = 1
            ,@history_retention_period = 5760
            ,@overwrite = 1
    END

    IF (@@ERROR = 0 AND @LS_Add_RetCode = 0)
    BEGIN
        EXEC msdb.dbo.sp_update_job
            @job_id = @LS_Secondary__CopyJobId
            ,@enabled = 1

        EXEC msdb.dbo.sp_update_job
            @job_id = @LS_Secondary__RestoreJobId
            ,@enabled = 1
    END

    -- ****** End: Script to be run at Secondary ******

  • This is similar to what I did, and it is still in use. Every hour, I copy the transaction log backup to another workstation via PowerShell, and an agent job restores the log backups to a standby database. It takes less than one minute, so it's perfect for reporting.

    I followed this article closely (https://www.mssqltips.com/sqlservertip/1584/auto-generate-sql-server-restore-script-from-backup-files-in-a-directory/), except without step 6, which puts the database in recovery mode. I used standby mode instead, which keeps the database readable. The sequence is:

    - restore full database backup with norecovery

    - if available, restore differential backup with norecovery

    - restore transaction log with norecovery, scheduled with agent job every hour
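    The sequence above, sketched in T-SQL (database name, paths, and file names are placeholders; everything stays NORECOVERY until the log restores, where STANDBY keeps the database readable between hourly runs):

    ```sql
    -- 1. Full backup: leave the database in a restoring state
    RESTORE DATABASE [DBNAME]
    FROM DISK = N'C:\Backups\DBNAME_full.bak'
    WITH NORECOVERY;

    -- 2. Latest differential, if one is available
    RESTORE DATABASE [DBNAME]
    FROM DISK = N'C:\Backups\DBNAME_diff.bak'
    WITH NORECOVERY;

    -- 3. Each log backup in LSN order; STANDBY makes the database
    --    read-only between restores instead of leaving it offline
    RESTORE LOG [DBNAME]
    FROM DISK = N'C:\Backups\DBNAME_log1.trn'
    WITH STANDBY = N'C:\Backups\undo_DBNAME.dat';
    ```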

    more reference:

    https://www.sqlservercentral.com/forums/topic/t-sql-script-to-automatically-restore-the-transaction-logs

