February 15, 2010 at 11:32 pm
There are 3 days of full backups and TLog backups in the backup and trn folders respectively. I need to identify the previous day's full backup (8pm) and the successive hourly transaction log backups (from 9pm) and restore them automatically (via a job) every day. The TLog backup name format is '..backup_2010121800.trn'. The files are copied every hour, and each restore has to happen an hour after the copy.
The problem is that if the copy job that copies from the other server fails, an older backup might be applied. So I wanted to create a mechanism to prevent that, e.g. a table that stores the previous restore info which could be read before applying a restore. It's fine if the script works without requiring the table.
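Something along these lines is what I have in mind for the table (all names here are just placeholders):

CREATE TABLE dbo.RestoreHistory (
    RestoreId  INT IDENTITY(1,1) PRIMARY KEY,
    BackupFile VARCHAR(260) NOT NULL,  -- e.g. 'backup_2010121800.trn'
    BackupType CHAR(4)      NOT NULL,  -- 'FULL' or 'TLOG'
    BackupTime DATETIME     NOT NULL,  -- the time encoded in the file name
    RestoredAt DATETIME     NOT NULL DEFAULT GETDATE()
)

The restore job could check the newest row here and refuse to apply anything older.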
Please help
February 16, 2010 at 6:33 am
jsb12 (2/15/2010)
The problem is that if the copy job that copies from the other server fails, an older backup might be applied.
Well, I don't think it will restore the wrong backup if the copy fails.
Just try this approach (the code is untested; you can mould it to your needs):
-- say @last_backup_time is 10:00 AM, so @restore_time is 11:00 AM
DECLARE @last_backup_time VARCHAR(10), @restore_time DATETIME, @lstr NVARCHAR(500)
-- set @last_backup_time and @restore_time from your last-restore info
IF GETDATE() >= @restore_time
BEGIN
    SET @lstr = 'RESTORE DATABASE [TEST] FROM DISK = '':\REPORT\MSSQL.4MSSQL\Backup\test_backup_'
              + @last_backup_time + '.bak'' WITH FILE = 1'
    EXEC (@lstr)
END
Keep the above code in a stored procedure and call it from an hourly job.
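For example, to build the file name for the previous hour's log in the 'backup_yyyymmddhh.trn' format from the original post (again an untested sketch; the D:\Backup path is only an example):

DECLARE @stamp VARCHAR(10), @lstr NVARCHAR(500)
-- yyyymmdd (style 112) plus the two-digit hour of the previous hour
SET @stamp = CONVERT(VARCHAR(8), DATEADD(HOUR, -1, GETDATE()), 112)
           + RIGHT('0' + CONVERT(VARCHAR(2), DATEPART(HOUR, DATEADD(HOUR, -1, GETDATE()))), 2)
SET @lstr = 'RESTORE LOG [TEST] FROM DISK = ''D:\Backup\backup_' + @stamp + '.trn'' WITH NORECOVERY'
EXEC (@lstr)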
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)
February 16, 2010 at 6:36 am
Sounds like a job for log shipping?
---------------------------------------------------------------------
February 16, 2010 at 7:14 am
Yes, custom log shipping; it has to be that way.
February 16, 2010 at 7:19 am
jsb12 (2/16/2010)
Yes, custom log shipping; it has to be that way.
Are you using a third-party backup compression product or something?
If you really need custom log shipping, you're going to have to write some custom code to manage the process and keep track of which backups exist and whether they have been successfully applied or not.
Paul
February 16, 2010 at 7:21 am
Hi Bhuvnesh, thank you. I also need to apply the transaction logs that follow the full backup, i.e. after the 8pm full backup I may need another job to restore the log backups from 9pm the previous day through 7pm the current day. In that case I may need to verify that it is the previous day's full backup that has been restored before applying the logs.
This is where I thought it might be a problem: what happens if the copy job that brings the backups to the server fails and I'm left with old backups when the jobs for the full and log restores run? Kindly help with this too; I'm not sure how to proceed.
February 16, 2010 at 7:25 am
I cannot see a need for customised log shipping here. Just use the out-of-the-box log shipping supplied with SQL Server. Once the initial full backup is restored you don't have to apply any more full backups; just keep the log backup chain going.
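In other words, once something like the first statement below has run once, only the second ever needs to run again (names and paths are made up for illustration):

-- one-off: restore the full and leave the database able to accept logs
RESTORE DATABASE [TEST] FROM DISK = 'D:\Backup\test_full.bak' WITH NORECOVERY

-- then hourly: just keep the chain going
RESTORE LOG [TEST] FROM DISK = 'D:\Backup\backup_2010121800.trn' WITH NORECOVERY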
---------------------------------------------------------------------
February 16, 2010 at 7:31 am
george sibbald (2/16/2010)
I cannot see a need for customised log shipping here.
Me either, but then we haven't exactly been over-blessed with detail 😉
That's why I asked about the third-party compression thing - if the log backups are large it is common to use compression on those too, especially if the target site is connected over a relatively slow or congested link. Maybe 'jsb12' will enlighten us as to why it has to be custom log shipping at some stage.
February 16, 2010 at 8:13 am
Sure, the reason for the custom log shipping is that there's no two-way trust between the domains. So we pull the files via FTP from our domain, which is trusted by the other domain, rather than via standard log shipping. And the requirement was to restore the full backups daily too.
February 16, 2010 at 8:30 am
jsb12 (2/16/2010)
Sure, the reason for the custom log shipping is that there's no two-way trust between the domains. So we pull the files via FTP from our domain, which is trusted by the other domain, rather than via standard log shipping. And the requirement was to restore the full backups daily too.
So is the requirement to bring the database up to a certain point each day and then bring it online for reporting purposes, or is it just for DR and you just want to keep restoring logs to the database?
---------------------------------------------------------------------
February 16, 2010 at 8:46 am
George has a good point. It seems like you have two requirements.
1. Restore the fulls.
2. Keep a warm standby.
You can use one server for this, but it's really two tasks. And I'd use two databases in a custom procedure. Use your own log shipping here to make it easy to move full and log backups. Send the logs to the warm standby, send the fulls to a 2nd db. That way if you have an issue transferring the fulls, or a corruption issue, it doesn't affect the warm standby (necessarily).
The procedures are easy. I'd use scripting to move the logs, keeping them aligned with names that include date/time. I would be careful about removing logs from the source until you are sure they are restored on the secondary. That might mean keeping 2-3 days around. Account for weekends/holidays and people not able to check jobs.
For the fulls, I'd copy them with the same name each day. Not necessarily a backup with the same name on the source, but when I dropped it on FTP, I might change it to a standard name so that it's easy to restore each day with minimal scripting issues.
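A rough sketch of that daily restore against the second database (the fixed file name and the logical/physical names here are all invented):

-- same file name every day, so the job itself never changes
RESTORE DATABASE [TEST_FullCopy]
    FROM DISK = 'D:\FTP\test_daily.bak'
    WITH REPLACE,
         MOVE 'test'     TO 'D:\Data\TEST_FullCopy.mdf',
         MOVE 'test_log' TO 'D:\Data\TEST_FullCopy_log.ldf'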
February 16, 2010 at 8:52 am
Yes George, this is part of DR, and it's for restoring logs and also full backups. Regarding the full backup restore part, I'm not too sure why.
Thank you Steve, but right now the database is small; it might grow to a few hundred gigs in the future, probably early next year. And there is another database which will likely be custom log shipped. Both databases are only a few gigs right now.
February 16, 2010 at 9:19 am
jsb12 (2/16/2010)
Yes George, this is part of DR, and it's for restoring logs and also full backups. Regarding the full backup restore part, I'm not too sure why. Thank you Steve, but right now the database is small; it might grow to a few hundred gigs in the future, probably early next year. And there is another database which will likely be custom log shipped. Both databases are only a few gigs right now.
Then it's a requirement from someone who does not know what they are asking for. To cover yourself, verify why they have asked for that, but the full backup is not required to keep the log restores going, so your task is simplified.
Without writing or designing it for you: you will need a table holding info on the last file copied and restored, and use that info to decide the next log to restore. SSIS sounds like it would be good for this, as you can include error handling. The log backup names are predictable, so if you can be sure only one-hourly backups will be taken, you can predict the name of the log to restore. The good thing is that if it's the wrong log, SQL will refuse to restore it and no harm is done.
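A sketch of that decision logic, using the same sort of tracking table mentioned earlier in the thread (all names invented for illustration):

DECLARE @last DATETIME, @next VARCHAR(10), @sql NVARCHAR(500)

-- the last successfully restored log, from your own tracking table
SELECT @last = MAX(BackupTime) FROM dbo.RestoreHistory WHERE BackupType = 'TLOG'

-- hourly backups, so the next one should be an hour later
SET @next = CONVERT(VARCHAR(8), DATEADD(HOUR, 1, @last), 112)
          + RIGHT('0' + CONVERT(VARCHAR(2), DATEPART(HOUR, DATEADD(HOUR, 1, @last))), 2)

SET @sql = 'RESTORE LOG [TEST] FROM DISK = ''D:\Backup\backup_' + @next + '.trn'' WITH NORECOVERY'
EXEC (@sql)  -- if this is the wrong log, SQL raises an error and restores nothing

-- on success, record @next in the tracking table for the next run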
---------------------------------------------------------------------
February 16, 2010 at 9:42 am
There might have been a reason for that requirement; I was told once but I don't recollect. Sorry for that. Will try to find out.
Regarding "without writing or designing it for you you will need a table holding info on the last file copied and restored and use that info to decide next log to restore" - How do I get the results/output of SSIS package to a table?
And yes, log backups are hourly except for the time the daily full backup is scheduled; there is a two-hour break there, and another break of 3 hours for the Sunday maintenance window.
February 16, 2010 at 10:07 am
The reason for a full could be to verify backups, or to keep a copy handy. That's not a bad requirement, though it's different from log shipping. You can ignore the full if you keep the logs going, but if you have the bandwidth and time, keeping the full isn't a bad idea, IMHO.
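If verification is the aim, the file can even be checked without restoring it, e.g. (path is only an example):

-- checks that the backup file is complete and readable, without restoring it
RESTORE VERIFYONLY FROM DISK = 'D:\FTP\test_daily.bak'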