May 17, 2007 at 2:08 pm
We are getting ready to implement some form of log shipping. Couple of questions...
1) We want to do some sort of auto-failover. Can anyone suggest a good method of determining when it is appropriate to fail over? I've seen plenty of suggestions to simply have the secondary server or monitor server periodically attempt to connect to the primary database server and, if it can't, initiate a failover. This seems like a good start, but if something as simple as a network line goes down between the primary server and whatever server is doing the monitoring, it could trigger an unnecessary failover.
2) What does using the built-in log shipping buy you that you can't do yourself with a few scripts? Seems like you could write your own scripts to do backups, push them to secondary servers, and restore them just as fast as you could set up log shipping. Am I missing something?
Thanks,
JAS
The Redneck DBA
May 17, 2007 at 6:29 pm
Log Shipping itself does not have automatic failover. You will need to "bolt on" something extra to have automatic failover, and even then it still might not be truly automatic. What happens to the client processes connecting to the primary server? What happens if both servers are up and running but there is no network connectivity between them? The best failover method for Log Shipping is to keep it manual. A manual failover for an up-to-date standby server will only take a minute or two anyway. If you want true automatic failover you should set up clustering instead.
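For reference, bringing the standby online usually boils down to a couple of statements. A minimal sketch, assuming a placeholder database named SalesDB and made-up paths; if the primary is still reachable you'd take a tail-log backup first:

    -- On the primary (if it's still accessible): back up the tail of the log
    -- and leave the database in a restoring state so no further changes are lost.
    BACKUP LOG SalesDB
        TO DISK = 'D:\LogShip\SalesDB_tail.trn'
        WITH NORECOVERY;

    -- On the standby: after restoring any remaining log backups WITH NORECOVERY,
    -- restore the tail backup and bring the database online.
    RESTORE LOG SalesDB
        FROM DISK = 'E:\LogShip\Restore\SalesDB_tail.trn'
        WITH RECOVERY;

After that it's just a matter of pointing the clients at the standby server.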
I've set up Log Shipping for dozens of clients. The most reliable method has always been simple T-SQL scripts. The inbuilt Log Shipping and the third-party applications I've tried have always failed at one point or another.
The log shipping operation is simply backup, copy, restore. The backup script backs up to a given directory. The copy process looks for files in the backup directory and copies them to a directory on the standby server. The restore process looks for files in the restore directory and restores them to the database.
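As a rough illustration, the backup step can be very small. A sketch, assuming the same placeholder database SalesDB and a backup directory of D:\LogShip\Backup; the timestamped filename keeps the later steps simple:

    -- Build a timestamped filename, e.g. SalesDB_20070517_140800.trn,
    -- then back up the transaction log to the backup directory.
    DECLARE @file varchar(260);
    SET @file = 'D:\LogShip\Backup\SalesDB_'
              + REPLACE(REPLACE(REPLACE(CONVERT(varchar(19), GETDATE(), 120),
                        '-', ''), ':', ''), ' ', '_')
              + '.trn';

    BACKUP LOG SalesDB TO DISK = @file WITH INIT;

Schedule that as a job every few minutes and the backup half is done.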
By setting up individual scripts for each of these operations you avoid any single part of the process holding up another. For example, if you don't want to run restores on the standby database, maybe because you're using it for some reporting, you can stop the restore step while the backup and copy steps continue on unaffected.
Once you have the basic steps working you can add all sorts of secondary steps: compressing the backup files before copying them, moving files to archive directories, deleting files after a period of time, and so on.
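For example, the delete step can be a one-liner through xp_cmdshell. A sketch, assuming an archive directory of D:\LogShip\Archive, a one-week retention, and that forfiles is available on the server (it ships with Windows Server 2003):

    -- Delete .trn files older than 7 days from the archive directory.
    EXEC master..xp_cmdshell
        'forfiles /p "D:\LogShip\Archive" /m *.trn /d -7 /c "cmd /c del @path"';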
--------------------
Colt 45 - the original point and click interface
May 20, 2007 at 7:16 pm
Hi Phil,
I have implemented log shipping via a linked server and stored procedures executed remotely. But as you say, it's unreliable!
I'd be interested to hear in a little more detail how you do the following:
The copy process looks for files in the backup directory and copies them to a directory on the standby server. The restore process looks for files in the restore directory and restores them to the database.
Is this done by using xp_cmdshell with DOS commands, or by executing some sort of batch file?
Thanks Warwick.
May 20, 2007 at 9:37 pm
The important part is not really how it's done, just that each process is separate and unrelated to the other processes that happen to make up Log Shipping. This allows a lot of flexibility in what you use to perform each task.
As it's a separate process, the copy can be anything you're comfortable with to get the job done. Depending on the environment it can be an xp_cmdshell proc, a batch file, Robocopy, etc. At one client site we use xp_cmdshell and a DOS copy; at another the WAN link isn't 100% reliable, so we call Robocopy in a scheduled batch file.
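For the simple case it doesn't need to be anything fancy. A sketch using the placeholder directories from earlier; the UNC path to the standby is made up:

    -- Copy new log backups to the standby's restore share. Robocopy skips
    -- files that already exist at the destination, so the step is safe to re-run.
    EXEC master..xp_cmdshell
        'robocopy "D:\LogShip\Backup" "\\STANDBY1\LogShip\Restore" *.trn';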
For the restore we generally have a stored proc that executes xp_dirtree to return a directory listing for a given directory. This could also easily be done using xp_cmdshell and the DIR command. Either way, the results are stored in a temp table that is used to dynamically build a RESTORE statement. Also, while I've not done it myself, I've seen an implementation where a scheduled VBScript trawls the directories and loads the file names into a database table. This in turn fires a trigger which performs the restore.
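To give you an idea, here's the shape of the restore proc. A rough sketch, assuming a restore directory of E:\LogShip\Restore, the placeholder database SalesDB, and that the timestamped filenames sort in the right order; error handling and moving files out after a successful restore are left out:

    -- List the restore directory (depth 1, include files).
    CREATE TABLE #files (fname nvarchar(260), depth int, isfile bit);
    INSERT INTO #files
        EXEC master..xp_dirtree 'E:\LogShip\Restore', 1, 1;

    -- Restore each log backup in filename (i.e. timestamp) order.
    DECLARE @fname nvarchar(260), @sql nvarchar(1000);
    DECLARE files_cur CURSOR FOR
        SELECT fname FROM #files WHERE isfile = 1 ORDER BY fname;
    OPEN files_cur;
    FETCH NEXT FROM files_cur INTO @fname;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = 'RESTORE LOG SalesDB FROM DISK = ''E:\LogShip\Restore\'
                 + @fname + ''' WITH NORECOVERY';
        EXEC (@sql);
        FETCH NEXT FROM files_cur INTO @fname;
    END
    CLOSE files_cur;
    DEALLOCATE files_cur;
    DROP TABLE #files;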
--------------------
Colt 45 - the original point and click interface