November 21, 2003 at 12:00 am
Comments posted to this topic are about the content posted at http://www.sqlservercentral.com/columnists/ckempster/customlogshipping.asp
Chris Kempster
www.chriskempster.com
Author of "SQL Server Backup, Recovery & Troubleshooting"
Author of "SQL Server 2k for the Oracle DBA"
November 26, 2003 at 2:37 am
Hi all
Just a quick comment re the log shipping: we have been testing our large COM+ app against the log-shipped DB, and after a restore of logs the COM+ components seem to be caching their connections to the instance. A shutdown of the affected components via Component Services resolves the issue, but it is far from ideal. The senior APs are investigating how the connections are being pooled/cached.
Just something to watch out for.
Cheers
Ck
Chris Kempster
www.chriskempster.com
Author of "SQL Server Backup, Recovery & Troubleshooting"
Author of "SQL Server 2k for the Oracle DBA"
November 26, 2003 at 8:48 am
At first blush, Chris has put together a well-thought-out solution for implementing custom SQL log shipping.
Excellent work Chris!
November 26, 2003 at 10:40 pm
Hi,
Just wanted to say that the log shipping idea seems very sound in its implementation here. In the conclusion you say that having users still logged in while doing a recovery is bad. I'm not sure I would want Yukon to allow this, because (imho) why would you want people working on data that is about to change? I sort of think that kicking everyone out before a recovery is a good idea... I obviously can't see the good side of not kicking hem out. 🙂
------------------------------
Life is far too important to be taken seriously
November 26, 2003 at 10:47 pm
hem = them
------------------------------
Life is far too important to be taken seriously
November 26, 2003 at 11:13 pm
Hi there
Well, it's a good point actually, and (I suppose) it really depends on your requirements for the standby DB.
In our case I wanted to avoid having three databases: the live OLTP, the standby, and then a separate reporting database. If I can use the standby for all reporting functions (it's not warehoused with time-series data) and users don't mind the 10-minute cycle of data change as new logs are applied, then it's a win-win all round. Kicking users off then becomes a problem, especially for those running long, complex queries. To get around this I only restore every 2 hours, and everyone is aware of the disconnection. If Yukon had an option to go either way, I'd see it as a "nice feature" more than anything, one that makes the standby more flexible and usable for my clients.
Cheers
Ck
Chris Kempster
www.chriskempster.com
Author of "SQL Server Backup, Recovery & Troubleshooting"
Author of "SQL Server 2k for the Oracle DBA"
November 27, 2003 at 2:30 pm
Oh yeah! I forgot about the long-running queries scenario! I suppose it would be a good thing for Yukon to have (if indeed it will). Once again, sounds good. 🙂
------------------------------
Life is far too important to be taken seriously
April 1, 2004 at 7:54 am
I really think this is a good set of routines and stored procedures for a log shipping solution. I ran across it while looking for a way to keep 235 databases (53 GB) within 24 hours of sync at our offsite standby server. Replication wasn't an option, as I had to leave the online databases user-changeable (add and delete tables, add and drop databases - as DBA I have no control over that, and I'm not notified, so snapshots really weren't in our business model).
I worked through mods to Chris's script, and we emailed back and forth some - then decided it wasn't a viable option, as I have other restrictions that just make it impossible (only a T1 line, and only permitted to transfer 50 GB a month).
If anyone out there has any suggestions as to how to do this, I'm open to them.
Thanks for your help Chris.
Thanks, and don't forget to Chuckle
September 3, 2005 at 5:45 am
Chris, first let me thank you for this contribution; you have saved me literally days of work, and it's far better than I could have achieved on my own.
I have one small problem with it, though. When gzip is called to zip up the BAK files, it occasionally fails to delete the original file and therefore causes an error email to be sent.
I am using a job running under the SQL Server Agent, and its last step is to also FTP and then MOVE the files. The MOVE command also returns an errorlevel (despite succeeding).
I suspect that this issue is caused by attempting to delete a file (i.e. a backup) that is still open.
Have you come across this problem and/or do you have any suggestions to get around it?
Many Thanks,
Jason
September 16, 2007 at 11:46 pm
Chris
Excellent work on this! We have been using this for a couple of our databases which run the MSDE flavor and it works great.
Thanks
Ravi
April 14, 2008 at 5:06 am
We had problems with files being locked. After the backup and after the zip just add a little wait (e.g. 1 minute).
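If a fixed wait is not quite enough, the same idea can be turned into a small retry loop in the job's T-SQL step, checking whether the file has actually gone before moving on. A sketch only - the path, the three-attempt limit and the use of xp_cmdshell/xp_fileexist are assumptions, not taken from the article's scripts:

-- Illustrative only: retry the delete of the original .BAK a few times,
-- pausing between attempts in case gzip (or the backup) still has it open.
DECLARE @exists  int,
        @attempt int

SET @attempt = 1
SET @exists  = 1

WHILE @exists = 1 AND @attempt <= 3
BEGIN
    EXEC master..xp_cmdshell 'del /q d:\logship\mydb_tlog.bak', no_output
    EXEC master..xp_fileexist 'd:\logship\mydb_tlog.bak', @exists OUTPUT
    IF @exists = 1
    BEGIN
        WAITFOR DELAY '00:01:00'   -- give the other process time to release the file
        SET @attempt = @attempt + 1
    END
END

IF @exists = 1
    RAISERROR ('Original backup file still locked after 3 attempts.', 16, 1)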
April 20, 2008 at 10:44 am
This approach seems very well thought out. I would like to test it, but the link to scripts appears to be broken (404 not found).
Can someone provide an updated link?
Thanks!
April 21, 2008 at 10:38 am
Alex Tomic (4/20/2008)
This approach seems very well thought out. I would like to test it, but the link to scripts appears to be broken (404 not found). Can someone provide an updated link?
Thanks!
Hi Alex,
I've uploaded it to my SkyDrive at http://cid-e39114477b2c887f.skydrive.live.com/browse.aspx/Public.
I've been using it for over a year now (with some tweaks). It's pretty good, but there are quite a few hard-coded things that should be pulled out into a settings table, and also there is no automatic retry if an FTP fails - which can happen surprisingly often if you are log shipping 30 databases every 15 minutes!
But it's the best one I have found at the right price 😉 - so many thanks to Chris Kempster for sharing it.
Jason
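For what it's worth, the two tweaks Jason mentions could look something like the sketch below: a small settings table in place of the hard-coded values, and a retry wrapper around the FTP call. Every name, path and threshold here is an assumption for illustration, not something from Chris's published scripts:

-- Illustrative only: a settings table plus a simple FTP retry loop.
CREATE TABLE dbo.LogShipSettings
(
    SettingName  varchar(50)  NOT NULL PRIMARY KEY,
    SettingValue varchar(255) NOT NULL
)

INSERT dbo.LogShipSettings VALUES ('FtpScriptPath', 'd:\logship\ftp_push.txt')
INSERT dbo.LogShipSettings VALUES ('FtpRetryCount', '3')
GO

DECLARE @cmd     varchar(300),
        @retries int,
        @attempt int,
        @failed  int

SELECT @cmd = 'ftp -n -s:' + SettingValue
FROM   dbo.LogShipSettings WHERE SettingName = 'FtpScriptPath'

SELECT @retries = CAST(SettingValue AS int)
FROM   dbo.LogShipSettings WHERE SettingName = 'FtpRetryCount'

CREATE TABLE #ftpOut (line varchar(500) NULL)

SET @attempt = 1
SET @failed  = 1

WHILE @failed = 1 AND @attempt <= @retries
BEGIN
    DELETE #ftpOut
    INSERT #ftpOut EXEC master..xp_cmdshell @cmd

    -- ftp.exe's exit code is unreliable, so scan its output for failure text
    IF EXISTS (SELECT 1 FROM #ftpOut
               WHERE line LIKE '%not connected%' OR line LIKE '%failed%')
    BEGIN
        WAITFOR DELAY '00:00:30'   -- brief pause before retrying the transfer
        SET @attempt = @attempt + 1
    END
    ELSE
        SET @failed = 0
END

IF @failed = 1
    RAISERROR ('FTP transfer still failing after all retries.', 16, 1)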
March 17, 2009 at 10:43 pm
Hi,
The link to the zip file containing the code and scripts is broken.
Any ideas where I can get the source for this?
Cheers
James
March 17, 2009 at 10:45 pm
jando999 (3/17/2009)
Hi, the link to the zip file containing the code and scripts is broken.
Any ideas where I can get the source for this?
Cheers
James
Ignore me, must learn to read previous posts