January 26, 2005 at 1:27 am
Hey people!
I have a web app which connects to a MSSQL (2000 standard) database.
This data is becoming more important as the days go by so I need to look at a good backup/restore process.
Currently I export the DB to another test machine manually at night. This provides daily backups (I have also recently read that I could automate this by scheduling it as a DTS package - wish I knew that 3 months ago). If my db goes down though, this other machine needs to be moved to the IDC and installed, so turnaround time to recover isn't exactly great. Plus I only have one daily backup to recover from, so I will lose a bit of data.
I would like to extend the backups further.. to include hourly backups and also have a better method to restore when a failure occurs. I looked at clustering but you need enterprise SQL which is simply too expensive for us. I then read elsewhere that replication is a good option and provides a good recovery time.
Q: How important is the data?
A: Very important. We could lose 4 hours, perhaps even 24 hours, but anything longer than that and I'm a dead man.
Q: how much cash do you have to spend?
A: I have enough for a second machine and license (SQL+2K)
SO, if I ran hourly/daily backups via a DTS package, then replicated my DB using SQL Rep to a secondary box in the data centre, that would be a fairly good solution, yeah?
any suggestions you guys have would be a great help. I'm not a DBA - I have just been given this responsibility and I want to make sure I don't screw anything up!
cheers
Dave
January 26, 2005 at 5:50 am
Replication is an option, however you run into problems when you make changes to the schema on the publishing server.
Your best bet would be to use Log Shipping. This is only MS supported in the Enterprise Edition, however this can be (fairly) easily accomplished using a homegrown method. In fact a search on this site should provide you with all the information and scripts to get this done.
The basic way of doing this is to make sure that your database is in full recovery mode. Once a night you will backup your database (and store that offsite). Then every so often (time frame that you determine), you schedule the job to do a transaction log dump, then to copy that dump over to the remote server and restore it on there. This gives you all the data, including schema changes on the remote server. Dumping the transaction logs also allows you to do a point in time restore in case of bad data, or an accidental drop of a table etc.
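As a rough sketch, that routine boils down to a handful of T-SQL commands (the database name, file names, and paths below are placeholders; the copy step would normally be an xcopy job step or one of the scripts from this site):

```sql
-- On the primary: make sure the database uses the full recovery model
ALTER DATABASE MyAppDB SET RECOVERY FULL

-- Nightly full backup (store a copy offsite)
BACKUP DATABASE MyAppDB TO DISK = 'D:\Backups\MyAppDB_full.bak' WITH INIT

-- Scheduled job (e.g. hourly): dump the transaction log
BACKUP LOG MyAppDB TO DISK = 'D:\Backups\MyAppDB_log.trn' WITH INIT

-- (copy the .trn file to the standby server, e.g. via xcopy in a CmdExec job step)

-- On the standby: restore the log, leaving the database ready for the next log
RESTORE LOG MyAppDB FROM DISK = 'E:\Incoming\MyAppDB_log.trn'
    WITH STANDBY = 'E:\Incoming\undo.dat'
-- or WITH NORECOVERY if the standby doesn't need to be readable between restores
```

Note that the standby database has to be initialized first by restoring the full backup WITH NORECOVERY (or WITH STANDBY); only then will it accept the log restores.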
January 26, 2005 at 5:51 am
example scripts.... http://www.sqlservercentral.com/scripts/contributions/1125.asp
January 26, 2005 at 11:38 am
Rather than using DTS to copy data to a "backup" file, I would recommend using SQL's builtin backup functionality. Combining full and transaction log backups into a backup strategy gives you a great combination of disaster recovery as well as things like restoring to a given point in time.
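For example (hypothetical names and paths), a full backup plus scheduled log backups lets you recover right up to a chosen moment using STOPAT:

```sql
-- Nightly full backup
BACKUP DATABASE MyAppDB TO DISK = 'D:\Backups\MyAppDB_full.bak' WITH INIT

-- Hourly transaction log backups
BACKUP LOG MyAppDB TO DISK = 'D:\Backups\MyAppDB_1300.trn' WITH INIT

-- Point-in-time restore: the full backup first, without recovering...
RESTORE DATABASE MyAppDB FROM DISK = 'D:\Backups\MyAppDB_full.bak' WITH NORECOVERY
-- ...then each log backup in sequence, stopping just before the bad transaction
RESTORE LOG MyAppDB FROM DISK = 'D:\Backups\MyAppDB_1300.trn'
    WITH RECOVERY, STOPAT = '2005-01-26 12:45:00'
```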
January 27, 2005 at 9:54 am
If you have the extra cash (possibly 1-2 K), look into upgrading to SQL2K Enterprise from Standard. Enterprise has built-in log shipping functionality, which sounds like a tailor-made fit for you.
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."
January 27, 2005 at 3:47 pm
Log Shipping is fairly easy to script out yourself as well.
You can find several homegrown log shipping scripts online that provide the same functionality but do not require the Enterprise Edition of SQL.
February 1, 2005 at 11:27 pm
hey guys,
thanks a lot for your info.. I'll have a good read on this site about the log shipping options with Standard and see how we go!
thanks again for all your responses
dave
February 2, 2005 at 10:45 am
Dave,
At the very least, implement a backup plan using SQL Server Backup rather than DTS. As Jason pointed out, you can schedule a combination of full database backups, differential backups, and transaction log backups that will allow you to recover to a point in time and minimize the chance of losing data.
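As an illustration (names and schedule are made up), a weekly full, nightly differential, hourly log rotation could look like this:

```sql
-- Weekly full backup
BACKUP DATABASE MyAppDB TO DISK = 'D:\Backups\MyAppDB_full.bak' WITH INIT

-- Nightly differential: only the pages changed since the last full backup
BACKUP DATABASE MyAppDB TO DISK = 'D:\Backups\MyAppDB_diff.bak'
    WITH DIFFERENTIAL, INIT

-- Hourly log backups in between
BACKUP LOG MyAppDB TO DISK = 'D:\Backups\MyAppDB_log.trn' WITH INIT

-- To recover: restore the full backup WITH NORECOVERY, then the latest
-- differential WITH NORECOVERY, then each log backup taken after it
```

The differential keeps restore chains short: you only replay the logs taken since the most recent differential, not since the last full backup.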
Homegrown log shipping is a good option for disaster recovery if you can afford another server and license. BTW, the difference in price between Standard & Enterprise is more like $15,000 per processor. Here's another log shipping link:
http://www.sql-server-performance.com/sql_server_log_shipping.asp
Greg
February 23, 2005 at 9:02 pm
thanks Greg!
I have just bought the new box and license and I am starting the project in a few days! I will let you know how I go - if you're able to help along the way as well I would be eternally grateful!
February 27, 2005 at 9:57 pm
another question for you guys if you don't mind..
OK, assuming I have set up my environment: I have a secondary server where I ship the logs to, as well as a backup schedule.
Lo and behold, my original machine gets fried. The motherboard goes, it takes the HDD with it, and the machine is completely gone. I can now remote into my backup machine, change the computer name and the IP, and reboot. My application, which uses DSNs to connect, will therefore re-establish the connection once the box has booted.
OR.. is this not the right way to do it? If you change the computer name, does it screw with the SQL instance? Are there any other problems which may happen? If that's the case, then should I be changing all of my DSNs rather than changing the name/IP?
cheers
dave
February 28, 2005 at 5:40 am
David,
You will also need to change the name of the server within SQL Server itself. This is stored in the master database.
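On SQL 2000 that means updating the entry in master..sysservers with the system procedures, after the Windows rename and reboot ('OldName' and 'NewName' below are placeholders for your actual machine names):

```sql
-- Remove the old server name recorded in master..sysservers
EXEC sp_dropserver 'OldName'

-- Register the new Windows computer name as the local server
EXEC sp_addserver 'NewName', 'local'

-- Restart the SQL Server service, then verify the change took effect:
SELECT @@SERVERNAME
```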
Here is a recent forum posting.
http://www.sqlservercentral.com/forums/shwmessage.aspx?forumid=5&messageid=163802
February 28, 2005 at 5:43 am
A better course of action would probably be to change the DSN.