February 13, 2008 at 11:59 am
I have a situation where one of our sister consulting companies runs a data center with slow bandwidth. I am migrating an application from their data center to our main data center, and I have to migrate 300GB worth of database.
I need suggestions on the best way to do it.
Consideration:
1) Slow bandwidth (another database that I migrated took 13 hours to move a 70GB backup file)
2) Minimal downtime (1 to 3 hours) allowed, as it is an online application
The way I was planning it: take a full database backup a week ahead at the slow data center, transfer and restore it at the main data center, and take a differential backup every day. Once the database is restored at the main data center, transfer the differential backup and roll forward. I am not sure how big the differentials are going to be.
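As a rough T-SQL sketch of that plan (database name and file paths are placeholders, not from the original post):

```sql
-- On the source server (slow data center), a week ahead:
BACKUP DATABASE MyAppDB
    TO DISK = 'D:\Backups\MyAppDB_full.bak'
    WITH INIT;

-- On the destination server, after transferring the file,
-- leave the database able to accept further backups:
RESTORE DATABASE MyAppDB
    FROM DISK = 'E:\Transfer\MyAppDB_full.bak'
    WITH NORECOVERY;

-- Each day on the source:
BACKUP DATABASE MyAppDB
    TO DISK = 'D:\Backups\MyAppDB_diff.bak'
    WITH DIFFERENTIAL, INIT;

-- During the downtime window, restore only the LAST differential
-- (a differential contains all changes since the last full backup):
RESTORE DATABASE MyAppDB
    FROM DISK = 'E:\Transfer\MyAppDB_diff.bak'
    WITH RECOVERY;
```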
Any other suggestion?
February 13, 2008 at 12:24 pm
You are definitely on the right track. The first step would be to take the full backup and restore it at the new data center. I then recommend taking as many diff backups as needed to keep each file at a reasonable transfer size. The goal is to be able to restore that last differential within the time period you are allowed to be down.
Hope this helps.
MD
Marvin Dillard
Senior Consultant
Claraview Inc
February 13, 2008 at 12:39 pm
Marvin, just to think that through: once I do the full database backup, the differential backups grow day by day, so the last differential backup I take includes all the changes since the last full database backup, right? It shouldn't be that big compared to the 300GB database. I mean, at minimum I have to do the full database backup a week ahead to get it over the network. Am I looking at transferring a big differential backup file at the last moment, when I am already into my downtime window?
February 13, 2008 at 1:04 pm
Amu
You are correct. Diffs contain only what has changed since the last full backup. The size of the diff backup will grow each day until the next full backup. Another option to consider is log shipping, or perhaps even replication.
As for the full backup, I'd see if there is some way to put it on a different type of media, maybe a Super DLT tape, and use FedEx to get it to the new data center. I'd hate to transfer over 300GB when bandwidth is an issue.
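The log shipping option mentioned above can also be done by hand with plain backup/restore statements. A minimal sketch, assuming the database is in the FULL recovery model (names and paths are placeholders):

```sql
-- On the source: periodic transaction log backups
BACKUP LOG MyAppDB
    TO DISK = 'D:\Backups\MyAppDB_20080213.trn'
    WITH INIT;

-- On the destination: apply each log in sequence,
-- staying in NORECOVERY until cutover
RESTORE LOG MyAppDB
    FROM DISK = 'E:\Transfer\MyAppDB_20080213.trn'
    WITH NORECOVERY;

-- At cutover, bring the database online:
RESTORE DATABASE MyAppDB WITH RECOVERY;
```

The advantage over daily differentials is that each log backup only contains activity since the previous log backup, so the individual files to transfer stay small.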
Marvin Dillard
Senior Consultant
Claraview Inc
February 14, 2008 at 3:18 pm
Thank you.
February 26, 2008 at 5:23 pm
One suggestion... you might try using LiteSpeed. It will reduce your backup size. Then use Robocopy to copy the files.
Regards
Utsab Chattopadhyay
February 27, 2008 at 7:02 am
Native SQL Server backups also compress very well with WinRAR or 7-Zip (or WinZip). You could use a Windows batch file to compress the backups. The backup files will compress down to approximately 15% (or less) of the original size. You can then send the smaller file across the slow connection and uncompress it on the other end. In theory it will take less time to compress the file, send it across the wire, and decompress it on the other end than it would to send the uncompressed file.
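One way to wire that together from T-SQL rather than a separate batch file (a sketch only: the 7-Zip path and backup paths are assumptions, and xp_cmdshell must be enabled on the instance):

```sql
-- Take the native backup:
BACKUP DATABASE MyAppDB
    TO DISK = 'D:\Backups\MyAppDB_full.bak'
    WITH INIT;

-- Compress it with 7-Zip via xp_cmdshell ("a" = add to archive):
EXEC master.dbo.xp_cmdshell
    '"C:\Program Files\7-Zip\7z.exe" a D:\Backups\MyAppDB_full.7z D:\Backups\MyAppDB_full.bak';

-- After transferring the .7z, extract on the destination:
-- 7z.exe x E:\Transfer\MyAppDB_full.7z -oE:\Transfer
```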
Hope This Helps,
"Key"
MCITP: DBA, MCSE, MCTS: SQL 2005, OCP
February 27, 2008 at 9:33 am
Usually backup software that compresses the backups on the fly not only needs a lot less space for the backups, it also completes the backup in less time than a native SQL backup.
If you use WinZip or gzip you additionally need a long time for the compression, because these tools are designed to _not_ use a lot of CPU resources during the process.
Regards,
Karl
February 27, 2008 at 10:24 am
Karl is absolutely correct. The compression ratios and the backup (and restore) speed gains from third-party tools can be huge. We had a 250GB database that compressed to around 32GB using these tools. Backup time went from 4 hours (native) to around 40 minutes (LiteSpeed and SQLsafe). There are several good tools out there. Download an eval copy from Idera, Quest, etc., prove it out, and make your case with management.
-- You can't be late until you show up.
February 27, 2008 at 11:34 am
I am using WinRAR for backup compression on the database servers that have disk space issues. For the most part, database backups are written to the local SQL box, and a Windows scheduled job pulls the backup files to a centralized enterprise backup server that also functions as a Virtual Tape Library (VTL). I am building a business case for an enterprise-level SQL Server backup machine that all the SQL boxes would write their backups to, so that I have more immediate access to the database backups. If I need anything more than 24 hours old, I have to open a ticket and have the server admins pull it from the VTL. If a restore request is for something older than 14 calendar days, I have to request a tape from our off-site vendor. Lacking access to the server room, I have to involve a server admin (and a ticket) again once the tape comes in. This policy gets reviewed every time I have to access an off-site tape.
I have evaluated SQL LiteSpeed from Quest Software. I am satisfied with the GUI as well as the T-SQL scripting support for backups and restores. Management has tasked me with not only gathering the timing and sizing data, but also with writing the business case for using SQL LiteSpeed at the enterprise level.
For those looking to future versions of SQL Server:
SQL Server 2008 Enterprise Edition has an option for compressing a backup. I have not had an opportunity to grab a SQL Server 2008 CTP to evaluate it myself.
See below for more details:
SQL Server 2008 Books Online (February 2008)
Backup Compression (SQL Server)
http://msdn2.microsoft.com/en-us/library/bb964719(SQL.100).aspx
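Based on the Books Online page linked above, the new option is just a clause on the backup statement (database name and path below are placeholders):

```sql
-- SQL Server 2008 Enterprise: compress a single backup
BACKUP DATABASE MyAppDB
    TO DISK = 'D:\Backups\MyAppDB_full.bak'
    WITH COMPRESSION, INIT;

-- Or make compression the instance-wide default:
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;
```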
Happy T-SQLing,
"Key"
MCITP: DBA, MCSE, MCTS: SQL 2005, OCP
February 27, 2008 at 11:47 am
Amu,
Whether you use a SQL backup compression tool (SQL LiteSpeed) or compress a native backup after the fact (WinRAR or WinZip), there is a cost involved with whichever software you choose. All of the products recommended by others and myself can be downloaded for an evaluation period. Hopefully something suggested here will fit your needs regarding downtime on the copied system, the full and differential backups, and the bandwidth issues you are facing.
Please keep us posted while you sample the products. I, and others, are interested in what you end up choosing as a solution for your environment/situation.
Regards,
"Key"
MCITP: DBA, MCSE, MCTS: SQL 2005, OCP
March 14, 2008 at 12:36 pm
Might be irrelevant here, but have you considered whether it is possible to split the 300GB database over multiple files/filegroups and then do piecemeal restores?
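A piecemeal restore would look roughly like the following sketch (Enterprise Edition feature; the filegroup names and paths are hypothetical, and the database would need to be split into filegroups first):

```sql
-- Restore the primary filegroup first, so the critical part
-- of the database can come online early:
RESTORE DATABASE MyAppDB
    FILEGROUP = 'PRIMARY'
    FROM DISK = 'E:\Transfer\MyAppDB_primary.bak'
    WITH PARTIAL, NORECOVERY;

RESTORE DATABASE MyAppDB WITH RECOVERY;

-- Restore the remaining filegroups later, as bandwidth allows:
RESTORE DATABASE MyAppDB
    FILEGROUP = 'Archive'
    FROM DISK = 'E:\Transfer\MyAppDB_archive.bak'
    WITH RECOVERY;
```

This lets the transfers be spread over time instead of moving all 300GB inside the downtime window, at the cost of restructuring the database into filegroups up front.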
March 16, 2008 at 6:44 pm
Thank you all for the response.
I am going with the differential backup approach for now. I have a big implementation this weekend to migrate databases from one data center to another.
We are also in the process of evaluating the Quest and Idera compression backup tools.
Thank you all.