How best to push large DB Copies over the wire on a regular schedule

  • I'm trying not to be verbose, so if you have any questions, please don't hesitate to ask.

    In short, the problem I'm looking to solve is how best to provide our software vendor with a copy of our rather large DB (it weighs in at between 225 and 250 GB) on a bi-weekly to monthly basis, over an internet connection.

    Currently the process is:

    1) Restore a recent copy of the DB to a temp SQL Server.

    2) Change the recovery model to SIMPLE and shrink the log file so that it is as small as possible.

    3) Detach the DB and then compress the files via WinRAR. I do this instead of taking a standard backup because I just don't have an extra 500-600 GB on any SQL Server box to use for this process that can also compress the DB in a reasonable time frame.

    4) Once the files are compressed, I send them over the wire via FTP.
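    For reference, a minimal T-SQL sketch of steps 2 and 3; the database and log file names (VendorCopy, VendorCopy_Log) are placeholders, not our real names:

    USE VendorCopy;
    ALTER DATABASE VendorCopy SET RECOVERY SIMPLE;
    DBCC SHRINKFILE (VendorCopy_Log, 1);          -- shrink the log as far as it will go
    USE master;
    EXEC sp_detach_db @dbname = N'VendorCopy';    -- detach so the .mdf/.ldf can be compressed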

    Does anyone have any suggestions on how I might do this better? The only constraint is that it must work over an FTP connection.

    I've never had to do much with replication (not that I think it would even be realistic for this kind of scenario, but I could be wrong), so I don't know whether it is feasible. But what about log shipping, or possibly something newer in the area of SQL technology?

    Any suggestions are much appreciated. I have a feeling the answer is going to be no, but I'm asking all the same.

    Thanks

    Kindest Regards,

    Just say No to Facebook!
  • Sounds like it would be worth your while investing in a tool that creates compressed backups. You should be able to get a backup file around 20% of the uncompressed size.

    ---------------------------------------------------------------------

  • Check how large your differential backups would be.

    Send them a full backup once, and then a differential backup every day or so (probably more data sent overall, but in smaller chunks).
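    A minimal sketch of that pattern in T-SQL (database name and paths are illustrative only):

    -- Sent once, up front
    BACKUP DATABASE YourDB TO DISK = N'D:\Xfer\YourDB_full.bak';
    -- Sent every day or so; contains only the extents changed since the last full
    BACKUP DATABASE YourDB TO DISK = N'D:\Xfer\YourDB_diff.bak' WITH DIFFERENTIAL;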

    If your DB has too much activity and the differential backups are too big, then you should do what is suggested above: use a backup compression tool (SQL LiteSpeed, SQL Backup from Red Gate, or anything else).

    You can also migrate to SQL 2008, which has its own backup compression (so the folks receiving the backup will not have to decompress it before using it).
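    Native compression there is just a backup option (Enterprise Edition in SQL 2008; Standard as well from 2008 R2 onward); names again illustrative:

    -- Compressed at backup time; RESTORE reads it directly, no separate decompression step
    BACKUP DATABASE YourDB TO DISK = N'D:\Xfer\YourDB_full.bak' WITH COMPRESSION;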

    If all that is not possible, you can also write a small application in .NET to read a file (your DB file or a backup) and compress it on the fly while sending it over FTP (you could even use 7-Zip compression to shrink it further).

    This can be done in a 10-20 line application.

  • Oliiii (10/13/2010)


    Check how large your differential backups would be.

    Send them a full backup once, and then a differential backup every day or so (probably more data sent overall, but in smaller chunks).

    If your DB has too much activity and the differential backups are too big, then you should do what is suggested above: use a backup compression tool (SQL LiteSpeed, SQL Backup from Red Gate, or anything else).

    Unless all future full backups were done with the 'copy only' option, the differential base would change and the full backup would need to be sent again. In any case, the differential would continue to grow until it too became unwieldy.
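    For illustration, a copy-only full backup in T-SQL (names are placeholders); it leaves the differential base untouched, so the scheduled differentials keep working:

    -- Does not reset the differential base (and never breaks the log chain)
    BACKUP DATABASE YourDB TO DISK = N'D:\Xfer\YourDB_copy.bak' WITH COPY_ONLY;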

    The compression tools normally have a free 'restore only' license, or can compress in standard ZIP format.

    ---------------------------------------------------------------------

  • If you are looking to speed up the transfer of the compressed file, then your options are limited. If you can get away from FTP, you could use a product like RocketStream, which copies files in multiple streams.

    To give you an example, I recently used this product to copy a 150GB (compressed) backup file across the WAN to our DR site. Without this product - the copy took 26+ hours. With the product, it took less than 6 hours.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs

  • First off, thanks to everyone for the suggestions. I wasn't expecting more than maybe 1 or 2, and of those probably nothing I could implement because of various restrictions. The good news is that one of these ideas may be possible, and it's something I will check into.

    Because several have suggested the idea of sending copies of the log file backups each day (which would certainly be a lot smaller), I thought it might be prudent to mention that we don't do native SQL Server backups. We use Microsoft's DPM to do the backups, and it doesn't work like other backup systems, at least I don't think it does. DPM records changes to the bits themselves, so you could perform some action that is very small in the transaction log but very big in terms of the backup, because a large number of bits have changed or flipped.

    I don't pretend to be a backup guru, but I don't believe there's any way to add another backup tool alongside what DPM is doing, and switching from DPM to something else is also not likely.

    Thanks

    Kindest Regards,

    Just say No to Facebook!
  • Replication probably wouldn't work in this scenario, nor would mirroring. Log shipping might work for you, but that involves a FULL backup plus T-log backups.

    Personally, I would still take a FULL backup even though you are using DPM. I like to have FULL backups on hand. Are you not sending backups to an "offsite" location?

    If you started taking a FULL backup, you could save yourself some time over your current restore, detach, compress methodology: take the FULL backup, then compress it as you are currently doing. If you are using a 3rd-party tool, e.g. RedGate, then you can compress as you back up.

    Steve

  • Does the data have to go over the wire, or could you just buy a few 1TB USB drives and send them back and forth? Sending a few hundred gigs of data over the net does no favors for any other users of those connections.

  • Ross McMicken (10/14/2010)


    Does the data have to go over the wire, or could you just buy a few 1TB USB drives and send them back and forth? Sending a few hundred gigs of data over the net does no favors for any other users of those connections.

    This is how we used to send the DB, but it took longer to ship it than it does to push it over the wire.

    Kindest Regards,

    Just say No to Facebook!
  • Steve-3_5_7_9 (10/14/2010)


    Replication probably wouldn't work in this scenario, nor would mirroring. Log shipping might work for you, but that involves a FULL backup plus T-log backups.

    Personally, I would still take a FULL backup even though you are using DPM. I like to have FULL backups on hand. Are you not sending backups to an "offsite" location?

    If you started taking a FULL backup, you could save yourself some time over your current restore, detach, compress methodology: take the FULL backup, then compress it as you are currently doing. If you are using a 3rd-party tool, e.g. RedGate, then you can compress as you back up.

    Steve

    To my understanding (the sys admins handle this part), all the backups of everything (not just SQL DBs) are duplicated off site.

    I pondered the idea of perhaps using log shipping. My main concern with any type of additional backup, be it a native SQL backup or some other third-party product like RedGate, is how (if at all) it may interfere with what is already in place. DPM DB backups aren't like native SQL Server backups, and my concern is that, since the DB is set to the FULL recovery model, a third-party backup done in addition to what DPM is doing may cause DPM to incorrectly see what has and has not been backed up since it (DPM) last did a log backup.

    Currently DPM backs up the log file every 30 minutes, and it does a full backup (called an Express backup) every 12 hours. The log file gets reset after each Express backup. If I throw in some other backup of the DB, be it a native SQL Server backup or one done with a third party like RedGate, will that:

    A) Interfere with DPM if the two should attempt to do a backup at the same time?

    B) Cause DPM to incorrectly see what has and what has not been backed up as far as the transaction log file goes?

    Thanks

    Kindest Regards,

    Just say No to Facebook!
  • I don't know much about DPM. I did find this at social.msdn.microsoft.com

    ===

    "DPM takes VDI backups not native backup. Full backups would not hurt, log backups would."

    ===

    So it looks like you couldn't conduct log shipping unless DPM can do it.

  • Steve-3_5_7_9 (10/14/2010)


    I don't know much about DPM. I did find this at social.msdn.microsoft.com

    ===

    "DPM takes VDI backups not native backup. Full backups would not hurt, log backups would."

    ===

    So it looks like you couldn't conduct log shipping unless DPM can do it.

    Thanks for sharing.

    Kindest Regards,

    Just say No to Facebook!
  • YSLGuru (10/12/2010)


    Change the recovery model to SIMPLE and shrink the log file so that it is as small as possible.

    Generally that works, but it doesn't always happen!

    7zip is free; use it to compress the native backup files. Alternatively, a solution such as Redgate SQL Backup is your other option.

    -----------------------------------------------------------------------------------------------------------

    "Ya can't make an omelette without breaking just a few eggs" 😉

  • - Why don't you just directly 7zip your existing full backup file(s) ?

    If you precede your fullbackup with a logbackup, wouldn't your fullbackup be smaller ?

    - using 7zip, you can split it into chuncks if you want and password protect it if you want.

    Usage: 7za <command> [<switches>...] <archive_name> [<file_names>...] [<@listfiles...>]

    <Commands>
      a: Add files to archive
      b: Benchmark
      d: Delete files from archive
      e: Extract files from archive (without using directory names)
      l: List contents of archive
      t: Test integrity of archive
      u: Update files to archive
      x: eXtract files with full paths

    <Switches>
      -ai[r[-|0]]{@listfile|!wildcard}: Include archives
      -ax[r[-|0]]{@listfile|!wildcard}: eXclude archives
      -bd: Disable percentage indicator
      -i[r[-|0]]{@listfile|!wildcard}: Include filenames
      -m{Parameters}: set compression Method
      -o{Directory}: set Output directory
      -p{Password}: set Password
      -r[-|0]: Recurse subdirectories
      -scs{UTF-8 | WIN | DOS}: set charset for list files
      -sfx[{name}]: Create SFX archive
      -si[{name}]: read data from stdin
      -slt: show technical information for l (List) command
      -so: write data to stdout
      -ssc[-]: set sensitive case mode
      -ssw: compress shared files
      -t{Type}: Set type of archive
      -v{Size}[b|k|m|g]: Create volumes
      -u[-][p#][q#][r#][x#][y#][z#][!newArchiveName]: Update options
      -w[{path}]: assign Work directory. Empty path means a temporary directory
      -x[r[-|0]]{@listfile|!wildcard}: eXclude filenames
      -y: assume Yes on all queries
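    For example, to compress a full backup into password-protected 1 GB volumes, using the -v and -p switches above (archive name, file name, and password are illustrative):

    7za a -t7z -v1g -pYourPassword YourDB.7z YourDB_full.bak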

    Johan

    Learn to play, play to learn !

    Dont drive faster than your guardian angel can fly ...
    but keeping both feet on the ground wont get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution 😀

    Need a bit of Powershell? How about this

    Who am I ? Sometimes this is me but most of the time this is me

  • ALZDBA (10/24/2010)


    If you precede your full backup with a log backup, wouldn't your full backup be smaller?

    No, I don't think a full backup will be any smaller for having backed up the log first: a full backup copies the allocated data pages plus just enough of the active log to make the restore consistent, so a prior log backup barely changes its size.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
