June 6, 2013 at 1:50 am
Comments posted to this topic are about the item Quickly Copy Data
June 6, 2013 at 3:43 am
Thanks, if for no other reason than the extremely useful general technical advice on copying large files.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
June 6, 2013 at 5:40 am
I get frustrated with the "use this free tool" approach. I'd pay for a tool that offered reliably high lossless compression capability.
The other point I'd make is that people are so used to copying stuff around their local infrastructure that they get sloppy.
If you were talking about synchronising DBs then rather than copying large backup files I'd look at log shipping the changes.
At a file level, perhaps we should have a facility that concentrates on keeping a catalogue of file timestamps and shifting that catalogue around. Concentrate on syncing the metadata, not the data.
If someone wants a particular file, the catalogue is checked to see whether an update has taken place in the past 'x' minutes. If so, sync the file; if not, skip straight to serving the file.
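Roughly, the check could look something like this (a minimal sketch in Python; the catalogue format, the 'x'-minute window, and the sync_file/serve_file helpers are all hypothetical, not an existing tool):

```python
# Minimal sketch of the catalogue idea: keep a small catalogue of file
# timestamps, and only re-sync a file when the catalogue says it changed
# recently. CATALOGUE_PATH, sync_file and serve_file are illustrative.
import json
import time

CATALOGUE_PATH = "catalogue.json"   # {"path/to/file": last_modified_epoch}
FRESHNESS_WINDOW = 15 * 60          # the 'x' minutes, in seconds

def load_catalogue():
    with open(CATALOGUE_PATH) as f:
        return json.load(f)

def request_file(path, sync_file, serve_file):
    """Check the catalogue before deciding whether to re-sync a file."""
    catalogue = load_catalogue()
    last_modified = catalogue.get(path, 0)
    if time.time() - last_modified < FRESHNESS_WINDOW:
        sync_file(path)   # an update happened recently: refresh our copy
    serve_file(path)      # either way, serve the (now current) file
```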
June 6, 2013 at 5:49 am
The one issue I have is on large databases when upgrading from one version of SQL Server to another on a different server. Log shipping won't help there due to the different versions of SQL Server. The nice thing about SQL 2008 R2 Standard Edition is the compressed backup. I LOVE that feature, and it is free with Standard Edition. Taking databases from SQL 2000 and 2005 to 2008 R2 and seeing the backup run much faster and come out 80% smaller is great.
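For reference, a compressed backup is just the ordinary BACKUP statement with the COMPRESSION option. A minimal sketch of driving it from a Python script via pyodbc (the connection string, database name, and backup path are placeholders, not from the post):

```python
# Minimal sketch: run a compressed backup from Python via pyodbc.
# Connection string, database name, and backup path are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=myserver;Trusted_Connection=yes;",
    autocommit=True,   # BACKUP cannot run inside a user transaction
)
cursor = conn.cursor()
cursor.execute(
    "BACKUP DATABASE [MyDatabase] "
    "TO DISK = N'D:\\Backups\\MyDatabase.bak' "
    "WITH COMPRESSION, INIT"
)
# BACKUP reports progress in informational messages; drain the result
# sets so the statement finishes before the connection is closed.
while cursor.nextset():
    pass
conn.close()
```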
June 6, 2013 at 7:28 am
Put the files on a USB drive and drop it in the mail or get in your car and drive them to the other location. :hehe:
June 6, 2013 at 7:32 am
I have mailed a USB drive to get log shipping started as well. Overnight shipping is about a 14-hour window, and the local connections were substantially faster than the remote ones.
As for the complaint about free tools, a robust free tool like 7-Zip is likely to cause less friction than a commercial product. You are moving files between networks, so the more open the tool, the less likely you are to run into an unnecessary gotcha.
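For example, compressing a backup with the 7-Zip command line before the copy is a one-liner to script (a minimal sketch in Python; it assumes 7z is on the PATH, and the file names are placeholders):

```python
# Minimal sketch: compress a backup with the 7-Zip command line before
# copying it between networks. Assumes "7z" is on the PATH.
import subprocess

backup_file = "MyDatabase.bak"
archive = "MyDatabase.7z"

# "7z a" adds files to an archive; -mx=5 is the default compression level,
# trade it up or down depending on CPU versus bandwidth.
subprocess.run(["7z", "a", "-mx=5", archive, backup_file], check=True)
```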
June 6, 2013 at 7:34 am
Robert.Sterbal (6/6/2013)
....so the more open the tool, the less likely you are to run into an unnecessary gotcha.
What was that? Flame on? Seriously: I believe it totally depends on the tool and its use.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
June 6, 2013 at 7:43 am
My experience with tools that have to last a long time (and I've been compressing files for 20+ years) is that open source projects offer substantial benefits when you have to be interoperable. Your mileage may vary.
June 6, 2013 at 7:44 am
krowley (6/6/2013)
Put the files on a USB drive and drop it in the mail or get in your car and drive them to the other location. :hehe:
I was about to suggest the same thing. Sometimes the network connection between two remote locations just can't beat transporting a single 2 TB "packet" of data at 65 MPH. That's called thinking outside the box.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
June 6, 2013 at 7:49 am
Robert.Sterbal (6/6/2013)
My experience with tools that have to last a long time (and I've been compressing files for 20+ years) is that open source projects offer substantial benefits when you have to be interoperable. Your mileage may vary.
I am certainly not disagreeing with your experience. It just seemed a little "black and white".
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
June 6, 2013 at 7:56 am
The expression I grew up with is, "Never underestimate the bandwidth of a station wagon full of tapes, speeding down the highway."
June 6, 2013 at 8:18 am
Looks like we need to introduce deduplication to the DBMS world. In the virtual world, VMware and Hyper-V keep only one copy of the same block in memory, and when doing backups for that matter. Granted, we have transaction log and differential backups, but deduplication for database backups should be built into the backup subsystem. So if only 1% of the database blocks have changed, the backup command should back up only those blocks. During restores, ALL of the blocks should be retrieved without DBA intervention.
Backup and restore is an area where SQL Server isn't even where DB2 was in V3, back in the early 1990s.
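To illustrate the changed-block idea, here is a minimal sketch in Python that hashes fixed-size blocks and reports which ones changed since the last backup. It is only a conceptual illustration; it is not how SQL Server, DB2, or any backup product actually implements deduplication.

```python
# Minimal sketch of changed-block detection: hash fixed-size blocks of a
# file and compare against the hashes recorded at the previous backup, so
# only blocks whose hash changed would need to be copied.
import hashlib

BLOCK_SIZE = 64 * 1024  # 64 KB blocks, an arbitrary choice for the sketch

def block_hashes(path):
    """Return a list of SHA-256 digests, one per fixed-size block."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

def changed_blocks(path, previous_hashes):
    """Indices of blocks that differ from the previous backup's hashes."""
    current = block_hashes(path)
    return [i for i, h in enumerate(current)
            if i >= len(previous_hashes) or h != previous_hashes[i]]
```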
June 6, 2013 at 8:25 am
I copy SQL Server backups over the Internet. I have two remote servers connected using a (paid-for) LogMeIn Hamachi2 peer-to-peer VPN. I then transfer the files over the VPN using rsync (a Linux application not used much in the Windows world, but available as part of the Cygwin install). No need to compress the backups - just let rsync (client/server) work out which blocks have changed and it will transmit only the differences.
It's all command-line stuff, so create a script and launch it from another step in your SQL backup job, following the actual backup.
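Something along these lines (a minimal sketch in Python; the host, paths, and flag choices are placeholders, not the poster's actual script):

```python
# Minimal sketch: push a backup over the VPN with rsync, letting its delta
# algorithm send only the changed parts of the file. Host and paths are
# placeholders; this is the sort of script a SQL Agent job step could run.
import subprocess

source = "/cygdrive/d/Backups/MyDatabase.bak"
destination = "backupuser@10.0.0.2:/backups/MyDatabase.bak"

subprocess.run(
    [
        "rsync",
        "-av",          # archive mode, verbose
        "--inplace",    # update the remote file in place so deltas apply
        "--partial",    # keep partial transfers if the link drops
        source,
        destination,
    ],
    check=True,
)
```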