November 6, 2008 at 4:56 am
How do you guys handle moving large files across networks - not just local networks, but ones with a high enough RTT that FTP becomes a bit pointless?
Any good solutions out there - ideally ones allowing a TB to transfer in under a day...!
November 6, 2008 at 5:02 am
To be honest, I don't believe in any special software for this case; it always depends on the network connection between the servers. That is the base - if that is fast enough, then it's fine. I transfer a backup (around 30 files with a total size of 800 GB) each day, normally using scripts.
November 6, 2008 at 7:53 am
Try ROBOCOPY
Thanks!!
The_SQL_DBA
MCTS
"Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many alternatives."
November 6, 2008 at 7:55 am
Also try Network Data Mover. A colleague of mine used it to rebuild a crashed server; they pulled all the files overnight.
The_SQL_DBA
MCTS
"Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many alternatives."
November 6, 2008 at 8:00 am
FTP
* Noel
November 6, 2008 at 8:07 am
ROBOCOPY is good for moving files across a network. It has command line switch options for automatic retry and to resume a file copy at the point of failure, so it works well if the WAN is not perfectly stable.
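For example, a minimal sketch of that kind of call in a batch file (the share name, paths, and retry values below are only placeholders, not anything from this thread):
:: Restartable copy of backup files to a remote share, with retries on failure
robocopy "D:\Backups" \\remoteserver\Backups *.bak /Z /R:10 /W:30 /NP /LOG:D:\Logs\robocopy_backup.log
:: /Z     restartable mode - a file interrupted mid-copy resumes instead of starting over
:: /R:10  retry a failing file up to 10 times
:: /W:30  wait 30 seconds between retries
:: /NP    no per-file progress percentages (keeps the log readable)
:: /LOG:  write the results to a log file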
If you are moving database backup files, you should consider using a backup utility, like LiteSpeed, that compresses your backup files so that you don't have to move as much data across the WAN.
November 6, 2008 at 9:58 am
RichardB (11/6/2008)
How do you guys handle moving large files across networks - not just local networks, but ones with a high enough RTT that FTP becomes a bit pointless? Any good solutions out there - ideally ones allowing a TB to transfer in under a day...!
Is this to transfer backups off the server to a SAN, for backup purposes?
The first thing I'd look at is compressing the backup... HyperBac, Red Gate's tool, Idera's tool... anything... you'll find the DB backup will be much smaller and easier to move.
November 6, 2008 at 10:20 am
Did you check out backup compression software?
It typically reduces the size to roughly 20% of the original, so a 1 TB backup would come down to around 200 GB to push across the WAN.
Johan
Learn to play, play to learn!
Don't drive faster than your guardian angel can fly ...
but keeping both feet on the ground won't get you anywhere.
November 6, 2008 at 11:31 am
I use the application TeraCopy. It is very useful for the 50+ GB files that I need to move from server to server.
November 7, 2008 at 4:01 am
Hi,
I use SQL backup devices to back up a large database (> 4 GB in size).
example script
-- TEST1..TEST6 are logical backup devices created beforehand (e.g. with sp_addumpdevice),
-- each pointing at one of the .bak files referenced in the restore script below
BACKUP DATABASE [TEST] TO [TEST1], [TEST2], [TEST3], [TEST4], [TEST5], [TEST6] WITH INIT, NOUNLOAD, NAME = N'TEST backup', NOSKIP, STATS = 10, NOFORMAT
1. Split the backup into devices of 2-3 GB each.
2. Zip each backup device using WinZip.
3. Copy the zipped backup devices to the remote server.
4. Unzip the zipped backup devices on the remote server.
5. Run a restore from the backup devices on the remote server.
example script
-- Step 1. This checks that the physical and logical names match the
-- 'with move' clause in step 2. If they do, you can carry out step 2.
USE master
GO
restore filelistonly
from
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST1.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST2.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST3.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST4.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST5.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST6.bak'
-- Step 2
restore
database TEST
from
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST1.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST2.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST3.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST4.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST5.bak',
disk = 'd:\Microsoft SQL Server\mssql\backup\TEST6.bak'
with move 'TEST_Data' TO 'd:\microsoft sql server\mssql\data\TEST_data.mdf',
move 'TEST_Log' TO 'd:\microsoft sql server\mssql\data\TEST_Log.ldf'
It takes time but works every time.
If you have WinZip v11 onwards, you can use the WinZip command-line add-on to automatically zip the backup files after the backup is done.
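For what it's worth, steps 2 and 4 can be scripted in a batch file along these lines (this sketch uses 7-Zip's command line rather than WinZip's, and reuses the example paths above purely as placeholders; the WinZip add-on can be driven the same way):
:: Step 2 - zip each striped backup device once the backup has finished
cd /d "d:\Microsoft SQL Server\mssql\backup"
for %%f in (TEST*.bak) do "C:\Program Files\7-Zip\7z.exe" a "%%f.zip" "%%f"
:: Step 4 - on the remote server, extract the devices again before the restore
cd /d "d:\Microsoft SQL Server\mssql\backup"
for %%f in (TEST*.bak.zip) do "C:\Program Files\7-Zip\7z.exe" x "%%f" -y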
Adib
November 7, 2008 at 4:15 am
RichardB (11/6/2008)
How do you guys handle moving large files across networks - not just local networks, but ones with a high enough RTT that FTP becomes a bit pointless? Any good solutions out there - ideally ones allowing a TB to transfer in under a day...!
I follow the same approach as adibjafari.
kshitij kumar
kshitij@krayknot.com
www.krayknot.com
November 7, 2008 at 4:52 am
Well, we do something similar to Adib... the trouble is that when it's ten 100 GB files, it still hurts! We are testing Qpress to compress them - pretty impressive speed on that!
The compressed backup is obviously a potential winner - but even with it down 20%, what's the best tool to copy it?
Robocopy is solid, but slow. FTP is neither solid nor, with the latency/VPN stability we have, particularly fast, and it frequently ends up with corrupt files.
I see posts about EseUtil in another thread - has anyone tried it?
To the chap mentioning TeraCopy - how's that working out for you? Is that over a network with some latency?
Cheers all for input
Rich
November 7, 2008 at 7:32 am
I have been using TeraCopy for a few months now. So far, it has never failed me. I just finished copying a 99 GB file across my network and it completed successfully again. It has a nice interface that displays the percentage of completion and lists file names while copying. I'll never go back to Robocopy or an Explorer copy for a large file again. This works way too well.
November 7, 2008 at 8:38 am
I routinely have to back up remote production databases and then restore them to local test servers. It's simple to have the host machine do the full backup to a file via the command line. I then use the 7-Zip version found at PortableApps.com to compress the file and split it into chunks. At DCT we wrote an FTP Move utility that works from the command line. My command file does the backup, compresses the result, and uploads it to my FTP server.
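As a rough illustration of that kind of command file using stock tools (the DCT FTP Move utility is swapped here for the built-in Windows ftp client, and the server, database, credentials, and paths are all made-up placeholders):
:: 1. Full backup to a file from the command line
sqlcmd -S REMOTEHOST -E -Q "BACKUP DATABASE Prod TO DISK = 'D:\Backups\Prod.bak' WITH INIT"
:: 2. Compress the backup and split it into 500 MB chunks with the portable 7-Zip
7z a -v500m D:\Backups\Prod.7z D:\Backups\Prod.bak
:: 3. Upload the chunks with the built-in ftp client driven by a script file
(
  echo open ftp.example.com
  echo user myuser mypassword
  echo binary
  echo mput D:\Backups\Prod.7z.*
  echo quit
) > upload.ftp
ftp -i -n -s:upload.ftp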
ATB,
Charles Kincaid
November 9, 2008 at 4:44 pm
RichardB (11/7/2008)
Well, we do something similar to Adib... the trouble is that when it's ten 100 GB files, it still hurts! We are testing Qpress to compress them - pretty impressive speed on that! The compressed backup is obviously a potential winner - but even with it down 20%, what's the best tool to copy it?
Robocopy is solid, but slow. FTP is neither solid nor, with the latency/VPN stability we have, particularly fast, and it frequently ends up with corrupt files.
I see posts about EseUtil in another thread - has anyone tried it?
...
The advantage of ESEUTIL is that it doesn't flood the memory on the source host during the copy (as it uses unbuffered IO). Qpress's later versions also support an unbuffered IO mode, so you could potentially do your Qpress operation from the source host with the destination being the remote host, saving a round of local file writes then reads (compared to compressing to a local dest then moving/copying to remote).
Are you using a standard SMB copy operation to your destination currently? If so, and Qpress-directly-to-remote isn't an option, is there any chance you could use an SMB 2.0 capable host on both ends of the transfer (Win2008 or Vista)? I've not used it personally but it's apparently a lot more efficient than SMB 1.0 as far as round-trips are concerned, and more resilient to brief drops in connectivity.
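For anyone wanting to try the ESEUTIL route, the copy mode is a one-liner, something like the following (paths are placeholders and the switches are from memory, so check eseutil /? first; ESEUTIL ships with Exchange rather than Windows, so it has to be copied onto the source host):
:: Unbuffered copy straight to the remote share - avoids flooding the source host's file cache
eseutil /y "D:\Backups\Prod.bak" /d "\\remoteserver\Backups\Prod.bak"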
Regards,
Jacob