January 20, 2010 at 5:43 pm
I'm trying to create one SSIS package that I can deploy to all servers to use for backing up databases. The package's behavior will be controlled by variables, and it will be run by a job.
The last step I'm trying to do is to copy the backup files using an Execute Process task to run Robocopy which copies the local backup files to a file server.
The problem I'm running into is that Robocopy can end with a return code of 0-7, all of which indicate success, and 8-16 for combinations of actions where at least one action failed. The Execute Process task can only check a single return code, and then only for an exact match.
Short of dynamically generating a batch file to run Robocopy (as in http://www.sqlservercentral.com/Forums/FindPost709542.aspx), is there any other way to use Robocopy in this SSIS package so that return codes < 8 will allow the job to succeed, while return codes > 7 will fail it?
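For context, Robocopy's return code is a bitmask: 1 = files were copied, 2 = extra files/directories were detected, 4 = mismatches were found, 8 = some files or directories could not be copied, 16 = serious error. So the whole check reduces to a single threshold test, sketched here in Python purely for illustration (the wrapper in the linked post is a .BAT file):

```python
# Robocopy's exit code is a bitmask; bits 1, 2, and 4 describe
# normal outcomes, while 8 and 16 indicate failures. Any code
# below 8 is therefore a successful run.
FAILURE_THRESHOLD = 8

def robocopy_succeeded(return_code: int) -> bool:
    """True for return codes 0-7 (success), False for 8 and above."""
    return return_code < FAILURE_THRESHOLD
```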
Thanks,
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
March 8, 2010 at 11:12 am
I have been looking into this same issue. I've been experimenting with a PowerShell script that executes Robocopy, but I'm not having much luck with it so far.
Have you found any solutions?
March 8, 2010 at 11:26 am
What I reverted to was having the servers do their backups locally. Then, on the file servers where the backups are stored, a scheduled task runs a batch file that copies all of the backups to their respective directories on the file server. The batch files end with the error-level checking shown in the post I referenced originally.
On some of the SQL servers, we have large SAN drives that we copy the backups to from other servers. On those servers, I run the batch file as a SQL Agent job.
On most of these, I utilize the Robocopy /MIR option. However, some of my older servers don't have enough disk space to store multiple backups locally. For those servers, I skip the /MIR option and follow up the backup with a call to ForFiles.exe to delete files older than x days. This way, the file server always has 2-4 weeks of backups (depending on backup frequency), while the local server has at least the most recent backup.
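The ForFiles step amounts to "delete files older than N days in a directory." A rough Python equivalent of that logic (an illustrative sketch with a hypothetical function name, not the actual scheduled job):

```python
import os
import time

def delete_old_backups(directory, days, suffix=".bak"):
    """Delete files ending in `suffix` whose last-modified time is
    more than `days` days old, mirroring what ForFiles's /D -days
    check does. Returns the names of the files it removed."""
    cutoff = time.time() - days * 86400  # N days, in seconds
    removed = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if (name.endswith(suffix) and os.path.isfile(path)
                and os.path.getmtime(path) < cutoff):
            os.remove(path)
            removed.append(name)
    return removed
```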
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
March 8, 2010 at 12:00 pm
I work with a number of environments where the backups are not done locally at all; they are all written to a fileshare on a file server that is backed by SAN, and they all get backed up from there. This method takes longer to do the backup, but so far it is still faster overall when you consider that otherwise you have to back up to a local file system (usually also backed by SAN) and then copy the file elsewhere.
If the backup was successful, you know it was "copied" as well.
CEWII
March 8, 2010 at 2:29 pm
The problem we ran into is that our file server had a local hard drive crash and was down for 3 days while the entire system was rebuilt. Going without backups for that long is unacceptable, so backups are now done locally (so we know they get done... and the backup is faster too), and then each backup is copied to the file server BY the file server. If it's down, then when it comes back up it runs the batch files to copy the backups from the servers to the file server.
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
March 8, 2010 at 2:42 pm
I guess the batch file is the only way to go; either that, or just let the job step fail. I really like the Robocopy method because it will copy the latest fulls as well as the transaction log backups. I've just been struggling to find a good way to implement it.
I've been trying to develop a simple DR solution for quite some time. It went from mirroring to log shipping to a few SQL Agent jobs. I'm trying to implement a system that is as simple as possible and have all our SQL data replicated to a DR server. It doesn't have to be replicated immediately or failed over automatically; that's why the SQL Agent jobs with some SPs seemed like the best fit for us.
March 8, 2010 at 3:02 pm
Wayne,
I would have to take a little different approach: it is unacceptable for the file server to be down for 3 days.
CEWII
March 8, 2010 at 8:52 pm
Elliott W (3/8/2010)
I work with a number of environments where the backups are not done locally at all; they are all written to a fileshare on a file server that is backed by SAN, and they all get backed up from there. This method takes longer to do the backup, but so far it is still faster overall when you consider that otherwise you have to back up to a local file system (usually also backed by SAN) and then copy the file elsewhere. If the backup was successful, you know it was "copied" as well.
CEWII
I agree with Elliott; I don't think it is a good idea to copy backup files locally and then copy them to a fileshare. If you are doing this, you are wasting a lot of resources. We don't even copy backup files across the Dev environment for refreshes, because we don't want to waste bandwidth. We restore the Dev environment from the Prod fileshare; this is handled at the firewall level, as our network team opened a port one way so the Prod fileshare is visible from the Dev environment.
As Elliott mentioned, reads/writes to the SAN are much faster than to the local file system.
EnjoY!
March 8, 2010 at 9:04 pm
WayneS (1/20/2010)
Short of dynamically generating a batch file to run Robocopy (as in http://www.sqlservercentral.com/Forums/FindPost709542.aspx), is there any other way to use Robocopy in this SSIS package so that return codes < 8 will allow the job to succeed, while return codes > 7 will fail it?
Why does it have to be dynamic? I don't think that the one at that thread is. Can't you just write a static BAT file and pass it the file names and paths as parameters?
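A static wrapper only needs to run Robocopy with the paths passed in and collapse the exit code into plain pass/fail for the caller (in batch that's essentially `robocopy %1 %2` followed by `if errorlevel 8 exit /b 1`). The same idea sketched in Python, with the paths and the `normalized_call` name being illustrative assumptions:

```python
import subprocess

# Robocopy exit codes below 8 are success combinations; 8 and above
# mean at least one file or directory failed to copy.
FAILURE_THRESHOLD = 8

def normalized_call(cmd):
    """Run cmd and collapse its exit code for the calling job:
    0 if the tool exits below 8, 1 otherwise, so an Execute
    Process task can test for a single success value."""
    rc = subprocess.call(cmd)
    return 0 if rc < FAILURE_THRESHOLD else 1

# A job step would then run something like (hypothetical paths):
#   normalized_call(["robocopy", r"D:\Backups", r"\\files\Backups", "/MIR"])
```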
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."