Need to copy backup files from one Server to another

  • Hi,

    I have an urgent requirement to copy database backup files from a source server to a destination server using the xcopy command. The backup job on the source server runs daily, so once that job completes, all database backups need to be moved to the destination server. The main concern is that the backup files on the destination server shouldn't be overwritten; they should be kept separately, since the source server job runs daily.

    We have a command that overwrites the backups on the destination server, but we need to keep backups on the destination for at least 4 weeks (i.e., the retention should be 4 weeks).

    Please assist me in completing the task ASAP.

  • Can you just change the destination of the backup files to a different location so that they do not overwrite any files that are already there?

  • I can think of several options:

    1. The easiest is to include the backup date/time in the backup filename. This generates unique filenames, so existing files won't be overwritten by the copy/move action to the destination server.

    2. You could copy/move the files to the destination server and place them there in separate folders (name each folder with the date/time).

    3. You could rename the files during the copy/move action to the destination server.

    Run a separate job to scan the files and delete them when they are older than the retention time (a sketch of the copy plus cleanup follows below).
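
    As a minimal PowerShell sketch of option 2 plus the cleanup job (not a finished implementation): the share names, the *.bak filter and the 28-day (4-week) window below are placeholders for illustration only.

    #Hypothetical source/destination shares -- adjust to your environment.
    $source    = '\\SourceServer\Backups'
    $dest      = '\\DestServer\Backups'
    $retention = (Get-Date).AddDays(-28)   #4-week retention

    #Option 2: drop today's backups into a date-named subfolder on the destination
    $target = Join-Path $dest (Get-Date -Format 'yyyyMMdd_HHmmss')
    New-Item -ItemType Directory -Path $target -Force | Out-Null
    Copy-Item -Path (Join-Path $source '*.bak') -Destination $target

    #Separate cleanup: delete destination files older than the retention window
    #(empty date folders are left behind; remove them as well if needed)
    Get-ChildItem -Path $dest -Recurse -File |
        Where-Object { $_.CreationTime -lt $retention } |
        Remove-Item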

    ** Don't mistake the ‘stupidity of the crowd’ for the ‘wisdom of the group’! **
  • HanShi (9/23/2014)


    1. The easiest is to include the backup date/time in the backup filename. This generates unique filenames, so existing files won't be overwritten by the copy/move action to the destination server.

    Run a separate job to scan the files and delete them when they are older than the retention time.

    I think this is the ideal option.

  • I have a vendor that does this (overwrites the same file every time), so I run a PowerShell script. You can use the MoveTo or CopyTo method. The script assumes the destination directory exists.

    Here's a stripped down untested version of it:

    #This script assumes you have r/w rights to both directories.
    $file = Get-Item '\\netapp1\sql_backup\SQLOrig\DatabaseName.FUL'

    #Build a unique, timestamped filename from the original name and its creation time
    $createDate = Get-Date -Date $file.CreationTime
    $createFile = "$($file.BaseName)_" + [string](Get-Date -Date $createDate -Format yyyy_MMdd_HHmmss) + $file.Extension
    $copyTo = "\\netapp2\sql_backup\SQLNew\$createFile"

    #Probably only need to do Test-Path if you use CopyTo.
    if (-not (Test-Path $copyTo)) {
        #$file.MoveTo($copyTo)
        $file.CopyTo($copyTo)
    }
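
    If every backup the nightly job produces needs to be handled rather than a single hard-coded file, a sketch along the same lines could loop over the source share; the share paths and the *.FUL filter here are assumptions, not actual paths.

    #Hypothetical paths/filter -- adjust to your environment.
    $sourceDir = '\\netapp1\sql_backup\SQLOrig'
    $destDir   = '\\netapp2\sql_backup\SQLNew'

    foreach ($file in Get-ChildItem -Path $sourceDir -Filter '*.FUL') {
        #Same rename-on-copy idea: append the creation timestamp to keep filenames unique.
        $createFile = "$($file.BaseName)_" + (Get-Date -Date $file.CreationTime -Format yyyy_MMdd_HHmmss) + $file.Extension
        $copyTo = Join-Path $destDir $createFile
        if (-not (Test-Path $copyTo)) {
            $file.CopyTo($copyTo)
        }
    }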

  • I've had good results with robocopy, which is included with Windows. It is really fast, it is smart enough to retry, I can control what to copy based on creation date, and I can even save a log.

    I used it a few years ago to copy big backup files (over 100 GB) over a WAN; it took a few days, but it worked.

    Here's the command I use inside a Windows batch file to copy backup files from one drive to another on the same server. It deletes everything from the destination that is older than 8 days and skips existing backup files, copying only the new stuff.

    ROBOCOPY D:\source S:\target *.* /Z /R:3 /LOG:C:\backup_scripts\mirror_backup.txt /MIR /MAXAGE:8
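
    Adapted to the original poster's requirement (remote destination, 4-week retention), a variant might look like the line below; the paths are placeholders, /MIR is dropped so nothing already on the destination gets purged, and /MAXAGE:28 only filters which source files get copied, so a separate cleanup job would still be needed to delete destination files older than 4 weeks.

    ROBOCOPY D:\SQLBackups \\DestServer\SQLBackups *.bak /Z /R:3 /MAXAGE:28 /LOG:C:\backup_scripts\copy_backup.txt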

    As usual, test in a dev environment before deploying.
