This article describes an example in which a script runs once a day, taking a full backup, verifying it, copying it across the LAN, creating an archive and sending email notifications. There is also an option to upload the backup to an FTP server.
I believe that a solid database backup and recovery procedure is the first and most important step in protecting organizational data from loss due to a crashed hard drive or a failed RAID controller on the data server. My preferred method of performing backups is PowerShell with SQL Server Management Objects (SMO), executed from the Windows Task Scheduler. I will walk through an example below.
In my example, the database backup routine runs on a server separate from the data server. This separate server can be a domain and file server, performing important duties for the LAN such as serving files, Active Directory, DNS or Exchange. The database backups on this server can then run alongside file, Exchange and system state backups: Exchange backups preserve email, while system state backups capture registry settings, Active Directory data and other important system configuration. Having all of these backups together in one place makes the recovery process easier to manage.
PowerShell and the .NET SMO assemblies must be installed on the server for this script to run. PowerShell ships with Windows Server 2008, but earlier versions of Windows Server require it to be installed separately. SMO is installed with SQL Server 2008, but it too must be installed separately on earlier versions of SQL Server, or when, as in our case, the script runs on a machine other than the data server, where SQL Server may not be installed at all.
The task must run under an account with the appropriate permissions: it needs rights to back up the database and to write the backup files to the destination folders. Running the task as a domain administrator achieves this easily, although a dedicated account with only the required rights is the more secure choice.
I will now explain each part of the script:
The first part of the script references the SMO assemblies and extensions:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
Then we store the current date and use it to build a dated archive file name:
$a = Get-Date
$b = "dbName" + $a.Day + "-" + $a.Month + "-" + $a.Year + ".bak.zip"
Next we delete the previous backup on the data server:
Remove-Item \\dataserver\folder\dbName.bak
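As written, Remove-Item raises an error if the file does not exist, for example on the very first run. A small sketch of a guarded version, using the same example path as above:

```powershell
# Only delete the previous backup if it actually exists,
# so a first run (or a prior failed run) does not raise an error.
if (Test-Path \\dataserver\folder\dbName.bak) {
    Remove-Item \\dataserver\folder\dbName.bak
}
```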
Now we create a server object
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ("dataserver")
We must create a backup object to perform the backup, and a restore object to verify that the backup is valid. We use the SMO objects below to do this:
$dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
$dbRestore = New-Object ("Microsoft.SqlServer.Management.Smo.Restore")
The name of the database is specified by assigning it to the Database property:
$dbBackup.Database = "dbName"
Now we can specify the backup file, which must reside on the data server.
$dbBackup.Devices.AddDevice("C:\folder\dbName.bak", "File")
We wish to back up the database, not the transaction logs, so we specify this:
$dbBackup.Action="Database"
Next we set the backup file to contain a single backup set. This is equivalent to WITH INIT in the Transact-SQL BACKUP statement. Strictly speaking this is unnecessary here, since we already deleted the previous backup file.
$dbBackup.Initialize = $TRUE
Now we perform the actual backup
$dbBackup.SqlBackup($server)
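SqlBackup throws an exception if the backup fails outright, and an unhandled exception would let the rest of the script carry on. A sketch of wrapping the call in try/catch, in addition to the file-existence check that follows (placeholders "emailserver", "from" and "to" are the same ones used throughout the article):

```powershell
# If SqlBackup throws (connection failure, permissions, full disk, etc.),
# email a notification and stop rather than continuing the script.
try {
    $dbBackup.SqlBackup($server)
}
catch {
    $smtp = New-Object Net.Mail.SmtpClient("emailserver")
    $smtp.Send("from", "to", "Backup threw an exception", $_.Exception.Message)
    Exit
}
```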
After the backup has run, we check whether the new backup file exists. Since the previous backup was deleted, the existence of this file is the first indication that the backup succeeded. If it is missing, we send an email notification and exit:
if (!(Test-Path \\dataserver\folder\dbName.bak)) {
    $smtp = New-Object Net.Mail.SmtpClient("emailserver")
    $smtp.Send("from", "to", "Backups not working", "Action required immediately for Full Backup")
    Exit
}
We then verify that the backup is valid; this is the second check that the backup completed successfully. The SqlVerify method is the equivalent of RESTORE VERIFYONLY in Transact-SQL: it returns true if the backup is valid and false otherwise. If the backup is not valid, we send an email notification and exit:
$dbRestore.Devices.AddDevice("C:\folder\dbName.bak", "File")
if (!($dbRestore.SqlVerify($server))) {
    $smtp = New-Object Net.Mail.SmtpClient("emailserver")
    $smtp.Send("from", "to", "Backups not valid", "Action required immediately for Full Backup")
    Exit
}
Having verified the new backup file, we copy it from the data server to the local domain server. The backup now exists on two physical machines, so the loss of either machine does not lose the backup.
Copy-Item \\dataserver\folder\dbName.bak C:\Data\dbName.bak
Next we create the daily archive zip. SQL Server 2008 offers a compression option for backups, but this is not available in earlier versions. Because we perform the compression in PowerShell, we can zip backups from earlier versions of SQL Server as well. The Write-Zip cmdlet (from the PowerShell Community Extensions) compresses the backup to cut back on storage space:
Write-Zip C:\Data\dbName.bak -OutputPath C:\Archives\$b
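Since a new dated archive is created every day, the archive folder grows without bound. A hypothetical retention sketch, assuming a 30-day window (adjust to your own retention policy):

```powershell
# Delete archives older than 30 days so C:\Archives does not fill the disk.
# The 30-day cutoff is an assumption, not part of the original script.
$cutoff = (Get-Date).AddDays(-30)
Get-ChildItem C:\Archives -Filter *.bak.zip |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item
```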
The last part of the script emails the completion time. This is a useful data point to keep an eye on, to build up a picture of how long the backup should take:
$c = Get-Date -format t
$smtp = New-Object Net.Mail.SmtpClient("emailserver")
$smtp.Send("from", "to", "Database Backup Finish Time", $c)
Another option for the script is to FTP the backup files to an FTP server. We could upload the compressed archive with code along these lines:
$location = "C:\Archives\" + $b
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri("FTP-server-site")
$webclient.UploadFile($uri, $location)
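In practice the upload usually needs a little more: most FTP servers require credentials, and the URI passed to UploadFile must include the target file name. A fuller sketch, with a hypothetical server name and account:

```powershell
# Hypothetical FTP upload: server name, user name and password are
# placeholders and must be replaced with your own values.
$location = "C:\Archives\" + $b
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential("ftpUser", "ftpPassword")
$uri = New-Object System.Uri("ftp://ftp.example.com/" + $b)
$webclient.UploadFile($uri, $location)
```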
This script performs a full backup, ideally run at the end of the business day. The file backup service can then copy the archive onto an external hard drive, to be taken offsite as soon as the backup completes or on the following day.
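As mentioned at the start, the script runs from the Windows Task Scheduler. One way to register the nightly run from the command line (script path, task name, time and account are all hypothetical):

```powershell
# Register a daily 18:00 run of the backup script; /rp * prompts for the
# account password when the task is created.
schtasks /create /tn "Full Database Backup" /tr "powershell.exe -File C:\Scripts\FullBackup.ps1" /sc daily /st 18:00 /ru DOMAIN\backupAccount /rp *
```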
During the day I run hourly differential backups, and transaction log backup every ten minutes. All of these are copied to a network folder and an external hard drive in similar fashion. If the database server crashes, we can restore backups to another physical machine with SQL Server pre-configured, and be up and running within minutes.
To restore from a single full backup, run this Transact-SQL command from Management Studio:
RESTORE DATABASE dbName FROM DISK = 'C:\folder\dbName.bak' WITH RECOVERY, REPLACE
This restores database dbName on the alternate physical data server to the point when the full backup was taken. Note that the backup file must first be copied to the destination folder on the new database server. The script could easily run on the database server itself, but as stated, I like to have it running alongside file and system state backups from our domain server; having all backups running in one location makes the backup process easier to manage.
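The same restore can also be scripted with SMO, mirroring the backup code earlier in the article. A minimal sketch, using the article's example server and path; the ReplaceDatabase property corresponds to the REPLACE option in Transact-SQL:

```powershell
# Restore the full backup via SMO. Server name, database name and file
# path are the article's examples and should be adjusted to your setup.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ("dataserver")
$dbRestore = New-Object ("Microsoft.SqlServer.Management.Smo.Restore")
$dbRestore.Database = "dbName"
$dbRestore.Devices.AddDevice("C:\folder\dbName.bak", "File")
$dbRestore.ReplaceDatabase = $TRUE   # equivalent to WITH REPLACE
$dbRestore.SqlRestore($server)
```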
This is a simple solution, but one I believe is very effective. It suits configurations where a few minutes of downtime is acceptable in the event of disaster. For organizations where even a few minutes of downtime is not acceptable, more complex configurations such as database mirroring, log shipping or clustering are needed, as they come much closer to continuous availability. That extra complexity is harder to manage and demands a broader skill set from the database administrator. This same backup procedure can run alongside those systems as well.
With the example given, we can read our email each day to see how the database backups are performing. The whole script follows below.
Cheers,
Jim
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

$a = Get-Date
$b = "dbName" + $a.Day + "-" + $a.Month + "-" + $a.Year + ".bak.zip"

Remove-Item \\dataserver\folder\dbName.bak

$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ("dataserver")
$dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
$dbRestore = New-Object ("Microsoft.SqlServer.Management.Smo.Restore")

$dbBackup.Database = "dbName"
$dbBackup.Devices.AddDevice("C:\folder\dbName.bak", "File")
$dbBackup.Action = "Database"
$dbBackup.Initialize = $TRUE
$dbBackup.SqlBackup($server)

if (!(Test-Path \\dataserver\folder\dbName.bak)) {
    $smtp = New-Object Net.Mail.SmtpClient("emailserver")
    $smtp.Send("from", "to", "Backups not working", "Action required immediately for Full Backup")
    Exit
}

$dbRestore.Devices.AddDevice("C:\folder\dbName.bak", "File")
if (!($dbRestore.SqlVerify($server))) {
    $smtp = New-Object Net.Mail.SmtpClient("emailserver")
    $smtp.Send("from", "to", "Backups not valid", "Action required immediately for Full Backup")
    Exit
}

Copy-Item \\dataserver\folder\dbName.bak C:\Data\dbName.bak
Write-Zip C:\Data\dbName.bak -OutputPath C:\Archives\$b

$location = "C:\Archives\" + $b
$webclient = New-Object System.Net.WebClient
$uri = New-Object System.Uri("FTP-server-site")
$webclient.UploadFile($uri, $location)

$c = Get-Date -format t
$smtp = New-Object Net.Mail.SmtpClient("emailserver")
$smtp.Send("from", "to", "Database Backup Finish Time", $c)