April 18, 2018 at 11:50 pm
Comments posted to this topic are about the item Do you check your backups?
April 18, 2018 at 11:56 pm
Thanks for the article! Great information. Could you share the PowerShell script? Thanks
April 19, 2018 at 1:00 am
Regarding the cost, don't ask if you can afford it, ask if you can afford not to
Cheers
April 19, 2018 at 2:28 am
sqlfriends - Wednesday, April 18, 2018 11:56 PM: Thanks for the article! Great information. Could you share the PowerShell script? Thanks
Here you go. May need a bit of tweaking.
# CopyBackupFilesToAzure
#
# You need to supply the following mandatory parameters
#
# SourceDirectory    - The directory where backups are kept. E.g. E:\BACKUP\MyServerName
# ServerName         - The name of the server / instance
# BackupType         - Backup type to copy (options are FULL, LOG, DIFF, MySQL)
# StorageAccountName - The name of the Azure storage account
# StorageAccountKey  - The key to access the storage account
#
# Usage
#
# .\CopyBackupFilesToAzure -SourceDirectory "E:\Backup\MyServerName" -ServerName "MyServer" -BackupType "LOG" -StorageAccountName "myazurestorage" -StorageAccountKey "mykey"
param
(
    [Parameter(Mandatory=$true)]
    [string]$SourceDirectory,
    [Parameter(Mandatory=$true)]
    [string]$ServerName,
    [Parameter(Mandatory=$true)]
    [string]$BackupType,
    [Parameter(Mandatory=$true)]
    [string]$StorageAccountName,
    [Parameter(Mandatory=$true)]
    [string]$StorageAccountKey
)

# Build a storage context, retrying up to three times before giving up
Function Generate-StorageContext
{
    param
    (
        [Parameter(Mandatory=$true)]
        [string]$StorageAccountName,
        [Parameter(Mandatory=$true)]
        [string]$StorageAccountKey
    )
    $TryCounter = 0
    $StorageContext = $null
    WHILE ($TryCounter -lt 3 -and !$StorageContext)
    {
        # SilentlyContinue (not Stop) so a failed attempt doesn't terminate the script before we can retry
        $StorageContext = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey -ErrorAction SilentlyContinue
        $TryCounter = $TryCounter + 1
        if (!$StorageContext)
        {
            Start-Sleep -Seconds 45
        }
    }
    Return $StorageContext
}

# Check the container exists; if not, try to create it and re-check
Function CheckForCreateContainer
{
    param
    (
        [Parameter(Mandatory=$true)]
        [object]$SContext,
        [Parameter(Mandatory=$true)]
        [string]$SContainer
    )
    $ContainerNotThere = $null
    $null = Get-AzureStorageContainer -Context $SContext -Name $SContainer -ErrorAction SilentlyContinue -ErrorVariable ContainerNotThere
    if ($ContainerNotThere)
    {
        $null = New-AzureStorageContainer -Name $SContainer -Context $SContext -ErrorAction SilentlyContinue
        Start-Sleep -Seconds 10
        $ContainerNotThere = $null
        $null = Get-AzureStorageContainer -Context $SContext -Name $SContainer -ErrorAction SilentlyContinue -ErrorVariable ContainerNotThere
    }
    if ($ContainerNotThere)
    {
        $ContainerStatus = "Unable to find / create container"
    }
    else
    {
        $ContainerStatus = "Exists"
    }
    Return $ContainerStatus
}

# Copy a file to blob storage unless it is already there (or overwrite is requested)
Function CheckForAndTransferFileToBlobStorage
{
    param
    (
        [Parameter(Mandatory=$true)]
        [object]$SContext,
        [Parameter(Mandatory=$true)]
        [string]$SContainer,
        [Parameter(Mandatory=$true)]
        [string]$SFileName,
        [Parameter(Mandatory=$true)]
        [string]$SBackupURI,
        [Parameter(Mandatory=$false)]
        [boolean]$SOverWrite = $false
    )
    $FileDoesNotExist = $null
    $null = Get-AzureStorageBlob -Blob $SBackupURI -Container $SContainer -Context $SContext -ErrorAction SilentlyContinue -ErrorVariable FileDoesNotExist
    if ($FileDoesNotExist -or $SOverWrite)
    {
        $CopyToAzureError = $null
        if ($SOverWrite)
        {
            $null = Set-AzureStorageBlobContent -File $SFileName -Container $SContainer -Blob $SBackupURI -Context $SContext -Force -ErrorAction SilentlyContinue -ErrorVariable CopyToAzureError
        }
        else
        {
            $null = Set-AzureStorageBlobContent -File $SFileName -Container $SContainer -Blob $SBackupURI -Context $SContext -ErrorAction SilentlyContinue -ErrorVariable CopyToAzureError
        }
        if ($CopyToAzureError)
        {
            Return "Copy Error [$SBackupURI] to container [$SContainer]"
        }
        else
        {
            Return "Success"
        }
    }
    else
    {
        Return "File In Storage"
    }
}

#------------------------------------------------------ START ------------------------------------------------------#
$ProgressPreference = 'SilentlyContinue'

# Check we have one of the allowed backup types
if ($BackupType -notin "FULL", "DIFF", "LOG", "MySQL")
{
    Write-Output "ERROR - You can only specify FULL, DIFF, LOG or MySQL for BackupType"
    Exit 1
}
$ContainerName = $ServerName.ToLower() + "-" + $BackupType.ToLower()

# Generate the storage context
Write-Output "Generating Storage Context For Storage Account [$StorageAccountName]"
$StorageContext = Generate-StorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# Check to see if the container exists for the storage account. If not, create it.
Write-Output "Checking for container [$ContainerName] in Storage Account [$StorageAccountName] (Will attempt create if container not present)"
$ContainerStatus = CheckForCreateContainer -SContext $StorageContext -SContainer $ContainerName
if ($ContainerStatus -ne "Exists")
{
    Write-Output "-----------------------------------------------------------------------------------"
    Write-Output "Container does not exist for the specified storage account and could not be created"
    Write-Output "-----------------------------------------------------------------------------------"
    Exit 1
}

# Set the file extension and directory part for this backup type
switch ($BackupType)
{
    FULL  {$FileExtension = '.bak'; $DirectoryPart = '*\FULL'}
    DIFF  {$FileExtension = '.bak'; $DirectoryPart = '*\DIFF'}
    LOG   {$FileExtension = '.trn'; $DirectoryPart = '*\LOG'}
    MYSQL {$FileExtension = '.sql'; $DirectoryPart = '*\MySQL*'}
}

# Get the list of files based on the file extension and backup type
$FileListing = Get-ChildItem -Path $SourceDirectory -Recurse | Where-Object {!$_.psIsContainer -and $_.Extension -eq "$FileExtension" -and $_.DirectoryName -like $DirectoryPart}

# For each file, check if it is already in the Azure storage account and transfer it if not
foreach ($File in $FileListing)
{
    $SourceFileName = $File.FullName
    $StorageAccountSubDirectory = (((($File.DirectoryName.Replace("$SourceDirectory","")).SubString(1).ToLower()) -replace "\\","") -replace $BackupType,"")
    $StorageURI = ($StorageAccountSubDirectory + "`\" + ($File.Name)).replace("\\","\$BackupType\").ToLower()
    $FileCopyStatus = CheckForAndTransferFileToBlobStorage -SContext $StorageContext -SContainer $ContainerName -SFileName $SourceFileName -SBackupURI $StorageURI
    Write-Output "Processing File [$SourceFileName] : Status [$FileCopyStatus]"
}
April 19, 2018 at 2:37 am
Nice, we don't use Azure, but backups are tested regularly for creating internal data sets as well as scheduled tests from tape.
As well as verifying that the backups can be restored, it's useful to test your own ability to restore, rather than relying solely on the automated verification. Under pressure, you'll be glad you can put together all the restore commands!
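For example (the instance, database and file names below are made up for illustration), the kind of restore sequence you'd want to be able to put together from memory might look like:

```powershell
# Hypothetical example - replace the instance, database and backup paths with your own.
# Restore full, then diff, then log, leaving the database in NORECOVERY until the last step.
Invoke-Sqlcmd -ServerInstance "MyServer" -Query @"
RESTORE DATABASE [MyDatabase] FROM DISK = N'E:\Backup\MyServer\FULL\MyDatabase_FULL.bak' WITH NORECOVERY, REPLACE;
RESTORE DATABASE [MyDatabase] FROM DISK = N'E:\Backup\MyServer\DIFF\MyDatabase_DIFF.bak' WITH NORECOVERY;
RESTORE LOG [MyDatabase] FROM DISK = N'E:\Backup\MyServer\LOG\MyDatabase_LOG.trn' WITH RECOVERY;
"@
```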
April 19, 2018 at 3:31 am
Thank you very much for this article.
Would it be possible for you to post a general ball-park estimate of the costs ?
Just something like x$ per database per GB per month
TIA
Henrik
April 19, 2018 at 3:53 am
henrik staun poulsen - Thursday, April 19, 2018 3:31 AM: Thank you very much for this article.
Would it be possible for you to post a general ball-park estimate of the costs? Just something like x$ per database per GB per month
TIA
Henrik
Hi
For the virtual machine testing the backups in the UK, with SQL Server Standard, an L4 VM (4 cores, 32 GB RAM, 678 GB SSD temp storage) and a Premium 128 GB managed OS disk, running on average 104 minutes per day, it comes out at around £60 per month.
For our backup storage it depends. Based on a 5 GB database backup and keeping 31 daily backups, 104 weekly backups and 96 monthly backups, we would be looking at £18 per month. This is stored using the Hot tier. When the Archive tier comes to UK storage accounts we estimate around a 60% reduction in costs, as we will make use of the Cool and Archive tiers.
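For context, the storage volume behind that £18 estimate works out as follows (assuming every backup in the retention set is the full 5 GB):

```powershell
# Retention: 31 daily + 104 weekly + 96 monthly backups at ~5 GB each
$TotalBackups = 31 + 104 + 96              # 231 backups
$TotalGB      = $TotalBackups * 5          # 1155 GB retained in the Hot tier
Write-Output "$TotalBackups backups x 5 GB = $TotalGB GB"
```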
April 19, 2018 at 4:11 am
We do manual restore tests.
I wrote a script to randomly select the database and the point in time to restore. Otherwise we might select only the databases that are easy to restore, or only the most important databases.
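Not the actual script, but the random selection step could be as simple as something like this (the database list and time window here are placeholders):

```powershell
# Placeholder list - in practice you'd pull this from sys.databases
$Databases = @("Sales", "HR", "Warehouse", "Finance")
$TargetDb  = Get-Random -InputObject $Databases

# Pick a random point in time within the last 7 days of log coverage
$MinutesBack  = Get-Random -Minimum 1 -Maximum (7 * 24 * 60)
$RestorePoint = (Get-Date).AddMinutes(-$MinutesBack)

Write-Output "Restore test target: [$TargetDb] to point in time [$RestorePoint]"
```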
April 19, 2018 at 4:20 am
k.schouten - Thursday, April 19, 2018 4:11 AM: We do manual restore tests.
I wrote a script to randomly select the database and the point in time to restore. Otherwise we might select only the databases that are easy to restore, or only the most important databases.
We tend to restore all our databases on a daily basis to ensure all our daily backups for all our databases can be restored. I find a good old manual restore helps reassure you the automation is doing its job 🙂
April 19, 2018 at 5:58 am
hi,
Thank you very much for your reply. Not massively expensive, so it might be an attractive solution for my manager.
Best regards
Henrik
April 19, 2018 at 6:06 am
We are finding it very cost effective, and it will be even more so once the Archive tier comes into play. It's more cost effective than a NAS / file server, the backups are off site, and three copies are held locally in the Azure DC (Standard LRS).
April 19, 2018 at 9:08 am
We use Commvault. I have a few scripts that check the backupset table to make sure everything is backed up. We do test restores every so often, and Commvault verifies that the backups are good.
April 19, 2018 at 10:06 am
Thanks much for the script.
April 19, 2018 at 11:00 am
Do you check for corruption using DBCC CHECKDB?
April 19, 2018 at 11:03 am
Hi. Yes. We basically copy the latest backup from Azure storage, restore it, and run DBCC CHECKDB against it.
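A rough sketch of that pattern - the container, test server and path names are made up, and a real restore would usually also need WITH MOVE clauses for the data and log files:

```powershell
# Hypothetical sketch: download the newest full backup blob, restore it, then check it.
# Assumes a storage context like the $StorageContext built in the script earlier in this thread.
$Latest = Get-AzureStorageBlob -Container "myserver-full" -Context $StorageContext |
          Sort-Object LastModified -Descending | Select-Object -First 1
$null = Get-AzureStorageBlobContent -Blob $Latest.Name -Container "myserver-full" -Destination "E:\RestoreTest\" -Context $StorageContext -Force

Invoke-Sqlcmd -ServerInstance "TestServer" -Query @"
RESTORE DATABASE [CheckCopy] FROM DISK = N'E:\RestoreTest\$($Latest.Name)' WITH REPLACE;
DBCC CHECKDB ([CheckCopy]) WITH NO_INFOMSGS, ALL_ERRORMSGS;
"@
```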