October 5, 2016 at 8:26 am
I am doing native SQL backups on a Windows 2012 server to an S3 bucket. The fulls (40 GB) never make it from the cache drive to the S3 bucket. What am I doing wrong?
October 31, 2016 at 8:05 pm
I think you'll need to provide more info. I assume you have a virtual machine in AWS that's running SQL Server. The version shouldn't matter, but are you backing up to a URL from SQL Server, do you have the S3 bucket mapped as a drive, or something else?
November 1, 2016 at 10:10 am
I have mapped the S3 URL as a drive.
November 4, 2016 at 10:27 am
I believe you have to back up to local storage on the EC2 instance and then upload the file to S3 using the multipart upload method linked below, if the file is going to be larger than 5 GB.
https://aws.amazon.com/blogs/aws/amazon-s3-multipart-upload/
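One thing worth noting: if I remember right, the Write-S3Object cmdlet in the AWS Tools for PowerShell handles the multipart mechanics for large files itself, so you don't have to hand-roll the parts. A minimal sketch, assuming the module is installed and credentials come from a stored profile or instance role (bucket and file names are just examples):
# Write-S3Object splits large files into multipart uploads automatically,
# so a 40 GB .bak does not need manual chunking. Names below are placeholders.
Import-Module AWSPowerShell
Write-S3Object -BucketName 'yourbucketnameinaws' -File 'D:\Backups\Full_40GB.bak' -Key 'Full_40GB.bak' -Region 'us-east-1'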
MCITP SQL 2005, MCSA SQL 2012
March 21, 2017 at 11:27 am
Hello rwyoung01,
You can now export a .bak file from an EC2 instance running SQL Server to S3:
http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/SQLServer.Procedural.Importing.html
But if you want to code everything yourself, check my GitHub: https://github.com/mariopoeta/backupsqlservertos3
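If it helps, the procedure in the first link boils down to a stored procedure that RDS provides. A rough sketch run through Invoke-Sqlcmd from the SqlServer module (instance endpoint, database, and bucket names are placeholders, and the native backup/restore option has to be enabled on the RDS instance):
# RDS-provided native backup call from the linked doc; all names below are placeholders
$query = @"
exec msdb.dbo.rds_backup_database
    @source_db_name           = 'MyDatabase',
    @s3_arn_to_backup_to      = 'arn:aws:s3:::yourbucketnameinaws/MyDatabase.bak',
    @overwrite_S3_backup_file = 1;
"@
Invoke-Sqlcmd -ServerInstance 'myinstance.xxxxxxxx.us-east-1.rds.amazonaws.com' -Query $query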
May 3, 2017 at 12:47 pm
I suggest using PowerShell when it comes to copying large files between your local environment and AWS. Use the Write-S3Object cmdlet to copy your .bak file after making the backup through SQL. You need to download the AWS Tools for PowerShell first.
# Upload to S3
$s3Bucket = 'yourbucketnameinaws'
$backupPath = 'D:\BKUP_TEST_AWS\'
$region = 'us-east-1'
$accessKey = 'writeyourkeyhere'
$secretKey = 'writeyoursecretekeyhere'
$fileName = 'TEST_backup.bak'
$filePath = Join-Path $backupPath $fileName
Write-S3Object -BucketName $s3Bucket -File $filePath -Key $fileName -Region $region -AccessKey $accessKey -SecretKey $secretKey
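For completeness, the backup itself can be taken in the same script just before the upload, for example with Invoke-Sqlcmd from the SqlServer module (server name, database, and path below are placeholders):
# Take the native backup first, then run the Write-S3Object line above
$backupQuery = @"
BACKUP DATABASE [TEST]
TO DISK = N'D:\BKUP_TEST_AWS\TEST_backup.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;
"@
# Long timeout so a large backup is not cut off by the default query timeout
Invoke-Sqlcmd -ServerInstance 'localhost' -Query $backupQuery -QueryTimeout 65535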
May 9, 2017 at 8:56 am
I have not tried it myself, but all of the syntax for backing up directly to an Azure blob looks like it should work with S3. It may be worth experimenting with it.
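For reference, the Azure pattern I mean looks roughly like this (SQL Server 2016+ using a shared access signature credential; storage account, container, and SAS token are placeholders, and whether any of it carries over to S3 is exactly what would need testing):
# Azure blob backup-to-URL sketch, run through Invoke-Sqlcmd; all names are placeholders
$tsql = @"
CREATE CREDENTIAL [https://yourstorageaccount.blob.core.windows.net/backups]
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = 'your-sas-token-here';
BACKUP DATABASE [TEST]
    TO URL = 'https://yourstorageaccount.blob.core.windows.net/backups/TEST_backup.bak'
    WITH COMPRESSION, CHECKSUM;
"@
Invoke-Sqlcmd -ServerInstance 'localhost' -Query $tsql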
Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.
When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara
May 9, 2017 at 10:26 am
EdVassie,
The script I posted works. I have uploaded about 750 GB of .bak files (the SQL database backs up to 12 .bak files) to S3 storage, using Start-Job with a script block to transfer the .bak files in parallel. After the files are loaded to S3 storage, I used CloudBerry Drive to map a drive directly to S3 storage and ran the restore command on the EC2 instance. We are implementing a DR plan to AWS.
I use SQL encryption when running the backup job locally so all files are encrypted, and I also use the -ServerSideEncryption AES256 parameter to enable server-side encryption in S3. You do not need to decrypt anything when you download the files again from S3 when server-side encryption is used.
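Roughly what the parallel upload part looks like, stripped down (bucket, path, and region are placeholders, and credentials are assumed to come from a stored profile or instance role):
# One background job per .bak file; AES256 asks S3 to encrypt each object at rest
$s3Bucket   = 'yourbucketnameinaws'
$backupPath = 'D:\BKUP_TEST_AWS\'
$region     = 'us-east-1'
$jobs = Get-ChildItem -Path $backupPath -Filter '*.bak' | ForEach-Object {
    Start-Job -ScriptBlock {
        param($file, $bucket, $region)
        Import-Module AWSPowerShell
        Write-S3Object -BucketName $bucket -File $file -Key (Split-Path $file -Leaf) -Region $region -ServerSideEncryption AES256
    } -ArgumentList $_.FullName, $s3Bucket, $region
}
# Wait for all uploads to finish and show the results
$jobs | Wait-Job | Receive-Job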