July 28, 2016 at 11:18 am
We have a bucket in AWS S3 where backups from production are being copied to.
My task is to copy the most recent backup file from AWS S3 to the local sandbox SQL Server, then do the restore.
I have installed all of the AWS Tools for Windows on the local server, and the credentials to connect to AWS S3 work.
My local server can list all of the files in the S3 bucket. I can successfully download a single file if I specifically name that file.
Here is an example of that working, pulling the most recent copy from July 25, 2016:
aws s3 cp s3://mybucket/databasefile_20160725.zip E:\DBA
My goal is to have a copy script that only pulls the most recent file, which I won't know the name of.
Nothing I google or try is getting me the correct syntax to do this.
Any help would be appreciated. Thanks.
September 8, 2016 at 4:07 am
Using only the CLI this is pretty difficult to do, but with the AWS Python SDK or PowerShell it's much easier. Here is one way to do it in Python:
import os
from boto.s3.connection import S3Connection

# Load the keys into the environment so the boto connection can pick them up
os.environ['AWS_ACCESS_KEY_ID'] = "XXXXXXXXXXXXX"
os.environ['AWS_SECRET_ACCESS_KEY'] = "XXXXXXXXXXXX"

# Name of the bucket to pull from
s3bucket = "mybucket"

# Connect to the bucket
conn = S3Connection()
bucket = conn.lookup(s3bucket.lower())

# Get the contents of the bucket as (last_modified, key) pairs
l = [(k.last_modified, k) for k in bucket]

# Sort by the last-modified timestamp and take the newest key
key = sorted(l, key=lambda x: x[0])[-1][1]

# Download the latest modified file
print 'Downloading %s' % key.name
with open(key.name, 'wb') as f:
    key.get_contents_to_file(f)
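If you are on the newer boto3 SDK instead of the old boto package, the same idea is a little shorter. This is just a rough sketch, assuming the same bucket name as above and the E:\DBA download folder from the original post:

import os
import boto3

# Uses credentials from the environment or the AWS credential file
s3 = boto3.client('s3')

bucket_name = 'mybucket'   # same bucket as above
local_dir = r'E:\DBA'      # download folder from the original post

# List the objects in the bucket and keep the one with the newest LastModified
objects = s3.list_objects_v2(Bucket=bucket_name).get('Contents', [])
latest = max(objects, key=lambda obj: obj['LastModified'])

# Download it, keeping the original file name
local_path = os.path.join(local_dir, os.path.basename(latest['Key']))
print('Downloading %s to %s' % (latest['Key'], local_path))
s3.download_file(bucket_name, latest['Key'], local_path)

Note that list_objects_v2 only returns up to 1,000 keys per call, so if the bucket holds more than that you would want to use a paginator.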
MCITP SQL 2005, MCSA SQL 2012
September 8, 2016 at 8:35 am
Yep, I would go with the Python option. Saved as a script, it can be executed by anything.
May 3, 2017 at 12:57 pm
Even though this post is really old, I would like to add that I use PowerShell to copy files between AWS S3 and my local machine. Here is an example:
# Connection settings
$accessKey = 'keyinfo'
$secretKey = 'secretkey'
$region    = 'us-east-1'
$bucket    = 'bucketname'
$keyPrefix = '/foldernameinAWS/'
$localPath = 'C:\SQLBackups\'

# Grab the most recently modified object under the prefix
$objects = Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix -AccessKey $accessKey -SecretKey $secretKey -Region $region |
    Sort-Object LastModified -Descending |
    Select-Object -First 1 |
    Select-Object Key

foreach ($object in $objects) {
    # Strip the prefix so the local file keeps just the file name
    $localFileName = $object.Key -replace $keyPrefix, ''
    if ($localFileName -ne '') {
        $localFilePath = Join-Path $localPath $localFileName
        Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -AccessKey $accessKey -SecretKey $secretKey -Region $region
    }
}