January 25, 2021 at 12:25 pm
Over the weekend we copy our full backups to a backup server with a large EBS volume. Once that completes, a second job step remotely invokes a PowerShell script on the backup server that uses rclone to sync the backups to S3. Perhaps this is a daft idea, and I'd be more than willing to hear what other suggestions you have.
However, there are a few problems with this. Firstly, it takes so long, and at scale this isn't going to work. Secondly, when the PowerShell script runs on the backup server, if rclone stops working for whatever reason the SQL job just assumes it completed successfully. So my question is: is there a way for the backup job to be "aware" of whether the rclone S3 sync completed successfully, so that if it fails the job is marked as a failure and notifies accordingly?
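For reference, something like the sketch below is roughly what I'm picturing on the backup server - a wrapper that runs rclone and exits with rclone's exit code when it fails (the paths and the remote name here are just placeholders, not our real setup):

```powershell
# sync-to-s3.ps1 - wrapper that runs on the backup server (paths/remote are examples)

$source = 'D:\SQLBackups'                  # local backup folder (placeholder)
$dest   = 's3remote:my-backup-bucket/sql'  # rclone remote and bucket (placeholder)

# rclone returns a non-zero exit code when the sync fails
& rclone sync $source $dest --log-file 'D:\Logs\rclone.log'

if ($LASTEXITCODE -ne 0) {
    Write-Error "rclone sync failed with exit code $LASTEXITCODE"
    exit $LASTEXITCODE   # pass the failure on to whatever called this script
}

exit 0
```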
Any ideas or suggestions?
Thanks.
January 25, 2021 at 4:24 pm
You need a way to detect whether this worked and then send a failure code back to the SQL job. What I might also do is estimate how long the sync takes and then run a check afterwards to verify that the byte size in S3 matches what you have locally.
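A rough sketch of that size check, assuming a placeholder local path and rclone remote, and that `rclone size --json` reports the total bytes for the remote path:

```powershell
# Compare the total size of the local backup folder with what rclone sees in S3.
# 'D:\SQLBackups' and 's3remote:my-backup-bucket/sql' are example names.

$localBytes = (Get-ChildItem 'D:\SQLBackups' -Recurse -File |
               Measure-Object -Property Length -Sum).Sum

# 'rclone size --json' prints something like {"count":123,"bytes":456789}
$remote = & rclone size 's3remote:my-backup-bucket/sql' --json | ConvertFrom-Json

if ($remote.bytes -ne $localBytes) {
    Write-Error "Size mismatch: local $localBytes bytes, S3 $($remote.bytes) bytes"
    exit 1
}
```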
If rclone stops, can you detect that and restart it? It should pick up and continue the transfer. A check that runs a few hours later can then alert you if the process still hasn't finished.
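A simple retry loop around the sync is one way to do the restart (rclone also has its own --retries and --retries-sleep flags for retrying within a run); again, paths and the remote name are placeholders:

```powershell
# Retry the sync a few times if rclone dies or returns an error.
$maxAttempts = 3
$attempt = 0
do {
    $attempt++
    & rclone sync 'D:\SQLBackups' 's3remote:my-backup-bucket/sql' --log-file 'D:\Logs\rclone.log'
    if ($LASTEXITCODE -eq 0) { break }
    Write-Warning "rclone attempt $attempt failed with exit code $LASTEXITCODE; retrying..."
    Start-Sleep -Seconds 60
} while ($attempt -lt $maxAttempts)

if ($LASTEXITCODE -ne 0) {
    Write-Error "rclone still failing after $maxAttempts attempts"
    exit $LASTEXITCODE
}
```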
Ultimately the SQL job is synchronous, and you need to report the status of your PoSh script back to it.
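For example, the Agent job step (PowerShell type) could look something like this sketch, where the server name and script path are placeholders; bringing the remote exit code back and throwing on non-zero makes the step fail and fire the usual failure notification:

```powershell
# Runs inside the SQL Agent job step. Invoke-Command runs the sync script on the
# backup server and returns its exit code; throwing here fails the job step.
# 'BACKUPSRV01' and the script path are example names.

$remoteExit = Invoke-Command -ComputerName 'BACKUPSRV01' -ScriptBlock {
    & powershell.exe -NoProfile -File 'D:\Scripts\sync-to-s3.ps1'
    $LASTEXITCODE
}

if ($remoteExit -ne 0) {
    throw "S3 sync on BACKUPSRV01 failed with exit code $remoteExit"
}
```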
January 25, 2021 at 5:19 pm
That's what I was thinking... perhaps I need to dig into rclone a bit, suss out its error return codes, and make sure I'm getting those back on success or failure.