Hello,
We have a requirement to create separate backup folders (Full, Diff, and Log) using the Ola Hallengren script, rather than a separate folder per database.
Once the backups complete, we move only the full backups from the SQL Server to a centralized backup server using ROBOCOPY.
Can anyone please assist with this request?
April 27, 2021 at 10:49 pm
Yes, but not the way you think. It's a mistake to move only the FULL backups. You have no chance of restoring to a given time without the log files. None.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 28, 2021 at 12:01 am
(Maybe the OP should watch Grant Fritchey's videos on backup and restore on Youtube).
Seriously, though... if you don't understand how backups and restores and log files are related, you're in for a world of trouble. You'll think you have a usable backup, but if you can't apply the correct log files to your restored DB, you're going to lose all the modifications made since the last full backup. Grant's videos on the subject are great.
April 28, 2021 at 12:04 am
We already keep one month of backups on the drive, and anything older than that should transfer to the shared backup network; that is the client's requirement.
EasyBoy wrote:We already keep 1 month of backup on drive and anything above that should transfer to shared backup network and that is the requirement of client.
Does that one month of backup include transaction log backups?
On your original question, do 3 different runs with Ola's code for the 3 separate folders (which I don't recommend... it's better to have one folder per database, IMHO). As for the ROBOCOPY thing, I'd call it from xp_CmdShell but most folks believe the myth of it being a high security risk so a batch job with (ugh!) a Windows Scheduled Task might do the trick there.
I'm still concerned for the client because of not copying the tlog backups with the FULLs.
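For what it's worth, here's a minimal sketch of the xp_CmdShell approach; the drive letters, folder layout, and share name are all placeholders you'd replace with your own:

```sql
-- Enable xp_CmdShell if it isn't already (requires sysadmin).
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;

-- Copy the FULL backup folder to the central share.
-- /S copies subfolders; /R:2 /W:5 limits retries; /NP suppresses the progress spam.
-- Deliberately NOT using /MIR: since the source only retains one month,
-- mirroring would delete the older backups the client wants kept on the share.
DECLARE @cmd varchar(8000) =
    'ROBOCOPY "D:\Backup\SERVER$INSTANCE\FULL" "\\BackupServer\SQLBackups\SERVER$INSTANCE\FULL" *.bak /S /R:2 /W:5 /NP';
EXEC xp_cmdshell @cmd;
```

Note that ROBOCOPY exit codes 0 through 7 all indicate success (8 and above are failures), so check the return value accordingly if you wrap this in error handling.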
--Jeff Moden
Change is inevitable... Change for the better is not.
April 28, 2021 at 3:15 am
Yes, the one month of backups includes full, diff, and transaction log. I had the same concern, and the first thing I asked the client was: why only the fulls?
April 28, 2021 at 8:13 pm
EasyBoy wrote:We already keep 1 month of backup on drive and anything above that should transfer to shared backup network and that is the requirement of client.
Does that one month of backup include transaction log backups?
On your original question, do 3 different runs with Ola's code for the 3 separate folders (which I don't recommend... it's better to have one folder per database, IMHO). As for the ROBOCOPY thing, I'd call it from xp_CmdShell but most folks believe the myth of it being a high security risk so a batch job with (ugh!) a Windows Scheduled Task might do the trick there.
I'm still concerned for the client because of not copying the tlog backups with the FULLs.
You don't need xp_cmdshell - or Windows Task Scheduler. SQL Server Agent can use either the operating system (batch) subsystem or Powershell subsystem for a job step. You can even create proxies for each subsystem to define specific permissions and access.
Not against using xp_cmdshell - but showing there are alternatives available.
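As a sketch of what that looks like (the job name, proxy name, and paths below are placeholders):

```sql
-- Add a CmdExec step to an existing Agent job.
EXEC msdb.dbo.sp_add_jobstep
    @job_name   = N'Copy FULL backups',
    @step_name  = N'Robocopy FULLs to central share',
    @subsystem  = N'CmdExec',
    @command    = N'ROBOCOPY "D:\Backup\FULL" "\\BackupServer\SQLBackups\FULL" *.bak /S /R:2 /W:5 /NP',
    @proxy_name = N'FileCopyProxy';  -- optional: a proxy mapped to a credential with write access to the share
```

The proxy is what lets the step run under an account with rights to the network share instead of the Agent service account.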
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
April 29, 2021 at 5:54 pm
Update:
Well, I have updated the Ola script's @DirectoryStructure parameter as follows to fulfil the requirement.
@DirectoryStructure nvarchar(max) = '{ServerName}${InstanceName}{DirectorySeparator}{BackupType}{DirectorySeparator}{DatabaseName}_{Partial}_{CopyOnly}',
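For anyone else following along: instead of editing the default in the script, the same token string can be passed per call to DatabaseBackup. The database selection and retention below are placeholders:

```sql
EXEC dbo.DatabaseBackup
    @Databases   = 'USER_DATABASES',
    @Directory   = 'D:\Backup',
    @BackupType  = 'FULL',
    @Verify      = 'Y',
    @CleanupTime = 720,  -- hours; roughly the one-month retention discussed above
    @DirectoryStructure = '{ServerName}${InstanceName}{DirectorySeparator}{BackupType}{DirectorySeparator}{DatabaseName}_{Partial}_{CopyOnly}';
```

This produces Server$Instance\FULL (or DIFF/LOG) folders with the database name in the file path, which is the layout the original question asked for.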
April 30, 2021 at 12:28 am
Jeff Moden wrote:EasyBoy wrote:We already keep 1 month of backup on drive and anything above that should transfer to shared backup network and that is the requirement of client.
Does that one month of backup include transaction log backups?
On your original question, do 3 different runs with Ola's code for the 3 separate folders (which I don't recommend... it's better to have one folder per database, IMHO). As for the ROBOCOPY thing, I'd call it from xp_CmdShell but most folks believe the myth of it being a high security risk so a batch job with (ugh!) a Windows Scheduled Task might do the trick there.
I'm still concerned for the client because of not copying the tlog backups with the FULLs.
You don't need xp_cmdshell - or Windows Task Scheduler. SQL Server Agent can use either the operating system (batch) subsystem or Powershell subsystem for a job step. You can even create proxies for each subsystem to define specific permissions and access.
Not against using xp_cmdshell - but showing there are alternatives available.
True dat! Things like a Cmd Task, etc, work fine... well, kind of. With the understanding that I've not recently spent much time (I'm thinking precisely zero in the last decade) trying to figure out how to do it, how do you pass parameters between job steps in the same job without having to save to a permanent table in a database somewhere? How do you process the resulting output for a CmdExec task when you do a simple thing like a bare-bones DIR command to get a list of file names?
And no... I'm not being sarcastic or ironic here. I don't know of an easy way to pass results of one job step to another... not even a parameter. If you know, I'm all ears because then I could actually make alternative recommendations to the use of xp_CmdShell.
Heh.. and shifting gears a bit, I don't respond very well to suggestions of using SSIS. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
April 30, 2021 at 1:43 am
Jeffrey Williams wrote:Jeff Moden wrote:EasyBoy wrote:We already keep 1 month of backup on drive and anything above that should transfer to shared backup network and that is the requirement of client.
Does that one month of backup include transaction log backups?
On your original question, do 3 different runs with Ola's code for the 3 separate folders (which I don't recommend... it's better to have one folder per database, IMHO). As for the ROBOCOPY thing, I'd call it from xp_CmdShell but most folks believe the myth of it being a high security risk so a batch job with (ugh!) a Windows Scheduled Task might do the trick there.
I'm still concerned for the client because of not copying the tlog backups with the FULLs.
You don't need xp_cmdshell - or Windows Task Scheduler. SQL Server Agent can use either the operating system (batch) subsystem or Powershell subsystem for a job step. You can even create proxies for each subsystem to define specific permissions and access.
Not against using xp_cmdshell - but showing there are alternatives available.
True dat! Things like a Cmd Task, etc, work fine... well, kind of. With the understanding that I've not recently spent much time (I'm thinking precisely zero in the last decade) trying to figure out how to do it, how do you pass parameters between job steps in the same job without having to save to a permanent table in a database somewhere? How do you process the resulting output for a CmdExec task when you do a simple thing like a bare-bones DIR command to get a list of file names?
And no... I'm not being sarcastic or ironic here. I don't know of an easy way to pass results of one job step to another... not even a parameter. If you know, I'm all ears because then I could actually make alternative recommendations to the use of xp_CmdShell.
Heh.. and shifting gears a bit, I don't respond very well to suggestions of using SSIS. 😀
You have 512 bytes (or less, depending) you can use to rather easily pass stuff from step to step. If that's not enough, you can use chunks of 7500 bytes each, but it's more code to pull the value(s).
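For the 512-byte version, a rough sketch (the job name and value are placeholders): the description column of msdb.dbo.sysjobs is nvarchar(512), so one step can write a value there and a later step can read it back.

```sql
-- Step 1: stash a value in the job's description.
EXEC msdb.dbo.sp_update_job
    @job_name    = N'Copy FULL backups',
    @description = N'LastCopiedFile=Backup_20210430.bak';

-- Step 2 (a later step in the same job): read it back.
DECLARE @val nvarchar(512);
SELECT @val = description
FROM msdb.dbo.sysjobs
WHERE name = N'Copy FULL backups';
```

The larger chunks come from extended properties, which hold more but take more code to read and write.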
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
April 30, 2021 at 2:37 am
You have 512 bytes (or less, depending) you can use to rather easily pass stuff from step to step. If that's not enough, you can use chunks of 7500 bytes each, but it's more code to pull the value(s).
Where in the steps? How is this done? Like I said, I've not looked into this in at least a decade. A link on how to do this would be especially helpful (although I'll Google it as well).
--Jeff Moden
Change is inevitable... Change for the better is not.
April 30, 2021 at 2:47 am
Heh... lordy... never mind. Nothing has changed in the last decade. Jeffrey Yao has an article on the subject. These aren't what I call "easy" methods and some of them don't allow concurrency.
https://www.mssqltips.com/sqlservertip/5731/how-to-pass-data-between-sql-server-agent-job-steps/
I'll stick with stored procedures and calls to xp_CmdShell. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
April 30, 2021 at 2:51 am
Heh... lordy... never mind. Nothing has changed in the last decade. Jeffrey Yao has an article on the subject. These aren't what I call "easy" methods and some of them don't allow concurrency.
https://www.mssqltips.com/sqlservertip/5731/how-to-pass-data-between-sql-server-agent-job-steps/
I'll stick with stored procedures and calls to xp_CmdShell. 😀
Lol. I didn't know that. I figured out using description and the extended properties on my own. I would never use the error log for that, wow, bizarre.
You don't need concurrency -- a given SQL Agent job can only run one instance at a time, i.e., there are never two copies of the same job running at the same time in the same instance, at least IIRC.
Changing the description is easy. The other not so much, but I still prefer it to writing data permanently to an actual table. And personally I would never use xp_cmdshell just to do that.
Edit: added word never to correct the meaning of the sentence.
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
April 30, 2021 at 11:23 am
Yes... I'm well aware (and very thankful) that a given job can only be run once at a time. The concurrency I'm talking about is when you have more than one job that does the same thing across multiple file loads. It's easier to avoid "concurrent stuff" in stored procs because of the use of temp tables, instead of having to hack naming differences into permanent tables (and other tricks), especially for staging tables, just because you're running stuff in parallel.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 30, 2021 at 2:05 pm
Yes... I'm well aware (and very thankful) that a given job can only be run once at a time. The concurrency I'm talking about is when you have more than 1 job that does the same thing across multiple file loads. It's easier to avoid "concurrent stuff" in stored procs because of the use of temp tables instead of having to do naming differences of permanent tables (and other hacks) especially as staging tables just because you're running stuff in parallel.
Hmm, I still don't see the issue with concurrency. I'm probably overlooking something, but as long as you name the extended property starting with the job name, natch, what's the issue with concurrency again?
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".