April 8, 2013 at 6:47 am
Rao.V (4/10/2012)
Hi all, I am using the idea of the script you mentioned to verify the status of backup files in different locations, same as RESTORE VERIFYONLY.
The file 'C:\Temp\Paths.txt' contains backup file locations for different servers. The code works perfectly when I use one path, but it fails when more than one path is specified.
Please help me with what modifications need to be made to the script in order to verify the backup file status for the different locations.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") ("servername")
$dbBackup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
$dbRestore = New-Object ("Microsoft.SqlServer.Management.Smo.Restore")
$a = Get-Content "C:\Temp\paths.txt"
foreach ($path in $a)
{
    $dbRestore.Devices.AddDevice($path, "File")
    if (!($dbRestore.SqlVerify($server)))
    {
        $body = $path
        $emailFrom = "email"
        $emailTo = "email"
        $subject = "Backup Files Health"
        $smtpServer = "serverip"
        $smtp = New-Object Net.Mail.SmtpClient($smtpServer)
        $smtp.Send($emailFrom, $emailTo, $subject, $body)
        Exit
    }
}
Thanks
Vinay
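A likely cause, sketched here as an editor's note: the SMO Restore object keeps every device ever added to it, so on the second loop iteration SqlVerify tries to treat both files as a single striped media set and fails. Creating a fresh Restore object (or clearing the device list) for each path avoids that. A minimal sketch, reusing the same placeholder server and SMTP values as the original script:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$server = New-Object Microsoft.SqlServer.Management.Smo.Server "servername"

foreach ($path in (Get-Content "C:\Temp\Paths.txt"))
{
    # A fresh Restore per path keeps the device list from accumulating;
    # otherwise SqlVerify sees all previously added files as one striped set.
    $dbRestore = New-Object Microsoft.SqlServer.Management.Smo.Restore
    $dbRestore.Devices.AddDevice($path, "File")

    if (-not $dbRestore.SqlVerify($server))
    {
        $smtp = New-Object Net.Mail.SmtpClient "serverip"
        $smtp.Send("email", "email", "Backup Files Health", "Verify failed: $path")
        # Continue checking the remaining paths rather than exiting on the first failure.
    }
}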
gsc_dba
May 17, 2013 at 9:12 am
KenpoDBA: I have a PoSH script that loads a considerable amount of data into a database in different stages. It would be great to include a backup (with CopyOnly = 1) prior to the load, and after each stage, in case there were issues with the load.
I would not necessarily need to create a copy of the backup, but if there were issues, I would have a backup to restore to the original state. Can't do that using SQL Agent.
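For what it's worth, a copy-only backup between load stages is a short SMO call. A sketch, with hypothetical server, database, and path names:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

$server = New-Object Microsoft.SqlServer.Management.Smo.Server "servername"   # hypothetical
$backup = New-Object Microsoft.SqlServer.Management.Smo.Backup
$backup.Action   = "Database"
$backup.Database = "MyDb"                                                     # hypothetical
$backup.CopyOnly = $true      # copy-only, so the normal backup chain is not disturbed
$backup.Devices.AddDevice("D:\Backups\MyDb_stage1.bak", "File")               # hypothetical path
$backup.SqlBackup($server)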
May 17, 2013 at 9:34 am
Weird as it may seem, I am relatively new to PowerShell. I notice in the script you declared a variable that appends the current date to the backup file name. If so, how do you delete the file using the name "dBName.bak"?
Br. Kenneth Igiri
https://kennethigiri.com
All nations come to my light, all kings to the brightness of my rising
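Since the script appends a date, there is no single fixed "dbName.bak" file to delete; the usual answer is a wildcard match on the stable prefix of the name. A sketch, assuming a hypothetical folder and naming pattern such as MyDb_20130517.bak:

# Hypothetical folder and file pattern; adjust to match the script's naming.
Get-ChildItem -Path "D:\Backups" -Filter "MyDb_*.bak" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |   # e.g., keep one week
    Remove-Item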
May 17, 2013 at 10:15 am
I would still prefer the job being scheduled in the agent. This avoids putting all your eggs in one basket by counting on a single scheduler to be maintained correctly. (I don't count on our SAN to not be down for 3 days. I don't count on our enterprise scheduler not stopping because it thinks the license expired.) There is already a good solution available that handles most maintenance: Ola's solution provides log files with details, and it has options to clean up old log files and old backups. It should normally run without issue.
If there is no DBA to monitor the maintenance, then perhaps send a few alerts to a non-DBA so that they can let management know they need to find a DBA to fix it.
If somebody needs a simple button on the desktop, then perhaps a PSH script can pull the maintenance history, check file space, start jobs, copy specific backup files, etc.
I have been thinking about using PSH to start jobs because we have been using several instances on some beefy hardware. It would be nice to be able to run related jobs (e.g., backup) in order across instances with one start. (I manually schedule them to not overlap now.)
Even if a PSH script were complex enough to replace Ola's solution, I would still want to run it in the agent, perhaps as a PSH job step. Perhaps it could be instance-aware to allow redundancy or complex scheduling? (I was thinking of creating a job step using PSH to detect and wait if the same job is running on another instance of the same server. That would be easier than replacing Ola's solution.)
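Starting the same agent job across several instances from one PSH invocation is mostly a loop over SMO JobServer objects. A sketch, with hypothetical instance and job names:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null

# Hypothetical instance list and job name.
$instances = "BIGBOX\INST1", "BIGBOX\INST2"
foreach ($i in $instances)
{
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server $i
    $job = $srv.JobServer.Jobs["DatabaseBackup - FULL"]
    if ($job) { $job.Start() }    # Job.Start() queues the job asynchronously
}

To run them strictly in order rather than in parallel, the loop would also need to poll each job's status before moving to the next instance.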
RandyHelpdesk: Perhaps Im not the only one that does not know what you are doing. 😉
May 17, 2013 at 2:43 pm
Whether or not PS is the best approach is definitely debatable. However, I appreciate the way the article was presented in an easy-to-understand manner. I may never need it, but I feel very confident that in the event I do, I will be able to handle it.
Great job!
May 17, 2013 at 6:11 pm
dbishop (5/17/2013)
KenpoDBA: I have a PoSH script that loads a considerable amount of data into a database in different stages. It would be great to include a backup (with CopyOnly = 1) prior to the load, and after each stage, in case there were issues with the load. I would not necessarily need to create a copy of the backup, but if there were issues, I would have a backup to restore to the original state. Can't do that using SQL Agent.
Why is it necessary to load the data using PowerShell to begin with? What's so special about the data that would require such a thing? Not a challenge. Just a curiosity.
Also, even if PowerShell is required, it may be quite easy to do interim backups with SQL Agent. I'd have to know more about the overall process to say that definitively, but a lot of people overlook some of the power built into SQL Agent.
As a sidebar, what size is your database? The ones I'm working with (~200GB) aren't huge, but I can guarantee that I only have room to make 2 copies in the backup staging disks. If I had to do interim backups, they'd most likely be point-in-time log backups, which would also keep the log file from freaking out during the multiple loads.
--Jeff Moden
Change is inevitable... Change for the better is not.
May 17, 2013 at 6:33 pm
Rao.V (4/10/2012)
Hi all, I am using the idea of the script you mentioned to verify the status of backup files in different locations, same as RESTORE VERIFYONLY.
I realize that this post is over a year old but I have to ask... Why are you building PowerShell to do the same thing as functionality built in to SQL Server?
--Jeff Moden
Change is inevitable... Change for the better is not.
May 17, 2013 at 6:43 pm
freakypriest (1/24/2011)
We don't like turning on xp_cmdshell on all our servers as this is a security problem.
Actually, it's only your perception that makes you think it's a security problem. The only people that can turn it on are the very same people that can use it. Except for a 3ms or so delay that an attacker's software would experience, there is no substantial extra security realized by having xp_CmdShell turned off.
Again, I realize the post that I'm responding to is more than a year old, but I wanted to explain that thinking xp_CmdShell is a security problem is an emotional and preferential response rather than anything substantial. Anyone who can get in with "SA" privs can turn it on, use it against you, turn it off, and delete the minuscule logging of the event before you even realize that someone hacked you.
The key to security has nothing to do with disabling xp_CmdShell. The key to security is to keep unwanted people out of your system as "SA" and to limit what the SQL Server service and SQL Agent can do if you do get hacked. In fact, because a lot of people believe that disabling xp_CmdShell will somehow increase their security, they may become lax in the security work they should be doing.
For the record, I leave xp_CmdShell enabled for my DBAs and then I make sure I take care of the real security problems. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
May 17, 2013 at 6:45 pm
sqlchaser (5/17/2013)
Whether or not PS is the best approach is definitely debatable. However, I appreciate the way the article was presented in an easy-to-understand manner. I may never need it, but I feel very confident that in the event I do, I will be able to handle it. Great job!
+1 to that!
--Jeff Moden
Change is inevitable... Change for the better is not.
May 19, 2013 at 1:38 pm
Excellent straightforward article. Regarding how and whether to schedule it, I won't say. But this is a great article to help a person discover clever ways of using PowerShell alongside SQL Server.
Appreciate the article. Thank you!