January 4, 2011 at 9:27 pm
Comments posted to this topic are about the item Powershell Database Backup Script
January 5, 2011 at 3:35 am
Great as (for me) an introduction to the world of PowerShell. In your text I think you need to visually mark the values that need changing, e.g. "mailserver" should be "<mailserver>" so it stands out.
January 5, 2011 at 5:46 am
Please explain the variable $location to me. It is used in the line $webclient.UploadFile($uri, $location) to send the file to an FTP server, but $location is never initialized or populated.
Alberto
------
Alberto De Rossi
Microsoft Certified Solutions Associate - SQL Server
Microsoft MVP - Data Platform
Power BI User Group Lima - Community Leader
January 5, 2011 at 6:49 am
Jimmy,
Have you found a method to report percent complete from a PowerShell script? We use PS to run ad hoc backups, but the most frustrating part is that there is no progress meter or percentage complete.
Regards, Irish
January 5, 2011 at 7:54 am
I can't think of any real honest business case where taking backups outside of native SQL is necessary. I hate it when people choose to show the coolness of powershell by showing the most bloated method for backing up DBs on the planet. What DBA in their right mind would choose to take backups and put them in the windows scheduler? You don't get history and it's harder to tell when the job is running and get stats on it.
If you have to run backups in powershell (again, WHY?), then at least call the script from an agent job so you can at least get some history and other job stats.
This is a case of using powershell just because you want to use it, and not because it's the best tool for the job.
Watch my free SQL Server Tutorials at:
http://MidnightDBA.com
Blog Author of:
DBA Rant – http://www.MidnightDBA.com/DBARant
January 5, 2011 at 8:07 am
Let me add that though the author notes that more complex scenarios may result in even more complex scripts, this script is already very complex when compared to using the SQL Server Management Studio tools for building a backup plan, let alone a full maintenance plan.
I'm prejudiced against this approach - it was fine when there were no alternatives to scripting - but doing this today seems more a case of showing off scripting chops than actually using one's time productively.
While I have issues with Microsoft's increased use of CLIs and scripting, they've generally gone about it in a way to which I subscribe: Build scripts using a GUI so they can be generally error free while enabling access to the code for replication (though I still think doing it over in the GUI would be as fast or faster) and possible advanced automation.
But that's just me.
January 5, 2011 at 8:19 am
KenpoDBA (1/5/2011)
I can't think of any real honest business case where taking backups outside of native SQL is necessary. I hate it when people choose to show the coolness of powershell by showing the most bloated method for backing up DBs on the planet. What DBA in their right mind would choose to take backups and put them in the windows scheduler? You don't get history and it's harder to tell when the job is running and get stats on it. If you have to run backups in powershell (again, WHY?), then at least call the script from an agent job so you can at least get some history and other job stats.
This is a case of using powershell just because you want to use it, and not because it's the best tool for the job.
Kenpo,
I agree, but I can present a case where it might make sense.
Assume for just a minute that you are a consultant DBA and the request is for you to ensure that a backup can be taken by any unskilled schmo (read as computer operator).
We could create an Agent Job that could be executed from SSMS by said operator, but let's say their skill set is not at a level where they can retain how this is completed. However, they are smart enough to double click on an icon on their desktop or the Server desktop that will complete the operation for them, passing all of the parameters, making sure that the previous backup is eliminated, someone is e-mailed that the backup is complete, and providing redundancy of the created file with little or no thought.
That's kind of where I am at. I have run into a number of folks who have had the title of DBA foisted upon them when they have no business looking at an RDBMS. In order for me to ensure that there is some level of recoverability, the KISS principle applies. "If you need a backup outside of the schedule, just click here."
SQL Server Native backups are simple to implement and manage and this is by design. However, the unskilled, simpleminded user can still wreak havoc by completing the process incorrectly. :w00t:
Regards, Irish
January 5, 2011 at 8:28 am
Thanks Jeffrey...
however in the scenario you described, doing a native SQL backup and putting it into an agent job is still the best solution. All you have to do now is call the job from a PS script. This way the DBAs can support the job from their end, and the OPs guys can support it from their end. And you still get all the benefits from having it in SQL.
Just because a process can use PS doesn't mean the entire solution has to be written in PS. You can use it just as a step in the process.
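For what it's worth, kicking off an existing Agent job from PowerShell really is a one-liner. A minimal sketch, assuming the SqlServer module's Invoke-Sqlcmd is available and an Agent job named "NightlyBackup" (a hypothetical name) already exists:

```powershell
# Start an existing SQL Agent job from PowerShell.
# Server name and job name below are placeholders.
Invoke-Sqlcmd -ServerInstance "MYSERVER" -Database "msdb" `
    -Query "EXEC dbo.sp_start_job @job_name = N'NightlyBackup';"
```

The backup logic, history, and retries stay inside the Agent job; the script is only the trigger.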
Watch my free SQL Server Tutorials at:
http://MidnightDBA.com
Blog Author of:
DBA Rant – http://www.MidnightDBA.com/DBARant
January 5, 2011 at 8:53 am
Nice article and an interesting read. As an intro to powershell I can see the use for it. As Sean has stated, though, I would rather the jobs to back up the databases be run through SQL Agent at a minimum. Knowing the history of the backup jobs is essential. Being able to see backup size from a central location is also nice to have for trending purposes.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
January 5, 2011 at 8:57 am
I would also agree with KenpoDBA, but am very interested in understanding how PS could otherwise be leveraged for DBA work.
BTW, I am always nervous about deleting the previous day's backup until I know my current backup is good, assuming there is no HD space constraint in place. Just a thought...
January 5, 2011 at 9:04 am
If you guys are interested, I have tons of powershell videos on my site for free. Everything from beginning, to some pretty cool task-based stuff.
Watch my free SQL Server Tutorials at:
http://MidnightDBA.com
Blog Author of:
DBA Rant – http://www.MidnightDBA.com/DBARant
January 5, 2011 at 10:10 am
I could see using this script on a SQL Express installation without SQL Agent installed. Windows Scheduled Tasks would be used for scheduling it.
I am not sure how I would write a SQL script to delete the old backup without using xp_cmdshell. So if I need to use DOS commands to delete the old backup, why not use PowerShell for the file deletes and backups?
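As a sketch of that delete step in PowerShell, assuming the backups live in C:\Backups (a hypothetical path) and anything older than two days can go:

```powershell
# Remove .bak files older than 2 days; path and retention are placeholders.
$cutoff = (Get-Date).AddDays(-2)
Get-ChildItem -Path "C:\Backups" -Filter *.bak |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -WhatIf   # drop -WhatIf once the preview output looks right
```

No xp_cmdshell needed, and -WhatIf gives a dry run before anything is actually deleted.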
Thanks Jimmy for the article
David Bird
January 5, 2011 at 10:42 am
Yeah, in the limited scope of having SQL Express I can see how you might initially want to do this, but it's still not the best option. Even assuming you decided to use PS to run the backup, you're still better off calling a backup SP from invoke-sqlcmd instead of the overly complicated SMO method.
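A minimal sketch of that simpler approach, assuming Invoke-Sqlcmd is available; the instance name, database, and disk path are placeholders:

```powershell
# Plain T-SQL backup via Invoke-Sqlcmd instead of SMO objects.
# All names below are hypothetical examples.
Invoke-Sqlcmd -ServerInstance ".\SQLEXPRESS" -QueryTimeout 0 -Query @"
BACKUP DATABASE [MyDb]
TO DISK = N'C:\Backups\MyDb.bak'
WITH INIT, CHECKSUM;
"@
```

One BACKUP DATABASE statement replaces the whole SMO object graph, and -QueryTimeout 0 keeps a long backup from being cut off.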
Also, I don't really want to put that much effort into backing up just a couple express installs. Most companies aren't overrun with them so they're the exception and I'm not going to hinge my entire backup scenario on the LCD.
But if I were faced with this issue, I'd still code it in Agent, I would just run it from another box. Or I would schedule it in an SSIS pkg from another box. Make something you can gather history for the standard. Keep SQL where SQL belongs.
And as for cmdshell... I talk about this all the time. It's not evil, and it's not even unsafe. You just have to lock it down and make sure that it's being used well. So what's wrong with using cmdshell for deleting your backup files? That's what I do. Cmdshell is no more dangerous than having an sa acct. You protect your sa, so protect your cmdshell and use it wisely. But it's not inherently evil or bad or even ill-advised. It just got a bad rep there for a while cause it came unprotected out of the box and got exploited. Now it's locked down by default, so open it up only to the DBAs and start using it again.
Watch my free SQL Server Tutorials at:
http://MidnightDBA.com
Blog Author of:
DBA Rant – http://www.MidnightDBA.com/DBARant
January 5, 2011 at 2:38 pm
If you coded this as part of a WinForm, there is a progress bar:
Add-Type -AssemblyName System.Windows.Forms
$Form = New-Object System.Windows.Forms.Form
$progress = New-Object System.Windows.Forms.ProgressBar
$Form.Controls.Add($progress)
This is the only way I can think of.
cheers
January 5, 2011 at 2:49 pm
I'm sorry, I missed this out.
The $location variable is a reference to the location of the file on the server. It could be:
$location = "C:\Data\dbName.bak" to upload the daily backup file
or
$location = "C:\Archives\" + $b to upload the archived zip of the backup file
thank you