February 22, 2010 at 2:35 pm
jpSQLDude (2/22/2010)
No concern about security? The ILOVEYOU virus came via .vbs -- no one ever turns that off on your production servers?
The ILOVEYOU virus did not work by exploiting security holes in VBScript. In fact, I know of NO virus/trojan/etc. that ever has. That's because although they may run in VBScript, that's not where their security "exploit" is. In the case of the ILOVEYOU virus the security exploit was in MS Outlook, not in VBScript, and MS fixed that in Outlook long ago.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
February 22, 2010 at 2:53 pm
RBarryYoung (2/22/2010)
This is a SQL 2008 forum, NOT SQL 2000. Please clarify what version of SQL Server you have. Also, DTS is pretty awful; the only reason to use it is if you have nothing else available.
I have SQL 7, 2000, 2005 and 2008 servers in production that I must back up. I'm not sure which forum is the appropriate one to post in when the question applies to all of them.
Also, as I see it, I have three options for backups:
1. Come up with TWO systems, one for versions 7 and 2000, and a different one for 2005 and 2008 (just use SSIS). I don't like this because now I have to develop and support two different systems.
2. Use a 3rd party tool. I feel that SQL backups are already ridiculously complicated, and overlaying another tool adds significantly to the complexity.
3. Develop ONE system that works with all databases. Using the tools built into SQL and Windows certainly isn't perfect, but it allows me to build one system that works on all OS versions and all SQL versions, doesn't cost anything additional, and adds some complexity through batch files or VBScript, but not on the level of an entirely separate 3rd party tool.
These are just my thoughts; if anyone has a better way of thinking, I am all ears! (Doing FOR IN DO loops in command line batch files is a major PITA!)
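(Just to show what I mean -- and these paths are made-up placeholders, not my real layout -- even a simple sweep of a backup folder in a batch file looks like this:)

rem Hypothetical illustration only -- paths are placeholders
rem Plain FOR over a file glob:
for %%F in (D:\SQLBackups\*.bak D:\SQLBackups\*.trn) do echo Found backup file %%~nxF
rem Parsing the output of another command needs the even clunkier FOR /F form:
for /f "delims=" %%F in ('dir /b D:\SQLBackups\*.trn') do echo Found log backup %%F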
February 22, 2010 at 3:03 pm
jpSQLDude (2/22/2010)
RBarryYoung (2/22/2010)
This is a SQL 2008 forum, NOT SQL 2000. Please clarify what version of SQL Server you have. Also, DTS is pretty awful; the only reason to use it is if you have nothing else available.
I have SQL 7, 2000, 2005 and 2008 servers in production that I must back up. I'm not sure which forum is the appropriate one to post in when the question applies to all of them.
If you post in the 2008 forums you will get 2008 answers. If you post in the SQL 2000 forums, you will get SQL 2000 answers. It seems pretty clear, then, that you should post under the lowest common denominator if you want one solution for three different platforms.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
February 22, 2010 at 3:30 pm
jpSQLDude (2/22/2010)
Also, as I see it, I have three options for backups: 1. Come up with TWO systems, one for versions 7 and 2000, and a different one for 2005 and 2008 (just use SSIS). I don't like this because now I have to develop and support two different systems.
Personally, I don't see a problem with this option. Once you have it implemented for the SQL 7 and 2000 systems it's going to be fairly stable. In fact, I would hazard a guess that you probably won't touch those systems again until you move them to 2005/2008. When that happens, you can easily implement your 2005/2008 methodology.
2. Use a 3rd party tool. I feel that SQL backups are already ridiculously complicated, and overlaying another tool adds significantly to the complexity.
I wouldn't really say this is an option at all. These tools do not necessarily perform maintenance and may handle just the backups (compression/encryption). Unless these 3rd party tools cover all maintenance tasks (e.g. integrity checks, index maintenance, statistics, etc...), they are just one component of option 1 or 3.
3. Develop ONE system that works with all databases. Using the tools built into SQL and Windows certainly isn't perfect, but it allows me to build one system that works on all OS versions and all SQL versions, doesn't cost anything additional, and adds some complexity through batch files or VBScript, but not on the level of an entirely separate 3rd party tool.
I am not sure I like this solution, since it requires an understanding of how you decided to build the system. What happens in 3, 5, 7 years when you have moved on and the new DBA has to support this? How hard is it going to be for the next person to support? At least with the built-in tools (SSMS maintenance plans, SSIS, etc...) and T-SQL you have a much better chance of supporting the system long term.
At least, that's my opinion 😉
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
February 22, 2010 at 4:05 pm
AndrewSQLDBA (2/22/2010)
I never use a third party tool when working with SQL Server; it can do just about anything, and it can sure handle copying a file across the network to another box. I also do not store anything locally; I always copy the backup files to a network share. But I code the DTS or SSIS package to do that. I just do not trust mixing apps when the original one can do the same thing.
Andrew SQLDBA
I agree: don't use 3rd party tools if SQL Server can do the same thing. We use these tools only for SQL 2000 and 2005. We have tried other methods of copying files across the network, and you are going to waste a lot of bandwidth if you copy files around like that. I've already described what we ran into before; it's your call.
EnjoY!
February 22, 2010 at 5:25 pm
Thank you very much for all your helpful responses, I really appreciate it!
I will tell you what I have currently set up -- maybe you will have some feedback... maybe this will give you some ideas.
First, at this enterprise they use 3rd party tape backup with some software you've heard of -- which includes a SQL Server plugin so it can suck backups right out of SQL while it's running. There are so many things wrong with this it isn't even worth going into! (But I will if someone really wants me to.)
So I figured I'd do the opposite: SQL knows how to back up SQL, so I will let it. I have Weekly Full, Daily Differential (on my really big 150+ GB databases), and hourly Transaction Log backups -- to flat files right on the same disk subsystem.
Of course the most bullet-proof SAN can still blow up, so I want to move those files off the server ASAP, to a remote file server. Two important considerations: 1. I want to compress them to reduce network overhead, and 2. I want to encrypt them so my data is more secure.
So I have ALL of my SQL servers backing up to one directory on one file server. That way I can have the tape backup system just pull those files, as a tertiary backup (if I want to). And I also want to mirror that directory to my local workstation -- that way I can actually do restores (not just talk about doing them!), and actually document up the process for each database, etc.
[As an aside, I end up with the actual SQL data, the backups, the compressed/encrypted backups, and the mirror of those backups = 4 copies of the data = no one gets fired for making TOO MANY backups! :-P]
IMPLEMENTATION
=============
So here is how I have it currently implemented on a couple of servers (to be implemented on all of them... maybe!)
This might look complicated, but it is actually VERY VERY easy. I intentionally made it simple, I just have a tendency to document stuff up like crazy!
STEP 1 -- SQL BACKUPS
I just use the Maintenance Plan of whatever version of SQL is running (2000 is awful, but it works, so I just use it -- let SQL be SQL).
So I end up with a new file every hour in one directory. Wouldn't that be nice, if it were true! I haven't even tried that 130 GB database yet; my guess is it will take more than an hour. So I want the next step to be flexible; I do NOT want to time how long the backup takes and then rely on that timing -- what if the server is getting pounded??
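(If the Maintenance Plans ever have to go, the same backups can be kicked off from a batch file instead -- this is just a rough sketch with made-up server, database, and path names, using osql since it reaches back to the 7.0/2000 boxes:)

rem Hypothetical sketch only -- server, database, and paths are placeholders
osql -E -S MYSERVER -Q "BACKUP DATABASE [MyBigDb] TO DISK = 'D:\SQLBackups\MyBigDb_Full.bak' WITH INIT"
osql -E -S MYSERVER -Q "BACKUP LOG [MyBigDb] TO DISK = 'D:\SQLBackups\MyBigDb_Log.trn'"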
STEP 2 -- 7-Zip
So I have a simple batch file that runs 7-Zip and compresses + encrypts any new .BAK or .TRN file in Directory 1, over to Directory 2.
I have removed almost everyone and everything from the directory which contains my .bat files and my copies of 7-Zip and Robocopy. Unfortunately, the way 7-Zip works, you put the password right there in plain text in the batch file, so IF someone got access to that directory, and they knew what they were doing, then they could see the password. At that point they would already have access to the entire server, so getting hold of the backups is not the top concern in that situation. In other words, no bad guy should have access to the password.
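Stripped down, the 7-Zip batch file amounts to something like this (paths, directory names, and the password here are placeholders, and the real file leans on 7-Zip's own skip-if-already-there behavior rather than the explicit existence check shown):

@echo off
rem Hypothetical sketch -- real paths and password are different
set SRC=D:\SQLBackups
set DST=D:\SQLBackups_Zipped
set ZIP=C:\Tools\7-Zip\7z.exe

rem Compress + encrypt any .BAK or .TRN that doesn't already have a matching archive in D2
for %%F in (%SRC%\*.bak %SRC%\*.trn) do (
    if not exist "%DST%\%%~nxF.7z" "%ZIP%" a -t7z -mhe=on -pPlaceholderPassword "%DST%\%%~nxF.7z" "%%F"
)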
Now the way I get the 7-Zip batch file to run is via a Windows Scheduled Task. I just have it run every half hour. I could have it run whenever -- every 5 minutes if I want. The way I picture it is that 7-Zip is "watching" D1 -- every time there are new files, it zips them over to D2. The way 7-Zip works, if it has already zipped a file in D1 over to D2, it does NOT do it again; it just skips it. Nice, just what I want. And it is supposed to skip over open files -- like Mr. 130GB database that very well may still be writing to the file when the 7-Zip batch file sweeps on by. It actually does skip this file, but it creates a 32K dummy file. This isn't really a problem, but I am trying to get my buddy Igor in Russia to fix this for me!! 😀
http://sourceforge.net/projects/sevenzip/forums/forum/45797/topic/3562958
So I end up with another copy of my backups on the same disk subsystem as the original backups, and I want to mirror these compressed/encrypted files to a central file server. Robocopy does exactly that -- again, it does NOT copy open files (if they are still being compressed/encrypted by 7-Zip), and it only copies the files once (I don't want to copy Mr. 130GB file over my network any more than I have to!). So I also have a Scheduled Task that runs a batch file every 30 minutes (staggered a few minutes after 7-Zip), so I can get those files off the SQL server and over to the file server.
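The Robocopy batch file is basically a one-liner along these lines (the destination share and log path are placeholders):

rem Hypothetical sketch -- destination share and log path are placeholders
rem Robocopy only copies files that are new or changed; /R:1 /W:30 keeps it from hammering a file 7-Zip still has open
robocopy D:\SQLBackups_Zipped \\FILESERVER\SQLBackups *.7z /R:1 /W:30 /NP /LOG+:D:\Logs\robocopy.log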
I have all backups start at 5 minutes after midnight (Windows Server has a couple of things that by default happen at midnight), 7-Zip runs at 10 minutes past every hour, and Robocopy is scheduled for 15 past every hour.
So every hour at 5 after, some form of backup file is created; a few minutes later it is compressed and encrypted; and a few more minutes later it is copied off the server. Nice. I can even tighten everything up if I want -- I could do Transaction Log backups every 10 minutes, or whatever the client requires for a particular database, then just have the 7-Zip and Robocopy batch files run more frequently as well. In that case the Log backup files should be smaller, so everything should happen faster anyway.
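(On the boxes that have schtasks.exe, the stagger is just two Scheduled Tasks created something like this -- task names and batch file paths are placeholders, on the older servers I set the same thing up through the Scheduled Tasks GUI, and note that older versions of schtasks want the start time as HH:MM:SS while newer ones want HH:MM:)

rem Hypothetical sketch -- task names and batch file paths are placeholders
schtasks /create /tn "Zip SQL Backups"  /tr "D:\Admin\zip_backups.bat"  /sc hourly /st 00:10:00
schtasks /create /tn "Copy SQL Backups" /tr "D:\Admin\copy_backups.bat" /sc hourly /st 00:15:00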
CONCLUSION
=========
This may seem like a lot, but really I just have SQL backing up SQL, 7-Zip compressing (and that seems to work), and Robocopy copying (and that also seems to work just fine).
There are special cases where stuff could break, like what if the power goes out during one of those operations? Well, if you were just doing SQL backups, would that be better?? (That's why I asked whether you have one big honkin' VBScript or a number of smaller ones.) Or if you had a VBScript doing it instead of batch files, can you/do you have all sorts of error checking and handling in your scripts?
Also, since I am using batch files it is ridiculously easy to log everything to text files, so theoretically I could wade thru the logs and see what, if anything, happened. I am currently working on putting PKs and delimiters into my logs, so I can suck them into SQL and monitor in real time -- now that would be nice! And again, very simple.
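The kind of thing I have in mind -- purely a sketch, with placeholder paths, names, and a made-up log layout -- is a pipe-delimited line per operation, with the errorlevel included, that BULK INSERT can load straight into a table:

rem Hypothetical sketch -- paths, names, and log layout are placeholders
C:\Tools\7-Zip\7z.exe a -t7z -mhe=on -pPlaceholderPassword D:\SQLBackups_Zipped\MyBigDb_Full.bak.7z D:\SQLBackups\MyBigDb_Full.bak
echo %COMPUTERNAME%^|%DATE% %TIME%^|7zip^|%ERRORLEVEL%^|MyBigDb_Full.bak>> D:\Logs\backup_steps.log

robocopy D:\SQLBackups_Zipped \\FILESERVER\SQLBackups *.7z /R:1 /W:30 /NP
echo %COMPUTERNAME%^|%DATE% %TIME%^|robocopy^|%ERRORLEVEL%^|*.7z>> D:\Logs\backup_steps.log
rem For 7-Zip anything above 0 means trouble; for Robocopy anything 8 or above is a failure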
And finally, I am NOT doing rocket science here -- so to one of the concerns mentioned above, will my system be sustainable if I am no longer here? Hmmm... batch files + 7-Zip + Robocopy. If the next DBA can't figure those out... they are gonna have bigger problems!!!
Oh, and BTW, all of this is "free" (except for my time, of course!)
As always, I appreciate your thoughts and feedback.
Thank you very much!!!
February 23, 2010 at 10:36 pm
Do you have a central server that could oversee these tasks (maybe excluding the backups themselves)? If so, and it's a recent version, then I would probably put something together in SSIS and use that same server to monitor and generate reports as needed.
I can understand your aversion to using third party tools, but one you might find helpful is Hyperbac. I used to zip my files after backup and then copy and FTP them around; having the compression (and encryption if desired) done as part of the SQL backup saves a huge amount of time -- much, much faster than a native backup plus an external zip process.
Matt.