February 5, 2006 at 4:58 pm
Hi Everyone,
I have been asked to purchase an "Open Database Connector" for our backup software (Legato).
Now, this is all fine and dandy, but I am having some trouble justifying it (to myself).
The transaction log is being backed up every hour.
The DB is being backed up every 4 hours.
In our current scenario, I assume the very worst that could happen is that we lose 1 hour's worth of work. For the purposes of the databases on this server, that is a more than acceptable level of risk for our maintenance plan.
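For what it's worth, the scheduled jobs behind this are nothing more exotic than something along these lines (database name and file paths changed, of course):

-- 4-hourly job: full database backup
BACKUP DATABASE MyDatabase
    TO DISK = 'D:\Backups\MyDatabase_full.bak'
    WITH INIT
GO

-- Hourly job: transaction log backup, appended so the whole log chain
-- since the last full backup stays on disk
BACKUP LOG MyDatabase
    TO DISK = 'D:\Backups\MyDatabase_log.bak'
    WITH NOINIT
GO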
So, ultimately, my question is:
Am I missing something?
Do we need the "Open connector"?
Gavin Baumanis
Smith and Wesson. The original point and click device.
February 6, 2006 at 10:33 am
I don't know the option, but I would say no.
Have they supplied you with valid reasons why they want the option?
February 7, 2006 at 2:22 am
It sounds like the backups you are doing are fine, and if you are able to take a full DB backup every 4 hours, the databases must be small and plenty of disk must be available. I would use the backup tool / agent only to automate sweeping the backup files you are already taking off to external tape etc. It sounds like you don't need the agent to perform the SQL Server backups themselves, and I suspect the agents cost $'s.
February 7, 2006 at 4:54 am
Hi,
Backing up the database and transaction logs to tape is fine if you only ever need to restore the database as a whole, or to a particular point in time.
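For example, restoring the whole database or rolling it forward to a particular point in time from those native backups is just something like this (names, paths and the time are made up):

-- Restore the most recent full backup, leaving the database ready for log restores
RESTORE DATABASE MyDatabase
    FROM DISK = 'D:\Backups\MyDatabase_full.bak'
    WITH NORECOVERY
GO

-- Roll the log forward to a chosen moment and bring the database online
RESTORE LOG MyDatabase
    FROM DISK = 'D:\Backups\MyDatabase_log.bak'
    WITH STOPAT = '2006-02-07 09:30', RECOVERY
GO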
Many of these database backup agents, as well as providing the ability to back up the database while it is in use, also provide the ability to back up and restore individual database tables or rows of data.
If you need this kind of capability, then check the functionality of the Legato agent to see if it provides this.
Also worth checking to see what impact the open database agent has on performance of the system. How does it access the open database?
David
If it ain't broke, don't fix it...
February 7, 2006 at 7:02 am
Hi,
One thing to keep in mind is reliability. Native SQL Server backup has never failed for me, but in my company we've had both Arcserve and Veritas Backup Exec doing backups without any errors, only to find no data on tape when we needed it! Whether that was just wrong configuration or hardware issues is of no interest to me; I want error messages on failure!
With native backup to disk, if the backup gives no errors, the backup's good!
Best regards
karl
February 7, 2006 at 7:44 am
This is not always the case... it's burned many people before. Just because a backup completes with no errors does not mean it is valid. The first thing you should do is verify the backup file using the RESTORE VERIFYONLY command, which will tell you whether the backup file is complete. You should also do periodic restores into a development environment to guarantee recoverability, because a complete backup file is not necessarily one you can recover from. If one disk sector or disk pointer goes bad, the file may still be viewed as complete and verify successfully, yet still fail to restore. The only way to guarantee recoverability is to physically restore the file. In SQL 2005 checksum functionality has been added to RESTORE VERIFYONLY so this caveat should not come up, but in SQL 2000 it has happened more than once.
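Something along these lines is all it takes (the file path and logical file names below are only examples; check yours with RESTORE FILELISTONLY):

-- Quick check that the backup file is complete
RESTORE VERIFYONLY FROM DISK = 'D:\Backups\MyDatabase_full.bak'
GO

-- The only real proof: restore it elsewhere under another name
RESTORE DATABASE MyDatabase_Test
    FROM DISK = 'D:\Backups\MyDatabase_full.bak'
    WITH MOVE 'MyDatabase_Data' TO 'D:\Test\MyDatabase_Test.mdf',
         MOVE 'MyDatabase_Log'  TO 'D:\Test\MyDatabase_Test_log.ldf',
         RECOVERY
GO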
February 7, 2006 at 7:58 am
Yes, that's in my opinion the main reason for having standby servers via log shipping: if the backups are restored to the standby, you _know_ they are valid...
In the cases I mentioned above there was _nothing_ on tape, even though verify was configured in the case of Arcserve - and no error messages.
OK, that was about 8 or 10 years ago; I hope things are better nowadays...
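Roughly, the standby side of that amounts to something like this (paths are made up):

-- On the standby server, each shipped log backup is applied in standby mode,
-- so the database stays readable and further logs can still be restored
RESTORE LOG MyDatabase
    FROM DISK = 'E:\LogShip\MyDatabase_log.bak'
    WITH STANDBY = 'E:\LogShip\MyDatabase_undo.dat'
GO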
Best regards
karl