May 22, 2015 at 6:49 am
We're setting up an 'Analysis' server area, and I'm considering creating a 'copy' of a transactional DB on this new server (along with several other DBs). I'm researching possible solutions and best practices, and looking for advice, thoughts, and ideas...
1) Could take weekly Full backups and nightly Differential backups and Restore them on the Analysis Server
2) Could use replication
3) Could use Log Shipping
Ultimately we would want this to be fully automated and it's fine if the Analysis DB is a day behind the DB being copied.
This Analysis Server will not be used as part of a fail-over or backup strategy.
(Should have included, we have a mix of 2010, 2012 and are moving to 2014 servers)
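For option 1, a minimal sketch of what the restore side could look like. All names and paths here are placeholders, and this assumes the copy should stay readable between restores, hence STANDBY rather than RECOVERY (a database restored WITH RECOVERY cannot accept further differentials without restoring the full backup again):

```sql
-- Weekly: restore the full backup, leaving the database in read-only
-- STANDBY mode so nightly differentials can still be applied.
-- Database name, backup paths, and file locations are placeholders.
RESTORE DATABASE AnalysisCopy
    FROM DISK = N'\\backupshare\SalesDB_Full.bak'
    WITH MOVE N'SalesDB'     TO N'D:\Data\AnalysisCopy.mdf',
         MOVE N'SalesDB_log' TO N'L:\Log\AnalysisCopy.ldf',
         STANDBY = N'L:\Log\AnalysisCopy_undo.dat',
         REPLACE;

-- Nightly: apply the latest differential, again WITH STANDBY so the
-- database stays readable and the next night's differential still applies.
RESTORE DATABASE AnalysisCopy
    FROM DISK = N'\\backupshare\SalesDB_Diff.bak'
    WITH STANDBY = N'L:\Log\AnalysisCopy_undo.dat';
```

Because differentials are cumulative from the last full backup, each night only the most recent differential needs restoring, which fits the "a day behind is fine" requirement.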
May 22, 2015 at 7:57 am
If you'll need to read this "copy" database, log shipping might not be for you because depending on how you implement it, users either won't be able to connect, or will be booted off every time a log file is restored.
Are you using Standard Edition or Enterprise Edition (or whatever)? Will the 2 servers be physically far apart (e.g. in different countries)?
May 22, 2015 at 8:02 am
I'm leaning toward using the weekly and nightly backups.
The servers are not in the same location, but there's a nice pipe between the two.
May 22, 2015 at 8:16 am
I generally opt for backup/restore operations in this kind of situation, as they are easy to implement and manage, and available in standard and enterprise editions. Unless database size/time constraints/readable db availability dictate otherwise, go for the backup option.
May 22, 2015 at 8:29 am
I have used transactional replication in the past (the databases were on the same instance), and I would just point the users (30 of them) to the subscriber via Report Builder.
May 22, 2015 at 9:34 am
I appreciate the questions, answers and advice. Thanks!
May 23, 2015 at 5:44 am
Another option, if you are in the 2012/2014 realm, is to look at Availability Groups. You can have a secondary that is read-only, and keeping it up to date is easy.
May 24, 2015 at 12:15 pm
I'm with cuningham. If you can do backup/restore, it's the simplest and most reliable option.
Log shipping works well, but if you get out of sync, lose a file, etc., you're restarting the whole setup, which is administratively a pain. It will fail when you're busy, you'll reset the system, someone will run the first restore WITH RECOVERY, and you'll end up doing it all again.
Replication is more real-time, but also brittle. It works great when it works. When it fails, it's a pain to fix.
June 1, 2015 at 7:02 am
Why not set up Always On with a secondary read, or enable SQL mirroring between the two instances?
With mirroring, you can create a daily snapshot of the mirrored DB and have the clients connect to that for analysis.
Both are quick and easy to implement.
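For reference, a sketch of the daily-snapshot idea on a mirror. Names, dates, and paths are placeholders; note that database snapshots require Enterprise Edition in the versions mentioned in this thread:

```sql
-- Create today's snapshot of the mirrored database for reporting.
-- MirrorDB, the snapshot names, and the file path are all placeholders.
CREATE DATABASE MirrorDB_Snapshot_20150601
ON ( NAME = N'MirrorDB',
     FILENAME = N'D:\Snapshots\MirrorDB_20150601.ss' )
AS SNAPSHOT OF MirrorDB;

-- Point analysis users at the new snapshot, then drop yesterday's.
DROP DATABASE MirrorDB_Snapshot_20150531;
```

Since a snapshot is read-only and frozen at creation time, this also naturally satisfies the "a day behind is fine" requirement.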
June 2, 2015 at 6:57 am
We do this on two or three systems: back up the live database between 6 p.m. and midnight, then restore that backup to the test server from 4 a.m. onwards. It's a good way of checking your backups, too.
Because the Ola Hallengren scripts add the date and time to backup file names, I wrote an SSIS package that uses a script task to put the latest backup file name and database name into a table. That runs as the first job step, and the subsequent SQL job steps that restore the databases use that table to select the restore file name.
June 2, 2015 at 7:15 am
Justin Manning SA (6/1/2015)
Why not set up Always On with a secondary read, or enable SQL mirroring between the two instances? With mirroring, you can create a daily snapshot of the mirrored DB and have the clients connect to that for analysis?
Both are quick and easy to implement.
True, assuming it's Enterprise Edition?