April 9, 2014 at 1:16 pm
Hello,
I am reaching out to the users for some suggestions. We are currently on SQL Server 2008 R2, which I have configured to restore from log files shipped from an offsite location each hour, per our requirements. We have 75 databases, and each database contains 1,300 tables. As you know, the databases are not accessible during the restore process. To work around this, we have created several jobs that pull the most common data and frequently used query results into tables every night so we can access them and run reports. That data is only refreshed every 24 hours, and the need is moving toward real-time (or as close as we can get). I would like suggestions for how to create duplicate databases that are read-only, not in recovery mode, and that update each hour after the log files have been loaded into the main database. Trying to schedule reports and procs to run in the "available" window when the restore process is idle is very cumbersome.
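For context, the hourly restore pattern described above comes down to one of two RESTORE LOG modes; a minimal sketch (database and file names here are hypothetical):

```sql
-- WITH NORECOVERY keeps the database in "restoring" state: no user access at all.
RESTORE LOG SalesDB
    FROM DISK = N'\\offsite\logs\SalesDB_1300.trn'
    WITH NORECOVERY;

-- WITH STANDBY leaves the database read-only between restores, but each new
-- restore still has to kick connected users out first -- the "available window"
-- scheduling problem described above.
RESTORE LOG SalesDB
    FROM DISK = N'\\offsite\logs\SalesDB_1400.trn'
    WITH STANDBY = N'D:\Standby\SalesDB_undo.dat';
```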
Any and all suggestions are welcome.
April 9, 2014 at 8:46 pm
zz14hl1 34362 (4/9/2014)
Any and all suggestions are welcome.
75 databases x 1300 tables likely rules out replication. The overhead and management would be pretty ridiculous.
I would like to know of some suggestions to have/create duplicate databases that are read only, not in recovery mode and will update each hour after the log files have been loaded into the main database.
SQL Server 2012 AlwaysOn Availability Groups. You can have a real-time read-only replica that's not in recovery mode and doesn't require you to disconnect during updates (like log shipping or something else would). But you'll have to pay for it 🙂
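A minimal sketch of what that readable secondary looks like, assuming SQL Server 2012 Enterprise on a Windows failover cluster (server names, endpoint URLs, and the database name below are all hypothetical):

```sql
-- Hedged sketch: an Availability Group with one asynchronous, readable secondary.
CREATE AVAILABILITY GROUP ReportingAG
FOR DATABASE SalesDB
REPLICA ON
    N'PRIMARY01' WITH (
        ENDPOINT_URL = N'TCP://PRIMARY01.corp.local:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE = MANUAL),
    N'REPORT01' WITH (
        ENDPOINT_URL = N'TCP://REPORT01.corp.local:5022',
        AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT,
        FAILOVER_MODE = MANUAL,
        -- READ_ONLY is what lets reports query the secondary while it
        -- stays continuously synchronized -- no restore window needed.
        SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));
```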
Okay, so say that isn't possible. Well, you said you already load this data into other tables. Maybe you can speed up whatever ETL you have and use replication on just those tables. Does the ETL take too long, or is it too intensive to run hourly? Maybe it's doing a full load instead of processing only the changes like a normal data warehouse would, in which case you could look at fixing that: row timestamps (unlikely across 75 x 1300 tables unless they're already there), or Change Data Capture or something similar, which would also be cumbersome to set up at that scale.
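If you do go the incremental-ETL route, CDC is enabled once per database and then per table; a sketch, run against the source database (schema and table names are hypothetical):

```sql
-- Hedged sketch: enabling Change Data Capture on one source table.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;  -- NULL = no gating role required to read the changes

-- The hourly ETL can then read just the deltas via the generated function,
-- e.g. cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all'),
-- instead of reloading the whole table each run.
```

Doing that for 1,300 tables per database is exactly the management overhead mentioned above, so it only really pays off on the handful of tables the reports actually use.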
Long story short: I reckon you will probably have to pay for closer-to-real-time figures, especially at that scale.
April 11, 2014 at 8:05 am
I figured as much. 🙁 I was looking into Attunity Replicate. I need some way to create a copy of these databases, or a subset thereof, that I can query all the time. How can a task that seems so simple be so hard? 😀 😀 I will have to keep looking; there has to be something. I would think that if a company had a local database but wanted a live copy offsite and used the log shipping method, there would be a better option than keeping the database in standby/read-only. There just has to be. 😀
June 16, 2014 at 6:04 am
Old thread I know, but did you ever look into Database Snapshots?
http://msdn.microsoft.com/en-us/library/ms190677(v=sql.105).aspx
http://msdn.microsoft.com/en-us/library/ms175876(v=sql.105).aspx
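A sketch of how that could fit the hourly cycle (names and paths are hypothetical). One caveat worth hedging on: a log restore cannot proceed while a snapshot of the database exists, so each cycle has to drop the snapshot, run the restore, then create a fresh one.

```sql
-- Hedged sketch: after the hourly restore finishes, snapshot the database
-- so reports can query the snapshot while the next restore runs.
-- One sparse file is required per data file in the source database.
CREATE DATABASE SalesDB_Snap_1400
ON (NAME = SalesDB_Data,  -- logical name of the source data file
    FILENAME = N'D:\Snapshots\SalesDB_1400.ss')
AS SNAPSHOT OF SalesDB;

-- Drop the previous hour's snapshot once reports have moved over,
-- and always before the next log restore, which a snapshot would block.
DROP DATABASE SalesDB_Snap_1300;
```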