January 9, 2014 at 2:26 pm
I am a BI developer recently asked to research the best way to ensure that a database used for reports has the most up-to-date data from the production OLTP database, without slowing the production database at all, and preferably without even connecting to it. Can someone give me some insight on how to go about this?
January 9, 2014 at 2:53 pm
Can this be done with mirroring?
January 9, 2014 at 4:47 pm
replication
January 9, 2014 at 5:34 pm
The fastest way for your reports to have up-to-date information is not to use a warehouse at all, period. Hook directly to the source DBs.
The next fastest way is transactional replication, but that's not a warehouse either, just a copy of the original database. The same goes for mirroring with database snapshots. These are your next-fastest methods that don't work off the original database.
Obviously, both of these require a continuous connection to the source DB. Mirroring with snapshots, though, is typically used in DR solutions, so simply refreshing your snapshot every 5-10 minutes and reporting off the DR copy may be your best bet.
The next fastest, and ridiculously complex, choice is to build your own transformation components based on message queues (Service Broker queues). Triggers on all changed tables deliver the changes to the queues, and the warehouse constantly works off those items, transforming the information into your warehouse structure. Anything based on trigger-fed asynchronous updates falls into this category; Service Broker is just the most familiar to me.
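As a rough illustration of the trigger-plus-queue idea (all object names here are made up, and the Service Broker message type, contract, services, and queues would have to be created beforehand):

```sql
-- Hypothetical sketch: an AFTER trigger that sends changed keys to a
-- Service Broker queue instead of touching the warehouse synchronously.
-- Assumes the Broker message type, contract, and services already exist.
CREATE TRIGGER trg_Orders_ToWarehouse
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @h   UNIQUEIDENTIFIER,
            @msg XML = (SELECT OrderID, ModifiedDate
                        FROM inserted
                        FOR XML PATH('Order'), ROOT('Changes'));

    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE [//Warehouse/ChangeSenderService]
        TO SERVICE   '//Warehouse/ChangeReceiverService'
        ON CONTRACT  [//Warehouse/ChangeContract]
        WITH ENCRYPTION = OFF;

    SEND ON CONVERSATION @h
        MESSAGE TYPE [//Warehouse/ChangeMessage] (@msg);
END;
```

An activation procedure on the receiving queue would then dequeue these messages and do the actual transformation into the warehouse, asynchronously and off the OLTP critical path.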
Finally, you get into data pushes. This is standard warehouse practice, where your data is 'x' time behind. The fastest pushes retrieve only the deltas and process them into the warehouse; the slowest truncate and reload everything. Only you know your data load well enough to decide what kind of lag is acceptable here.
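A delta push along those lines might look something like this (table and column names are invented for illustration; a "last load" watermark is one common way to find the deltas):

```sql
-- Hypothetical delta load: pull only rows changed since the last load,
-- then upsert them into the warehouse fact table.
DECLARE @LastLoad DATETIME2 =
    (SELECT LastLoadTime FROM etl.LoadControl WHERE TableName = 'FactOrders');

MERGE wh.FactOrders AS tgt
USING (SELECT OrderID, CustomerID, Amount, ModifiedDate
       FROM   stage.Orders
       WHERE  ModifiedDate > @LastLoad) AS src
    ON tgt.OrderID = src.OrderID
WHEN MATCHED THEN
    UPDATE SET tgt.CustomerID   = src.CustomerID,
               tgt.Amount       = src.Amount,
               tgt.ModifiedDate = src.ModifiedDate
WHEN NOT MATCHED THEN
    INSERT (OrderID, CustomerID, Amount, ModifiedDate)
    VALUES (src.OrderID, src.CustomerID, src.Amount, src.ModifiedDate);

-- Record the new watermark so the next run only picks up newer changes.
UPDATE etl.LoadControl
SET    LastLoadTime = SYSDATETIME()
WHERE  TableName = 'FactOrders';
```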
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
January 10, 2014 at 3:40 am
Would it be possible to create triggers on the OLTP database to perform the same inserts and updates on the reporting database?
January 10, 2014 at 4:05 am
Possible, yes. Good idea, probably not. Why re-implement your own transactional replication?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 10, 2014 at 7:02 am
What is the difference between transactional replication and merge replication? Does either immediately upsert the data in the subscriber database?
January 10, 2014 at 7:11 am
Merge replication is for many sources and one destination, where the destination needs the merged set of changes from all sources, and those merged changes may get pushed back out to the sources. Transactional replication is for one source and multiple destinations, where the destination(s) get the changes from the source. (Simplifying.)
Merge is many times more complex to set up and keep working than transactional.
January 10, 2014 at 7:12 am
Thanks Gail, it turns out transactional is exactly what we need.
January 14, 2014 at 10:15 am
One more question. We have the transactional replication in place. Every 5 minutes, jobs run T-SQL MERGE statements from the subscriber database tables (I inherited this) into various tables used for reports. Is it possible to put triggers on the subscriber tables, so that as soon as replication changes the subscriber database, a trigger fires and updates the report tables immediately?
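For what it's worth, triggers on subscriber tables do fire when the distribution agent applies changes, as long as they are not created with the NOT FOR REPLICATION clause. A hypothetical sketch (table and column names invented):

```sql
-- Hypothetical: keep a report table in step with a replicated subscriber
-- table. Because the trigger is NOT marked NOT FOR REPLICATION, it fires
-- when the distribution agent applies replicated changes.
CREATE TRIGGER trg_Orders_Report
ON dbo.Orders          -- replicated subscriber table
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    MERGE rpt.Orders AS tgt
    USING inserted AS src
        ON tgt.OrderID = src.OrderID
    WHEN MATCHED THEN
        UPDATE SET tgt.Amount = src.Amount
    WHEN NOT MATCHED THEN
        INSERT (OrderID, Amount)
        VALUES (src.OrderID, src.Amount);
END;
```

Note that this makes the report-table update synchronous with the replication apply, so it adds work to the distribution agent's path; the existing 5-minute batch MERGE may still be the safer design if the report tables are heavy to update.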