August 15, 2008 at 1:35 pm
Hi,
My scenario is like this:
I have existing application tables and need to create some additional tables for generating reports. The report tables need to be populated whenever new data is inserted into the application tables or existing rows are updated.
So I am planning to write triggers on these tables, but that means writing a number of triggers on different tables.
My question is: is there any way I can do this as a batch process within the database? That is, take all the updated and newly inserted data and populate the respective report tables?
Please help me decide whether to go with triggers or a batch process, and if the batch approach is viable, how I can set it up.
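For the trigger route, something like this is what I have in mind (all table and column names here are just examples, not my real schema):

```sql
-- Placeholder schema: dbo.Orders is the application table,
-- dbo.OrdersReport is the report table keyed on OrderID.
CREATE TRIGGER trg_Orders_ToReport
ON dbo.Orders
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Remove report rows for any keys being inserted/updated,
    -- then re-insert the current values from the inserted pseudo-table.
    DELETE r
    FROM dbo.OrdersReport AS r
    INNER JOIN inserted AS i ON r.OrderID = i.OrderID;

    INSERT INTO dbo.OrdersReport (OrderID, CustomerID, Amount)
    SELECT OrderID, CustomerID, Amount
    FROM inserted;
END;
```

But I would need one of these per application table, which is why I am asking about a batch alternative.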
Thanks in advance.
Thanks,
Chandra
August 15, 2008 at 1:44 pm
It depends on how much latency your users are willing to accept in their reports, how busy your system is, whether triggers or a batch process would put less load on your server, and whether you are storing a "copy" of the data or an aggregation.
Once you answer those questions then you can move forward with a plan on how to implement.
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
August 15, 2008 at 2:21 pm
Hi,
Thanks for your reply. Looking at your questions, I can say that updating the report data twice a day would work. I think triggers may affect the performance of the system, while batch processing probably would not, though I'm not sure.
Regarding the data, it is just a copy, not an aggregation. My other question is: can we schedule a job that will take care of this in SQL Server 2005? This may be a dumb question... Sorry.
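For the scheduling part, is a SQL Server Agent job set up roughly like this what you would suggest? (All job, database, and procedure names below are just placeholders.)

```sql
-- Sketch of a SQL Server Agent job that runs a load procedure twice a day.
USE msdb;
GO
EXEC dbo.sp_add_job
    @job_name = N'Load Report Tables';

EXEC dbo.sp_add_jobstep
    @job_name = N'Load Report Tables',
    @step_name = N'Run load procedure',
    @subsystem = N'TSQL',
    @command = N'EXEC ReportDB.dbo.usp_LoadReportTables;',  -- placeholder proc
    @database_name = N'ReportDB';

EXEC dbo.sp_add_jobschedule
    @job_name = N'Load Report Tables',
    @name = N'Twice daily',
    @freq_type = 4,              -- daily
    @freq_interval = 1,
    @freq_subday_type = 8,       -- repeat on an hourly interval
    @freq_subday_interval = 12;  -- every 12 hours = twice a day

EXEC dbo.sp_add_jobserver
    @job_name = N'Load Report Tables';
```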
Thanks,
Chandra
August 15, 2008 at 2:36 pm
Have you considered replication or database snapshots?
As far as performance, I would think that batching it will have less of an overall impact than triggers. It really depends on the number of transactions the database and server are already handling. If you are just copying the data, why can't you report straight off the "live" data? Is there really that much of a performance issue with that? Odds are you could improve your SELECT performance with some judicious indexes.
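If you do go the batch route, here is a rough sketch of what I mean, assuming each application table has a last-modified datetime column you can key off (all table and column names below are made up):

```sql
-- High-water-mark batch copy: pick up rows changed since the last run.
-- dbo.ReportLoadLog is a single-row placeholder table holding LastRunDate.
DECLARE @LastRun datetime, @Now datetime;
SET @Now = GETDATE();
SELECT @LastRun = LastRunDate FROM dbo.ReportLoadLog;

-- Clear report rows for keys that changed, then re-copy current values.
DELETE r
FROM dbo.OrdersReport AS r
INNER JOIN dbo.Orders AS o ON r.OrderID = o.OrderID
WHERE o.ModifiedDate > @LastRun;

INSERT INTO dbo.OrdersReport (OrderID, CustomerID, Amount)
SELECT OrderID, CustomerID, Amount
FROM dbo.Orders
WHERE ModifiedDate > @LastRun;

-- Advance the high-water mark for the next run.
UPDATE dbo.ReportLoadLog SET LastRunDate = @Now;
```

You would repeat the delete/insert pair per table (or wrap it in a stored procedure) and run the whole thing on a schedule.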
Jack Corbett
Consultant - Straight Path Solutions
August 15, 2008 at 3:08 pm
You are right, but the data is coming from different databases, so building reports directly would be a huge task for me, connecting to different databases depending on the client. That is why I am building a central repository for reports that is not client dependent.
I will need indexes on my reporting tables as well.
Thanks,
Chandra