April 22, 2014 at 11:01 pm
Hi All,
We have a production database which is busy throughout the day, and we need to generate reports from it. The same tables that are undergoing inserts/updates are used to select the data for the reports. Because of this I am seeing poor performance. I know this setup may be poor, but please suggest any possible ways to improve the queries.
Thank you.
______________________________________________________________Every Problem has a Solution; Every Solution has a Problem: 🙂
April 22, 2014 at 11:54 pm
Does the report have to run during business hours?
Can the report be scheduled to run outside of hours?
How often does the report need to be run?
Does the user need to input parameters to run it?
Have the background queries / SPs been performance tuned?
The catch-22 of what you describe is that if you boost the performance of the report, you will sacrifice the performance of your production system.
April 23, 2014 at 1:39 am
Please find my replies below.
Does the report have to run during business hours? - Yes
Can the report be scheduled to run outside of hours? - No, because the end users need the report as soon as they update/change the data
How often does the report need to be run? - 20-25 times within the business hours
Does the user need to input parameters to run it? - It depends, but not necessarily.
Have the background queries / SPs been performance tuned? - We tried as much as we could. The SP uses a few remote queries and a lot of temp tables. Tempdb is already on a separate drive.
______________________________________________________________Every Problem has a Solution; Every Solution has a Problem: 🙂
April 23, 2014 at 5:07 am
Do you have another SQL instance available? You could set up mirroring between the two, create a database snapshot, and have the report run from that. That way you won't be impacting the production db. You would have to schedule the db snapshot, and there is a chance that it won't be up-to-the-second data, but once a report is run against a busy live system it is obsolete anyway: since the data is always changing, there is a chance that by the time someone analyzes the report at least some of the data has already changed.
Database Mirroring and Database Snapshots (SQL Server)
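To make that concrete, here is a minimal sketch of creating and refreshing a snapshot on the mirror server. All database, file, and path names here are hypothetical; substitute your own:

```sql
-- Run on the mirror server. The mirror stays in a restoring state;
-- the snapshot exposes a read-only, point-in-time view of it.
CREATE DATABASE SalesDB_Snapshot
ON ( NAME = SalesDB_Data,   -- logical file name of the mirrored database
     FILENAME = 'D:\Snapshots\SalesDB_Snapshot.ss' )
AS SNAPSHOT OF SalesDB;

-- Reports then query the snapshot instead of production:
-- SELECT ... FROM SalesDB_Snapshot.dbo.Orders ...

-- To refresh on a schedule (e.g. a SQL Agent job), drop and recreate:
DROP DATABASE SalesDB_Snapshot;
```

Note that any sessions connected to the snapshot would need to reconnect after each refresh.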
Joie Andrew
"Since 1982"
April 23, 2014 at 5:28 am
We have mirroring on another server and a snapshot as part of high availability. The snapshot is taken every 12 hours. But the snapshot incurs more physical reads than the production database. I have yet to see how it works in practice. I understand that the first run will take more time, and from the second run onwards it will be fine.
Will check it.
______________________________________________________________Every Problem has a Solution; Every Solution has a Problem: 🙂
April 23, 2014 at 6:22 pm
How old is considered stale data? 15 mins?
An old-fashioned, low-tech but also low-demand option would be to log ship an extract of the tables affected by your reports to your alternate server or instance (if the database is small enough, you could log ship the whole thing).
Execute the reports against that.
If you have your schedule set up right and your log backups set to 15 minutes, you would always be running against a set of data at most 15 minutes old.
The process depends on the size of the data changes, so you would have to "fettle" to find the right balance of data availability and currency.
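The restore side of that log shipping setup might look something like the sketch below (database name and paths are hypothetical). The key detail is STANDBY, which leaves the database readable between restores so reports can run against it; each scheduled log restore briefly disconnects readers:

```sql
-- Restore the latest shipped log backup on the reporting server.
-- WITH STANDBY keeps the database in a read-only, queryable state.
RESTORE LOG ReportDB
FROM DISK = 'E:\LogShip\ReportDB_20140423_1015.trn'
WITH STANDBY = 'E:\LogShip\ReportDB_undo.dat';
```

A SQL Agent job would run this every 15 minutes against whichever log backup files have arrived since the last restore.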
If you were log shipping to another instance on the same server, you could ration the available memory between the instances in favour of either production or reporting. Most reports will use less CPU if they have plenty of RAM available.
Does your RAM regularly top out?
Do you ration your RAM?
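Rationing RAM between two instances on the same box is done with the max server memory setting; a hedged sketch (the actual values are illustrative and depend on your hardware):

```sql
-- Cap the production instance's buffer pool so the reporting
-- instance on the same server is not starved (values illustrative).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 24576;  -- e.g. 24 GB for production
RECONFIGURE;
-- Run the same on the reporting instance with a smaller cap, e.g. 8192.
```

Without explicit caps, each instance will try to grab as much memory as it can, and the two will fight each other.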
Are you executing against really large tables? Do you actually need to execute against the whole table, or could you narrow the amount of data to be parsed by creating a view or CTE and executing your query against that?
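As a sketch of that last idea (all table and column names here are hypothetical), a view that pre-filters to just the slice the report needs keeps each report query from touching the whole table:

```sql
-- A reporting view that narrows the data set up front,
-- e.g. only the last 30 days of orders.
CREATE VIEW dbo.vw_RecentOrders
AS
SELECT OrderID, CustomerID, OrderDate, TotalDue
FROM   dbo.Orders
WHERE  OrderDate >= DATEADD(DAY, -30, GETDATE());
GO

-- The report queries the view instead of the full table:
-- SELECT ... FROM dbo.vw_RecentOrders WHERE CustomerID = @Cust;
```

With a supporting index on OrderDate, the optimizer can seek to just the recent rows rather than scanning the full history.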
April 23, 2014 at 6:24 pm
I know I am asking a lot of questions but there is more than one way to skin a cat in SQL and to know which one to use we need to know what breed of cat...
April 23, 2014 at 8:50 pm
karthik babu (4/22/2014)
Hi All,We have a production database which is busy throughout the day and need to generate report from the same. The tables undergoing inserts/updates will be used to select the data for the reports. Due to this I am facing low performance. I know this setup is very rare and may be poor but please suggest any other possible ways for the betterment of the query.
Thank you.
At this point, I recommend you read the article at the second link "Helpful Links" in my signature line below.
--Jeff Moden
Change is inevitable... Change for the better is not.