July 20, 2006 at 2:08 am
Hi,
I have observed that when our SQL Server 2000 database is small, performance is fine.
However, as the database grows, some queries become slow, block, and have other issues.
Is there any trend/pattern in this area that I can reference?
Do I have to monitor the database with some tool?
Does anyone have any good recommendations/advice in this area?
July 21, 2006 at 7:11 am
Hi,
You could run Profiler regularly and capture long-running queries, e.g. Event -> SQL:BatchCompleted, Duration -> Greater than or equal to 2000.
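If you'd rather not leave the Profiler GUI running, the same capture can be scripted as a server-side trace using the SQL Server 2000 trace procedures. A minimal sketch (the file path is just an example, adjust to suit; Duration is in milliseconds on SQL 2000):
-- event 12 = SQL:BatchCompleted; column 1 = TextData, column 13 = Duration
declare @traceid int, @maxsize bigint, @on bit, @dur bigint
set @maxsize = 50        -- MB per rollover file
set @on = 1
set @dur = 2000          -- only capture batches taking >= 2000 ms
exec sp_trace_create @traceid output, 2, N'C:\traces\long_queries', @maxsize
exec sp_trace_setevent @traceid, 12, 1, @on       -- TextData
exec sp_trace_setevent @traceid, 12, 13, @on      -- Duration
exec sp_trace_setfilter @traceid, 13, 0, 4, @dur  -- AND, >=, 2000
exec sp_trace_setstatus @traceid, 1               -- start the trace
You can read the resulting file back with fn_trace_gettable when you want to analyse it.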
I just run a little script in Query Analyzer locally on the server at the end of my maintenance routine. I run it half a dozen times and take the min, max and average execution times in fractions of a second. I replaced my non-clustered indexes with clustered ones this week and shaved 0.2s off my execution time of 0.8 to 1.3s.
set statistics time on
set statistics io on

declare @dtstart datetime
select @dtstart = getdate()

-- substitute your own representative query here
select loads_of_rows
from important_view
where most_reports_get_their_data

-- elapsed time since @dtstart (look at the seconds/milliseconds)
select getdate() - @dtstart
Substitute your own query in place of the placeholder names (mine selects data from a view that underpins all the major reports on the system).
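If you want the min, max and average without rerunning the batch by hand, you can loop it and log the elapsed milliseconds into a temp table. A rough sketch (the query in the loop is the same placeholder as above, so substitute your own):
-- log each run's elapsed time in ms, then aggregate
create table #timings (run_ms int)
declare @dtstart datetime, @i int
set @i = 0
while @i < 6
begin
    set @dtstart = getdate()
    -- substitute your own query here
    select loads_of_rows
    from important_view
    where most_reports_get_their_data
    insert #timings values (datediff(ms, @dtstart, getdate()))
    set @i = @i + 1
end
select min(run_ms) as min_ms, max(run_ms) as max_ms, avg(run_ms) as avg_ms
from #timings
drop table #timings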
If you are getting blocking, experiment with table hints (NOLOCK) and revisit your code looking for places where you are unnecessarily delaying committing transactions (one killer is returning control to the user before committing). Remember, if you have nested transactions, SQL Server doesn't release locks until ALL transactions are committed (i.e. @@TRANCOUNT is back to 0).
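A quick runnable illustration of the nested-transaction point (the NOLOCK hint goes into your own queries the same way, e.g. from important_view with (nolock)):
-- the inner commit does NOT release locks; only the outer one does
create table #t (id int)
begin tran              -- @@TRANCOUNT = 1
begin tran              -- @@TRANCOUNT = 2
insert #t values (1)    -- takes locks on #t
commit tran             -- @@TRANCOUNT back to 1, locks still held
select @@trancount      -- returns 1
commit tran             -- @@TRANCOUNT = 0, locks released here
drop table #t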
David
If it ain't broke, don't fix it...