October 27, 2008 at 4:54 am
Hi All,
I have a production database of about 150 GB with more than 1000 tables. It runs slowly at times; my question is how I can track down the performance issue.
Many of the tables do not have a primary key, indexes, etc. Please suggest the best practice for finding the performance problem.
Rajnish
October 27, 2008 at 5:06 am
Use SQL Server Profiler primarily. Depending on the volume of concurrent client requests, review your memory and disk configurations. When the system slows down, use OS utilities to monitor CPU and memory usage at that time. For the tables without indexes, look at them separately and build appropriate indexes. You can also use the Database Engine Tuning Advisor.
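To find the tables with no index at all (heaps) or no primary key, something like this against the catalog views should work on SQL 2005 and up (a rough sketch, adjust to taste):

-- Tables that are heaps (no index of any kind) or have no primary key
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE NOT EXISTS (SELECT 1 FROM sys.indexes i
                  WHERE i.object_id = t.object_id AND i.type > 0)
   OR OBJECTPROPERTY(t.object_id, 'TableHasPrimaryKey') = 0;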
DBDigger Microsoft Data Platform Consultancy.
October 27, 2008 at 5:08 am
The first thing you should do is make sure that all tables have primary keys and are properly normalised.
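For any table that's missing a key, adding one is straightforward. The table and column names below are just placeholders; pick a column (or columns) that is genuinely unique:

-- Placeholder names; substitute your own table and unique column
ALTER TABLE dbo.SomeTable
ADD CONSTRAINT PK_SomeTable PRIMARY KEY CLUSTERED (SomeTableID);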
Run Profiler for a while during your busy times. Record the RPC:Completed and SQL:BatchCompleted events. Make sure you have the TextData, Duration, Reads and CPU columns at a minimum.
Once the trace is complete, save the results into a table and look for the top 5 or so worst-performing queries. Take those and see what you can do to optimise them. It may mean changing the code, it may mean adding indexes, or it may require both.
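If you've saved the trace into a table (say dbo.PerfTrace, name it whatever you like), a query along these lines will surface the worst offenders. Note that on SQL 2005 and later, Duration in a saved trace is in microseconds:

-- Top 5 statements by elapsed time; event classes 10 and 12
-- are RPC:Completed and SQL:BatchCompleted respectively
SELECT TOP (5)
    TextData,
    Duration / 1000 AS duration_ms,
    Reads,
    CPU
FROM dbo.PerfTrace
WHERE EventClass IN (10, 12)
ORDER BY Duration DESC;

You can also order by Reads or CPU instead of Duration to catch the queries that hammer the IO subsystem or the processor rather than just the slow ones.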
Once you've got those fixed, repeat the Profiler trace and get another batch of queries to fix.
If you need help with the optimising, post the queries here and someone will help.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
October 27, 2008 at 10:05 am
Thanks all,
I will do the same and get back for further suggestions.
Rajnish