August 25, 2008 at 2:07 am
Hi,
What steps need to be taken to resolve a slow-running stored procedure?
Thanks!
August 25, 2008 at 2:34 am
One way is to analyse the execution plan; see the sketch after this list.
1) Identify the expensive steps (those with higher cost percentages)
2) Identify scans (Index Scan, Table Scan, Clustered Index Scan) and try to define indexes to eliminate them.
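To make that concrete, here is a minimal sketch for capturing a plan for every statement a procedure runs; the procedure name dbo.MySlowProc is invented for illustration.

-- In SSMS, "Include Actual Execution Plan" (Ctrl+M) shows graphical plans;
-- a text-mode plan for each statement can be captured like this:
SET STATISTICS PROFILE ON
GO
EXEC dbo.MySlowProc  -- hypothetical procedure name
GO
SET STATISTICS PROFILE OFF
GO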
August 25, 2008 at 2:51 am
Thanks..
Do we have any tool in SQL Server to identify which transaction or SELECT statement is taking time?
I have an SP with around 100 SELECT statements, so how do I proceed in this scenario?
Should I go statement by statement and look at each execution plan? That would be a lengthy process. Don't we have any tool which will produce execution plans for all the statements?
Thanks!
August 25, 2008 at 7:19 am
Run a Profiler trace and capture the duration of all SQL statements. That's your best bet. Then pick the statements with the highest duration and tune from there.
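If setting up a trace is awkward, an alternative sketch (assuming SQL Server 2005 or later) is to ask the plan cache for the cached statements with the highest cumulative elapsed time:

-- Top 10 cached statements by total elapsed time (in microseconds)
SELECT TOP 10
    qs.total_elapsed_time,
    qs.execution_count,
    SUBSTRING(st.text,
              qs.statement_start_offset / 2 + 1,
              (CASE qs.statement_end_offset
                   WHEN -1 THEN DATALENGTH(st.text)
                   ELSE qs.statement_end_offset
               END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.total_elapsed_time DESC

Note that this only covers what is still in cache, so a trace remains the more complete option.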
August 26, 2008 at 6:07 am
As mentioned in the optimisation steps above, just take a look at the execution plans and look for queries which are using a full table scan, etc. Try to change those to an index seek by creating the correct indexes on your tables. Maybe you can post the SP so someone can analyse it.
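To make "creating the correct indexes" concrete with invented names: if a query filters dbo.Orders on an unindexed CustomerID column, the plan shows a table scan; an index on the filter column (covering the selected columns) lets the optimiser seek instead.

-- Before: with no index on CustomerID, the plan shows a Table Scan
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 42

-- After: this index allows an Index Seek for the query above
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
ON dbo.Orders (CustomerID)
INCLUDE (OrderID, OrderDate)
GO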
August 26, 2008 at 6:20 am
-- Report per-table logical and physical reads for each statement
SET STATISTICS IO ON
GO
-- Report CPU and elapsed time for each statement
SET STATISTICS TIME ON
GO
then run the proc. That will give you IO and time breakdowns for each statement.
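For each statement, the output looks roughly like this (all numbers invented):

Table 'Orders'. Scan count 1, logical reads 1240, physical reads 3, read-ahead reads 0.

SQL Server Execution Times:
   CPU time = 156 ms,  elapsed time = 892 ms.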
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 26, 2008 at 1:50 pm
I have a data warehouse query which joins 8 tables together. One table has an extremely high scan count and logical reads. How can I reduce the scan count and logical reads?
August 26, 2008 at 2:31 pm
Add appropriate indexes and make sure that the query is written so as to use them (an example sketch follows below). Without seeing the query (and preferably also the execution plan) it's hard to be more specific.
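On "written so as to use them": one common pitfall, shown here with an invented table, is wrapping an indexed column in a function, which prevents an index seek even when a suitable index exists.

-- Non-SARGable: the function on the column forces a scan
SELECT SaleID, Amount
FROM dbo.FactSales
WHERE YEAR(OrderDate) = 2008

-- SARGable rewrite: a range predicate on the bare column
-- lets the optimiser seek on an OrderDate index
SELECT SaleID, Amount
FROM dbo.FactSales
WHERE OrderDate >= '20080101' AND OrderDate < '20090101'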
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 26, 2008 at 9:03 pm
Thanks Gail, I wish I could post the table structures, indexes, query and its execution plan, but the query comes from a Cognos application and would take 3 pages. I have updated statistics, created a couple of indexes, defragged all indexes (the table is over 10k pages), and ran sys.dm_db_missing_index_details (nothing is missing), although there are one or two unused indexes according to sys.dm_db_index_usage_stats. Is there anything else that I can check?
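For anyone reading along, the unused-index check mentioned above can be sketched like this (note the counters reset when the instance restarts):

-- Indexes whose read counts (seeks/scans/lookups) are far below
-- their update counts are candidates for review
SELECT OBJECT_NAME(us.object_id) AS table_name,
       i.name AS index_name,
       us.user_seeks, us.user_scans, us.user_lookups, us.user_updates
FROM sys.dm_db_index_usage_stats us
JOIN sys.indexes i
  ON i.object_id = us.object_id AND i.index_id = us.index_id
WHERE us.database_id = DB_ID()
ORDER BY us.user_seeks + us.user_scans + us.user_lookups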
August 27, 2008 at 1:50 am
Can you save the exec plan as a .sqlplan file, zip it and attach it? That will help a bit.
Otherwise, you can check whether the indexes on the table with lots of reads are actually being used (look at the exec plan), and see if there's perhaps a more appropriate index for the query, based on the filters and joins used on that table.
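As an illustration with invented names: if the heavy table is joined on CustomerKey, filtered on OrderDate, and returns Amount, an index shaped to match those columns is often what is missing.

-- Key columns match the join and filter; INCLUDE covers the output
CREATE NONCLUSTERED INDEX IX_FactSales_CustomerKey_OrderDate
ON dbo.FactSales (CustomerKey, OrderDate)
INCLUDE (Amount)
GO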
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 5, 2008 at 6:39 am
Yihong,
You stated that your data warehouse query joins 8 tables together. Usually DW schemas are denormalized to a point where very few joins are needed. Is there a way to flatten out the query a bit by denormalizing the data?
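For what it's worth, one way to flatten a join in SQL Server without restructuring the tables is an indexed view; a minimal sketch with hypothetical fact and dimension tables follows (Enterprise Edition can use it automatically, other editions need the NOEXPAND hint).

-- Persist the join so queries can read the flattened result directly
CREATE VIEW dbo.vSalesFlat
WITH SCHEMABINDING
AS
SELECT f.SaleID, f.Amount, d.CustomerName
FROM dbo.FactSales f
JOIN dbo.DimCustomer d ON d.CustomerKey = f.CustomerKey
GO

-- The unique clustered index is what materializes the view
CREATE UNIQUE CLUSTERED INDEX IX_vSalesFlat
ON dbo.vSalesFlat (SaleID)
GO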
September 5, 2008 at 8:48 am
this db design definitely
September 5, 2008 at 8:53 am
Oops... what did I do?
This db design definitely has flaws. The query comes from the Cognos application, so I have to live with it as written; the only thing I can do for now is to lower the logical reads and re-examine the indexes.