December 29, 2012 at 5:15 am
Hi,
I have to improve the performance of some stored procedures, as they take a long time to execute.
I checked the cost in the execution plan, but some show 100% and some show other values... so can anyone please tell me what the cost of an execution plan should be, so I know which SP I need to change?
_______________________________________________________________
To get quick answer follow this link:
http://www.sqlservercentral.com/articles/Best+Practices/61537/
December 29, 2012 at 8:05 am
The operator costs within an execution plan will always add up to 100%. If you are running multiple stored procedures in a batch then the relative batch cost of all stored procedures in the batch will add up to 100%. Make sense?
If you have a specific question about the actual execution plan attach the .sqlplan file to this thread so we can have a look.
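If you want to capture that actual plan (and timing information) yourself before posting, a minimal sketch in SSMS looks like the following; the procedure name is a placeholder for whichever SP is slow:

```sql
-- Report IO and elapsed/CPU time per statement in the Messages tab
SET STATISTICS IO, TIME ON;

-- Enable "Include Actual Execution Plan" (Ctrl+M) in SSMS before running,
-- then right-click the plan and use "Save Execution Plan As..." to get a .sqlplan file
EXEC dbo.YourSlowProcedure;  -- placeholder name

SET STATISTICS IO, TIME OFF;
```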
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
December 30, 2012 at 11:13 pm
Can we improve the % cost of the table scan operators in a query execution plan?
December 30, 2012 at 11:56 pm
kapil_kk (12/30/2012)
can we improve the %of table scan elements in Query execution plan?
Maybe if you posted the actual execution plan, people can actually try to help you.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
December 31, 2012 at 3:12 am
It takes about 13 min to execute 🙁
December 31, 2012 at 3:52 am
Hi there,
Firstly, there is quite a big discrepancy between estimated and actual row counts throughout the plan; this (normally) points to out-of-date statistics.
Use "UPDATE STATISTICS" to ensure that they are up to date, then re-execute the query.
Please post the plan again if performance has not improved.
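A minimal sketch of the statistics refresh Dave suggests; the table name comes from later in the thread, and FULLSCAN is optional but gives the most accurate statistics at the cost of a longer scan:

```sql
-- Rebuild all statistics on the problem table from a full scan of its rows
UPDATE STATISTICS dbo.fctTxPayPlan WITH FULLSCAN;

-- Or refresh every statistics object in the database (default sampling)
EXEC sp_updatestats;
```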
Dave
December 31, 2012 at 12:57 pm
Hm, there are a few nasty pieces to this, but the biggest problem is the fctTxPayPlan table.
Estimate: 39,000 rows. Actual: 23 billion.
Big problem there. The table's a heap; get an index on it. The query predicates on dimLocationID and ID, so use those as the leading edge of a clustered index on fctTxPayPlan and you'll get a ton of improvement. Updating statistics here won't help, as there's no index in play and I doubt you've manually created column statistics. Even a nonclustered index on those two fields, including any columns you need from that table, would significantly improve this query.
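A sketch of the nonclustered index suggested above; the index name is a placeholder, and the INCLUDE columns depend on what the query actually returns from this table:

```sql
-- Predicate columns as the leading key; add INCLUDE (<needed columns>)
-- for any columns the query selects from this table, to make it covering
CREATE NONCLUSTERED INDEX IX_fctTxPayPlan_Location_ID
    ON dbo.fctTxPayPlan (dimLocationID, ID);
```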
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA