September 27, 2004 at 10:06 am
I have a question regarding a situation that we ran into (and solved)
recently that has got me wondering what's really happening.
We ran into a stored procedure that seemed to be exhibiting a
parameter sniffing issue (sorta). The stored procedure was performing
poorly in that it took about twenty minutes to execute. We took the
bulk of the code out and ran it as a script, and it took about one minute to execute. These results were consistent no matter which executed first.
We looked at the estimated execution plan but saw nothing that would suggest poor use of indexes, etc.
It seemed to me that this could be a parameter sniffing issue, BUT the
stored procedure performs poorly even on the first execution (when the plan should be tailored to the data passed in). Our first attempt was to add the WITH RECOMPILE option, but there was no improvement. At that point it seemed like it was not parameter sniffing. Several futile attempts later, we tried simply copying the parameters into local variables and using the locals in the body of the procedure. This fixed the problem.
Question: Can anyone explain the difference in execution time between the procedure and the script? Why does the use of local variables help?
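For anyone following along, the workaround described above looks roughly like this (the procedure, table, and column names here are invented for illustration, not the poster's actual code):

```sql
-- Hypothetical proc showing the local-variable workaround.
CREATE PROC dbo.GetOrdersByCustomer
    @CustomerID int
AS
    -- Copy the parameter into a local variable. The optimizer cannot
    -- "sniff" the value of a local variable at compile time, so the
    -- plan is built from average density statistics rather than from
    -- the specific value passed in on this call.
    DECLARE @LocalCustomerID int
    SET @LocalCustomerID = @CustomerID

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @LocalCustomerID
GO
```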
September 27, 2004 at 11:13 am
Were the local variables the exact same type as the parameters? I assume that when you checked the execution plan you used an EXEC of the SP rather than copied SQL, right? And if you now change the SP back to not use the local variables, it performs poorly once more, correct? (Perhaps someone else changed something without your knowledge, or statistics were updated.)
It sounds like this isn't likely the case for you (since you didn't mention making this type of change), but in the past we had similar symptoms that were the result of the QUOTED_IDENTIFIER and ANSI_NULLS settings. A stored procedure carries its own values for those settings, captured at creation time, which can differ from the defaults in Query Analyzer. This can cause the SP to behave differently than the exact same SQL copied and pasted into Query Analyzer.
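As a quick sketch of how those settings get baked into a procedure (the procedure name is invented; OBJECTPROPERTY is the SQL 2000 way to check what was captured):

```sql
-- These SET options are captured when the procedure is created,
-- not read from the session at execution time.
SET QUOTED_IDENTIFIER OFF
SET ANSI_NULLS OFF
GO
CREATE PROC dbo.DemoSettings AS
    SELECT 1
GO

-- Check what was captured at CREATE time:
SELECT OBJECTPROPERTY(OBJECT_ID('dbo.DemoSettings'), 'ExecIsQuotedIdentOn'),
       OBJECTPROPERTY(OBJECT_ID('dbo.DemoSettings'), 'ExecIsAnsiNullsOn')
-- Both report 0 here, even if your QA session later runs with them ON.
```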
September 27, 2004 at 12:51 pm
>>> QUOTED_IDENTIFIER and ANSI_NULLS settings <<<
Now that is food for thought. I use a template for defining stored procedures and it does explicitly set these two values before performing the CREATE PROC.
Thanks for the input...
September 28, 2004 at 7:24 am
Generally, your experience goes against what I'd normally expect.
I assume you're running with sql2k ?
All I know says that if you replace parameters with local variables the proc will run much worse, not better. I assume there are no implicit data conversions.
You say you took most of the code, so what didn't you take? I usually paste a proc into QA, change the params into DECLAREs and set a known set of values, comment out the CREATE PROC and AS lines etc., and run it - this usually runs slower than the proc.
Things to look for that will kill code: functions used in SELECT statements where the function itself does a SELECT - generally you want to be careful of any functions, of temp tables which cause recompiles, of DML, etc., as covered in TechNet.
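To illustrate that first point, this is the kind of pattern being warned about (all names invented): a scalar function that runs its own SELECT, invoked once per row of the outer query.

```sql
-- A scalar UDF that performs a SELECT of its own (names are invented).
CREATE FUNCTION dbo.fn_OrderTotal (@OrderID int)
RETURNS money
AS
BEGIN
    DECLARE @Total money
    SELECT @Total = SUM(Quantity * UnitPrice)
    FROM dbo.OrderDetails
    WHERE OrderID = @OrderID
    RETURN @Total
END
GO

-- Calling it per row hides the inner SELECT from the optimizer and
-- executes it once for every row returned - often a silent killer.
SELECT OrderID, dbo.fn_OrderTotal(OrderID)
FROM dbo.Orders
```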
You might want to examine what happens in Profiler as against the graphical showplan - you'll sometimes get a better feel for what your proc is doing (set all the showplan options etc. in Profiler).
Profiler will also better enable you to see which part of the proc is running longest.
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
September 28, 2004 at 11:29 am
Is your server fully service packed? There was a problem with SQL 2K where it would ignore indexes for certain SPs where a parameter was passed in and used in the WHERE clause. I came across this problem a couple of years ago and it was fixed by the service pack.
September 29, 2004 at 12:03 pm
dgentry,
The environment is SQL 2000 SP3 but there is some post-SP3 patch that has not been applied.
Colin,
>>>You say you took most of the code, so what didn't you take? - I usually paste a proc into QA. change the params into declares and set a known set of values, comment out the create proc and as lines etc. and run<<<
This is exactly what I did.
Aaron,
>>>Were the local variables the exact same type as the parameters?<<<
Yes!
>>>I assume when you checked the execution plan you used an EXEC of the SP instead of copied SQL, right?<<<
No. Do you have reliable experience that the results would be different?
Thanks to all...
September 29, 2004 at 12:27 pm
Assuming the parameters were kept as variables and not hard-coded into the SQL, I personally haven't seen any differences between the EXEC of a SP and its contents copied directly into QA, with the exception of the QUOTED_IDENTIFIER/ANSI_NULLS issue. My main point is that the SQL runs differently when taken out of the SP, so looking at the plan for the SQL outside of the SP will not tell you what the SP version is trying to do. Get the plan for the EXEC of the SP and compare it to the plan for the pasted code. The difference (hopefully there is one) may clue you in to where the problem resides.
If you want a text based plan to compare and can afford to actually execute the SP instead of just getting an estimated plan you can use one or more of the following:
SET STATISTICS TIME ON
SET STATISTICS IO ON
SET STATISTICS PROFILE ON
I'm not aware of how to obtain an estimated plan in text format. If anyone does know I'd appreciate the info.
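For reference, a typical usage pattern for those options wrapped around an EXEC (the procedure name and parameter are invented). SQL 2000 does also have SET SHOWPLAN_TEXT ON and SET SHOWPLAN_ALL ON, which return an estimated plan as text without executing the batch, if that helps:

```sql
-- Actual-execution statistics: the batch really runs.
SET STATISTICS TIME ON
SET STATISTICS IO ON
SET STATISTICS PROFILE ON
GO
EXEC dbo.MyProblemProc @Param = 42   -- invented name/parameter
GO
SET STATISTICS TIME OFF
SET STATISTICS IO OFF
SET STATISTICS PROFILE OFF
GO

-- Estimated plan as text, WITHOUT executing the statements:
SET SHOWPLAN_TEXT ON
GO
EXEC dbo.MyProblemProc @Param = 42
GO
SET SHOWPLAN_TEXT OFF
GO
```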
September 30, 2004 at 2:14 am
You can still develop the code for a proc in QA, and the results/plan will be the same (for the same data).
To really see what is happening you need to run a profiler trace when you run the code and capture the plan text within profiler - beware that the output is quite vast, but if you really want to see this is how.
I never use estimated plans for debugging btw.
The only other factor which may affect results is if the plan is in cache or not and if the data is in cache. Probably a silly question but your proc doesn't start with sp_ ?
September 30, 2004 at 10:14 am
Colin,
Thanks, again, for your response.
>>>I never use estimated plans for debugging btw.<<<
I'll keep that in mind.
>>>Probably a silly question but your proc doesn't start with sp_ ?<<<
There are no silly questions (plenty of silly answers, though...)
My answer is threefold - No, No and Hell NO! 😉
October 10, 2004 at 2:32 pm
Shendricks>
I can confirm this problem and I do not know why. It seems there is a problem where the optimizer 'guesses' the value of the input parameters and yields a sub-optimal execution plan.
Look at the differences in the join types used and in the estimated rowcounts between the version of the SP where the input parameters are used directly and the version where they are copied into local variables and the locals used instead of the passed parameters.
I know of no other solution than doing the copying. If you find out, please let me know.
Regards, Hanslindgren!