July 7, 2011 at 4:21 pm
I have a proc whose plan contains 62 statements; it runs in ~16 minutes when executed from SSMS. When I copy the code out of the proc, declare a set of local variables that match the proc's input parameters, and set them to the same values I passed into the proc, the script runs in ~2.5 minutes.
I saved the actual execution plans from both runs and compared them. Five queries from the proc call accounted for 85% of the total cost...and all five had a different plan than the same statement produced when run in the context of the script.
I am executing both under the same login.
Things I have tried:
> add WITH RECOMPILE to the proc
> add OPTION (RECOMPILE) to the 5 statements with different plans
> run the proc, then the script in the same SSMS query window, one after the other
> run the script, then the proc in the same SSMS query window, one after the other
> run the proc, then the script, in two separate newly opened query windows to make sure I had the same server options and no residue
> run the script, then the proc, in two separate newly opened query windows to make sure I had the same server options and no residue
Each of these attempts yielded the exact same plans from the proc execution: a different, more expensive plan than the one produced when running the same statement as a script. I am at a loss as to why I could be receiving different plans.
Any ideas?
BuildClrVersion v2.0.50727
Collation SQL_Latin1_General_CP1_CI_AS
ComparisonStyle 196609
Edition Standard Edition (64-bit)
EditionID -1534726760
EngineEdition 2
IsClustered 0
IsFullTextInstalled 1
IsIntegratedSecurityOnly 0
IsSingleUser 0
LCID 1033
ProductVersion 9.00.3080.00
ProductLevel SP2
SqlCharSet 1
SqlCharSetName iso_1
SqlSortOrder 52
SqlSortOrderName nocase_iso
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
July 7, 2011 at 4:29 pm
Update statistics?
The proc can do parameter sniffing, meaning it looks at the values of the parameters and checks those against the stats to get a row estimate.
The standalone statements can't sniff, because they use local variables. Hence you get a rougher estimate based on the general data distribution rather than a specific value.
The former tends to be a lot more sensitive to bad stats than the latter.
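The difference Gail describes can be sketched with a hypothetical proc (the table and column names here are illustrative, not from the original thread):

```sql
-- Parameter: the optimizer sniffs the actual value of @City at compile
-- time and reads the statistics histogram for that specific value.
CREATE PROCEDURE dbo.GetOrdersByCity
    @City varchar(50)
AS
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE City = @City;
GO

-- Variable: the value is unknown at compile time, so the optimizer
-- falls back to the density vector (an average across all values).
DECLARE @City varchar(50);
SET @City = 'Springfield';

SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE City = @City;
```

With a skewed column, those two estimates can differ by orders of magnitude, which is why the same statement can get two very different plans.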
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 7, 2011 at 4:40 pm
Dang...it was sniffing! As soon as I added a set of local variables to the proc and set them to the values of the input parameters, I started getting the same plan as with the script. I didn't think to try that because I thought the RECOMPILE option would have handled it...thank you again Gail!
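For anyone hitting the same thing, the workaround looks roughly like this (a sketch; the proc and names are hypothetical stand-ins for the real one):

```sql
-- Copying each input parameter into a local variable hides the value
-- from the optimizer, so it can no longer sniff it and instead uses
-- the average-density estimate -- the same one the script gets.
CREATE PROCEDURE dbo.GetOrdersByCity
    @City varchar(50)
AS
    DECLARE @LocalCity varchar(50);
    SET @LocalCity = @City;   -- breaks parameter sniffing

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE City = @LocalCity;
GO
```

Note that WITH RECOMPILE doesn't prevent this: it forces a fresh plan each execution, but each fresh plan still sniffs the parameter values.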
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
July 7, 2011 at 4:45 pm
Try updating statistics (with FULLSCAN if necessary) and putting the parameters back in the queries.
If parameter sniffing causes a problem on a fresh compile, it means something is wrong with the cardinality estimates, and that's most likely bad statistics.
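The stats update Gail suggests would look something like this (table name illustrative):

```sql
-- Refresh statistics on one table, reading every row rather than a sample:
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Or refresh all statistics in the current database (sampled by default):
EXEC sp_updatestats;
```

FULLSCAN is slower but gives the optimizer an exact histogram, which matters most when sampled stats are misrepresenting a skewed column.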
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 8, 2011 at 12:43 pm
Long term I definitely want to remove the local variable swapping I just added. I'm still in the process of learning the data in the tables in question and what is going on with this instance in terms of fragmentation and stats maintenance. It won't hurt to get a stats update on the maintenance schedule in the meantime though...I'll be asking for that as soon as I can. I added this thread to my SSC Briefcase and will let you know how it goes down the line. Thanks as always Gail.
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
July 9, 2011 at 4:51 am
opc.three (7/8/2011)
Long term I definitely want to remove the local variable swapping I just added.
Why? :blink: It works just fine the way you did it and it's a whole lot more bullet-proof now.
--Jeff Moden
Change is inevitable... Change for the better is not.
July 9, 2011 at 6:09 am
Jeff Moden (7/9/2011)
opc.three (7/8/2011)
Long term I definitely want to remove the local variable swapping I just added.
Why? :blink: It works just fine the way you did it and it's a whole lot more bullet-proof now.
And likely less efficient.
Parameter sniffing combined with accurate stats can produce optimal plans. Variables, because the optimiser has to guess cardinality from averages, can be far worse.
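You can see both estimates the optimiser chooses between by inspecting the statistics object directly (object and statistic names here are hypothetical):

```sql
-- Shows the header, density vector, and histogram for one statistic.
-- The density vector drives the local-variable estimate;
-- the histogram drives the sniffed-parameter estimate.
DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_City');
```

Comparing the density-based average row count against the histogram step for a specific value makes it obvious how far apart the two plans' estimates can be.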
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 19, 2011 at 7:01 pm
I won't go into why it took this long to get a few UPDATE STATISTICS statements run in production, but the results are finally in.
I removed the parameter switching, and after running the stats updates performance improved to the point where the proc now runs in ~1 minute 😀
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato