February 23, 2008 at 1:49 pm
Did you play a bit with the PARAMETERIZATION option?
ALTER DATABASE yourdb
SET PARAMETERIZATION FORCED
http://msdn2.microsoft.com/en-us/library/ms175037.aspx
Maybe it's this kind of query they tried to tackle with this option :hehe:
PS
If the recompile takes too long, or
if you know a certain plan is preferable in most cases,
you could even add a parameter optimisation hint:
OPTION (OPTIMIZE FOR (@variable_name = literal_constant))
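For instance (a minimal sketch with invented table and column names; the hint tells the optimizer to build the plan as if the typical value had been supplied):
DECLARE @OrderDate DATETIME
SET @OrderDate = '20080101'

-- Plan is compiled for the hinted literal, not the sniffed/unknown value
SELECT OrderID
FROM dbo.Orders
WHERE OrderDate >= @OrderDate
OPTION (OPTIMIZE FOR (@OrderDate = '20080101'))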
Johan
Learn to play, play to learn !
Don't drive faster than your guardian angel can fly ...
but keeping both feet on the ground won't get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀
Need a bit of Powershell? How about this
Who am I ? Sometimes this is me but most of the time this is me
February 23, 2008 at 5:16 pm
Thanks, Johan,
I took a look at the forced parameterization thing... looks real spooky when they say things like it could slow down the whole world, use with great caution, etc. I believe I'll steer clear of that one, especially since it's a database-wide option.
I have to take back a bit of what I said... I've found an instance where the OPTION (RECOMPILE) doesn't fix things and the OPTIMIZE FOR option had no effect either. Of course, Michael's suggestion of using Dynamic SQL takes care of the whole shebang but, wow, what a mistake the optimizer makes and, wow, what a PITA to overcome.
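For anyone following along, the dynamic SQL fix boils down to something like this (a sketch with made-up names, not Michael's actual code): the value gets pasted in as a literal, so the optimizer compiles a plan for that exact value and never has to guess.
DECLARE @Value INT, @sql NVARCHAR(4000)
SET @Value = 42

-- Inline the value as a literal; the plan is compiled for it
-- exactly as if it had been hardcoded
SET @sql = N'SELECT SomeCol FROM dbo.SomeTable WHERE KeyCol = '
         + CAST(@Value AS NVARCHAR(10))

EXEC (@sql)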
I'm still playing around with a couple of these problems... thanks for all the feedback, folks.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 23, 2008 at 11:04 pm
Yeah, OPTION (RECOMPILE) is the one I meant... I should have said so, really!
What is the case where a recompile didn't work? I had a problem recently where a recompile alone couldn't help because the statistics were "flawed", and I ended up using a temp table to resolve it.
It involved a permissions table with potentially millions of records, where the majority of users had fewer than 100 records but some admin users had more than 20,000. Because statistics histograms only hold up to 200 steps, admin users would sometimes get terrible plans even with the recompile. OPTION (OPTIMIZE FOR) would have been a good option if there were always an appropriate admin entry in the table, but that couldn't be guaranteed. By inserting the permissions records into a temp table first, the compiler had accurate information to produce a good plan.
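The shape of it was roughly this (made-up table and column names, not the real schema):
-- Stage the current user's permission rows first. The temp table gets
-- its own statistics, so the optimizer sees the actual row count
-- (100 rows or 20,000+) instead of guessing from the 200-step histogram.
SELECT PermissionID, ResourceID
INTO #UserPerms
FROM dbo.Permissions
WHERE UserID = @UserID

-- This join now compiles with an accurate cardinality estimate
SELECT r.ResourceName
FROM #UserPerms up
JOIN dbo.Resources r
  ON r.ResourceID = up.ResourceID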
Having more than 200 steps in a histogram as an option would actually be a nice enhancement for a future release ... or perhaps even to allow us to create our own stats and use them like plan guides.
February 24, 2008 at 12:02 am
matt stockham (2/23/2008)
I had a problem recently that a recompile alone couldn't help because the statistics were "flawed", and I ended up using a temp table to resolve it.
That's precisely how I ended up resolving it... nothing else worked... not the recompile nor the optimize for.
I guess lots of folks would look down their noses at using a temp table in such a fashion, but TempDB is also where a CTE or a derived table lives. Even the FORCE ORDER hint didn't work, so the semi-hard-coding of materializing an interim result set appears to be absolutely necessary. Fortunately, this is for batch processing and it only takes a bit over 100 milliseconds to do. That's a lot better than the 10+ minutes it was taking with the variable-sniffing problem.
Oh, the problem... I'm working on an article on faster ways to calculate workdays, future delivery dates, and "need to order by" dates. One of the methods ends up doing a self-join to the calendar table (no triangles), and the variable gets added to an offset. If the variable is replaced by a hardcoded constant, no problem... it takes just seconds to resolve a million rows. If it remains a variable, it takes more than 10 minutes and I end up killing the run.
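Stripped way down, the pattern looks like this (not the article's actual tables or columns):
DECLARE @Offset INT
SET @Offset = 5

-- Fast: hardcoded constant in the join offset... seconds for a million rows
SELECT c1.CalendarDate, c2.CalendarDate AS DeliveryDate
FROM dbo.Calendar c1
JOIN dbo.Calendar c2
  ON c2.WorkDayNumber = c1.WorkDayNumber + 5

-- Slow: identical query with the variable... 10+ minutes before I kill it
SELECT c1.CalendarDate, c2.CalendarDate AS DeliveryDate
FROM dbo.Calendar c1
JOIN dbo.Calendar c2
  ON c2.WorkDayNumber = c1.WorkDayNumber + @Offset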
I'll be sure to point it out in the article.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 24, 2008 at 11:44 pm
Jeff Moden (2/23/2008)
That's kinda the conclusion I came to. Thanks for the great explanation, Gail.
Glad to be of help.
You don't happen to have a good URL for how to overcome parameter sniffing, do you? I like Matt's idea because not all recompiles are bad and it seems to fit perfectly for something like this, but I don't want to limit possibilities... knowing more than one method is always good.
I blogged on parameter sniffing a while back. The options I know of for fixing it:
- Use variables instead of parameters. If you have data skew in the table, you won't get a plan that's optimal for one value and very bad for another; the downside is that you'll never get a really good plan either.
- OPTION (OPTIMIZE FOR ...) if you know that one parameter value is more likely than the others.
- OPTION (RECOMPILE) if you know there is no single optimal plan for the query.
For what you encountered (which is kinda the opposite of parameter sniffing), I'd say your options are to use parameters instead of variables (if possible), OPTION (OPTIMIZE FOR ...), or OPTION (RECOMPILE). Rough sketches of each are below.
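Sketches only (dbo.Orders, the columns, and the value of 42 are all invented):
-- 1) Local variable instead of the parameter: the plan is built from
--    the column's average density... stable, but never tuned to one value
CREATE PROCEDURE dbo.GetOrders_Variable @CustID INT
AS
DECLARE @LocalID INT
SET @LocalID = @CustID
SELECT OrderID FROM dbo.Orders WHERE CustomerID = @LocalID
GO

-- 2) OPTIMIZE FOR: compile the plan as if the most common value was passed
CREATE PROCEDURE dbo.GetOrders_OptimizeFor @CustID INT
AS
SELECT OrderID FROM dbo.Orders WHERE CustomerID = @CustID
OPTION (OPTIMIZE FOR (@CustID = 42))
GO

-- 3) RECOMPILE: pay the compile cost on every call and get a fresh plan each time
CREATE PROCEDURE dbo.GetOrders_Recompile @CustID INT
AS
SELECT OrderID FROM dbo.Orders WHERE CustomerID = @CustID
OPTION (RECOMPILE)
GO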
That's what I can think of. There may be other tricks I haven't encountered before.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability