April 24, 2006 at 12:22 pm
I have a vendor-supplied proc that we use for reporting. Essentially, this proc has about 25 different IF statements, triggered by the procedure's input parameters, that dictate which of 25 versions of a SQL statement is going to run. Each statement is essentially the same query with nominal differences according to the input params. The base query is a JOIN of 15 tables, give or take a couple depending on which "IF" evaluates to true.
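To give a feel for the shape of it, here's a stripped-down sketch (all names are invented, not the vendor's actual code):

CREATE PROCEDURE dbo.usp_VendorReport
    @ReportType int
AS
IF @ReportType = 1
    SELECT o.OrderID, c.CustomerName   -- ...more columns
    FROM dbo.Orders o
    JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
    -- ...roughly a dozen more joins...
    WHERE o.Region = N'EAST'

IF @ReportType = 2
    SELECT o.OrderID, c.CustomerName
    FROM dbo.Orders o
    JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
    WHERE o.Region = N'WEST'

-- ...and so on, about 25 branches in all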
Obviously, one danger of this method is that the plan compiled for the first set of parameters may not be the right access path for subsequent calls, thus leading to a non-optimal plan.
However, in this case, the procedure recompiles often, and it takes 10 to 20 minutes to compile, depending on other server workload.
Even the non-optimal plans are better than this compilation overhead. So, what triggers it? I thought an aged plan, invalid statistics, or a large percentage of data changes to one of the tables would kick off a new compilation. However, in this case, I don't think any of the above is actually occurring.
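For what it's worth, here's roughly how I've been ruling out stale statistics (table name is just a placeholder for one of the 15):

-- When were statistics last updated on a given table?
SELECT name AS stats_name,
       STATS_DATE(id, indid) AS last_updated
FROM sysindexes
WHERE id = OBJECT_ID('dbo.Orders')
  AND indid > 0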
Any ideas?
Ryan
April 24, 2006 at 12:32 pm
There is a WITH RECOMPILE option on the CREATE PROCEDURE command. This causes the procedure to generate a new plan on every execution.
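For example (proc name is just illustrative):

-- Forces a fresh plan on every execution, so no plan is cached:
CREATE PROCEDURE dbo.usp_ReportDemo
    @ReportType int
WITH RECOMPILE
AS
    SELECT @ReportType AS ReportType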
Tom
April 24, 2006 at 12:40 pm
WITH RECOMPILE has not been specified on the procedure, nor is anyone running sp_recompile against it manually.
The proc seems to get recompiled several times per day... not every time it runs.
RH
April 24, 2006 at 2:15 pm
I found what I need in BOL under SQL Server Architecture: Execution Plan Caching and Reuse.
Now, does anyone know how to expose the aging metrics? I would like to interrogate these stats so I can see how quickly the plan is being aged and deallocated.
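The closest I've gotten so far is querying syscacheobjects for reference and use counts. If I'm reading BOL right, the internal aging cost itself isn't exposed, so this only shows how heavily each plan is being reused:

-- Cached plans for the current database (syscacheobjects lives in master):
SELECT cacheobjtype,
       objtype,
       OBJECT_NAME(objid) AS object_name,
       refcounts,
       usecounts,
       pagesused
FROM master.dbo.syscacheobjects
WHERE dbid = DB_ID()
ORDER BY usecounts DESC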
Thanks. RH