June 19, 2009 at 3:45 pm
hi,
I have a stored procedure running in a SQL 2005 SP2 environment. The same stored procedure is run many times with different parameters; basically it processes some data day by day. It works fine for a while, e.g. it takes less than 1 minute to complete a day for about 20 days, then suddenly takes 25 minutes to complete the next day and every day after. If I run it again, it may take 1 minute to complete a day for all the days, or the performance drop may start from the 15th day or from the 25th day.
Has anyone had a similar experience?
Thanks in advance.
William
June 19, 2009 at 10:39 pm
Might be that the server is occasionally caching an execution plan based on a parameter value which is not typical for your workload. A plan that works well for one parameter value can be a terrible approach for a different parameter value.
This is a complex subject, so rather than try to cover all the issues here, I would encourage you to read this TechNet article.
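To give a feel for the problem, here is a minimal sketch (all object names are made up) of the kind of procedure that gets bitten by this:

CREATE PROCEDURE dbo.GetOrdersByCustomer
    @CustomerID int
AS
BEGIN
    SET NOCOUNT ON;
    -- The plan is compiled for the first @CustomerID value seen (say one
    -- that matches 5 rows) and then reused for every later call, even a
    -- call whose value matches 5 million rows.
    SELECT OrderID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
END;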
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
June 22, 2009 at 11:58 am
As Paul says, this can get rather complex; however, I have a rule of thumb on these. If the procedure can be called with a wide range of parameter values and the cost of compiling the query is relatively small compared to the cost of execution, I would give serious thought to using the WITH RECOMPILE option for the procedure. This is typically the case with "reporting" procedures, as opposed to very selective procedures.
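For example, a minimal sketch (hypothetical names) of a reporting procedure compiled fresh on every call:

CREATE PROCEDURE dbo.DailySalesReport
    @ReportDate datetime
WITH RECOMPILE  -- build a new plan on every execution; nothing is cached
AS
BEGIN
    SET NOCOUNT ON;
    SELECT ProductID, SUM(Quantity) AS TotalQty
    FROM dbo.Sales
    WHERE SaleDate >= @ReportDate
      AND SaleDate < DATEADD(day, 1, @ReportDate)
    GROUP BY ProductID;
END;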
/*****************
If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek
*****************/
June 22, 2009 at 3:05 pm
Agreed. However, if the majority of calls are 'typical' then you should use WITH RECOMPILE just on the statement that calls the procedure, rather than forcing a recompile every time.
For example, instead of CREATE PROCEDURE dbo.P WITH RECOMPILE,
one would omit the RECOMPILE from the definition and call the procedure like this:
EXECUTE dbo.P WITH RECOMPILE -- for atypical parameter values only
This method keeps the plan that is useful for the majority of parameter values in cache; a procedure called with RECOMPILE on the statement does not cache the plan it produces, nor does it evict the current plan from the cache.
The OPTIMIZE FOR query hint can also be useful in 2005 and 2008 (even more so in 2008 with the UNKNOWN option). See This SQL Server Team Blog Entry for full details on that.
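As a sketch (hypothetical names), inside the procedure body one might write:

SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = @CustomerID
OPTION (OPTIMIZE FOR (@CustomerID = 42));  -- plan as if the value were 42

-- In 2008, OPTIMIZE FOR UNKNOWN optimizes for average density instead:
-- OPTION (OPTIMIZE FOR (@CustomerID UNKNOWN));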
Paul
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
June 24, 2009 at 7:50 am
Thanks for all your replies. I did have a chance to check the web yesterday. I will try Paul's tip and report the result.
Recompiling is what I was guessing at the beginning. But one thing I forgot to mention is that the same stored procedure (it is a report) running on SQL 2000 returned in 2-3 minutes consistently (the client just upgraded from SQL 2000 to 2005 recently).
Could any SQL 2005 behavior changes have caused this issue?
Thanks in Advance,
William
June 24, 2009 at 8:02 am
liuwilli (6/24/2009)
Could any SQL 2005 behavior changes have caused this issue?
William,
The answer is yes, or no - depending on what is in your code 😀
Paul
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
June 24, 2009 at 8:19 am
The problem is determining what is "typical" and which parameter values are not... Also, if you recompile for an atypical parameter value, you must also recompile it "back" when you next get a "typical" value. We looked at that option and found that it was going to be quite a bit of work for a very marginal return. Your mileage may vary...
We had the same issue when we upgraded from SQL2k to 2005. Every once in a while, our reports would take forever to run... It took me a while to figure out what was going on, and even longer to embrace the solution, because I had been so thoroughly conditioned to think "recompiling stored procedures = bad". But after someone pointed out that the cost of recompiling that particular procedure was a small fraction of the cost of executing the query, and downright minuscule compared to the cost of running it with a bad execution plan, I got over it.
Obviously, MS changed a few things about the optimizer in 2005. Generally the changes are beneficial, but no change can be beneficial in all situations. This, of course, is why we tested 2005 extensively before adopting it, but I have to say, we didn't see this particular issue crop up until we went to production. :( I can't give you a detailed explanation of what changed and why, but I can tell you that adding WITH RECOMPILE to a number (not all) of our reporting procedures cured the problem.
/*****************
If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek
*****************/
June 24, 2009 at 8:23 am
Sure.
In some cases, one would want to add RECOMPILE to the procedure.
In others, just to one or more statements in the procedure.
In still others, to the procedure call itself.
I guess that's why so many ways exist 😉
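For the record, the three flavours look something like this (all names are hypothetical):

-- 1. The whole procedure (new plan on every call):
CREATE PROCEDURE dbo.P
    @x int
WITH RECOMPILE
AS
    SELECT Col FROM dbo.T WHERE Col = @x;
GO
-- 2. A single statement inside the procedure:
--    SELECT Col FROM dbo.T WHERE Col = @x OPTION (RECOMPILE);
-- 3. A single call, leaving the cached plan untouched:
EXECUTE dbo.P @x = 123 WITH RECOMPILE;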
Paul
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
June 24, 2009 at 8:32 am
That's a good point, Paul.
If you can determine that the problem comes down to a particular statement, you can use OPTION (RECOMPILE) on that statement, or you can modify the call. You can also use the OPTIMIZE FOR hint or a plan guide. The point is that this is almost certainly a parameterization issue, and there are lots of options for dealing with it in 2005.
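A plan guide lets you attach a hint without touching the procedure text. A sketch (all names hypothetical; note that @stmt must match the statement in the procedure exactly):

EXECUTE sp_create_plan_guide
    @name = N'PG_Orders_OptimizeFor',
    @stmt = N'SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = @CustomerID;',
    @type = N'OBJECT',
    @module_or_batch = N'dbo.GetOrdersByCustomer',
    @params = NULL,
    @hints = N'OPTION (OPTIMIZE FOR (@CustomerID = 42))';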
/*****************
If most people are not willing to see the difficulty, this is mainly because, consciously or unconsciously, they assume that it will be they who will settle these questions for the others, and because they are convinced of their own capacity to do this. -Friedrich August von Hayek
*****************/
June 24, 2009 at 9:19 am
There is a video about how MySpace.com uses SQL Server; it can be found via http://www.sqlservercentral.com/Forums/Topic652576-61-1.aspx
One of the issues discussed in the video was the database option to automatically update statistics, which uses a sample, and its effect on stored procedure plans. If the sample happened to pick a non-typical distribution of data, the query plans would get redone but would be sub-optimal for the actual data. Their solution was to turn off auto-update and update the statistics with a full scan on a regular basis.
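In T-SQL terms, that approach looks something like this (database and table names hypothetical):

ALTER DATABASE MyDatabase SET AUTO_UPDATE_STATISTICS OFF;
GO
-- Then, from a regularly scheduled job, per table:
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;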
There was also another set of posts on SSC where it was confirmed that index rebuilds do not update statistics on non-indexed columns.
SQL = Scarcely Qualifies as a Language
June 24, 2009 at 3:14 pm
Carl Federl (6/24/2009)
One of the issues discussed in the video was the database option to automatically update statistics, which uses a sample, and its effect on stored procedure plans. If the sample happened to pick a non-typical distribution of data, the query plans would get redone but would be sub-optimal for the actual data. Their solution was to turn off auto-update and update the statistics with a full scan on a regular basis.
That is one option. More generally, I think it is good advice to leave auto-update enabled (normally with the async option on too) and to disable it only for specific structures which can be shown to generate materially unrepresentative statistics when a default sample is taken. This can be done using the NORECOMPUTE option of the CREATE STATISTICS and UPDATE STATISTICS statements, but one needs to be careful when doing this.
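A sketch of that per-object approach (hypothetical table and statistics names):

-- Rebuild this statistics object with a full scan and opt it out of
-- automatic updates; auto-update stays on for everything else:
UPDATE STATISTICS dbo.Orders IX_Orders_CustomerID WITH FULLSCAN, NORECOMPUTE;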
Carl Federl (6/24/2009)
There was also another set of posts on SSC where it was confirmed that index rebuilds do not update statistics on non-indexed columns.
Why would they?
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi