February 7, 2007 at 8:38 pm
Comments posted here are about the content posted at http://www.sqlservercentral.com/columnists/mdillon/2863.asp
May 6, 2007 at 10:35 pm
Thanks for the article, Mike!
Just curious: why haven't you used the WITH RECOMPILE option? I believe you have more flexibility with this option if it is specified in the procedure call statement. Alternatively, a procedure can be created with this option.
And join hints can tell SQL Server to use a particular join every time, just in case.
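For anyone unfamiliar with the options mentioned here, a minimal sketch (the procedure, table, and column names are hypothetical):

```sql
-- Option 1: compile a fresh plan on every execution of the procedure
CREATE PROCEDURE dbo.GetOrdersByDateRange
    @StartDate datetime,
    @EndDate   datetime
WITH RECOMPILE
AS
    SELECT OrderID, OrderDate, Total
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @StartDate AND @EndDate;
GO

-- Option 2: leave the procedure as-is and request a fresh plan per call
EXEC dbo.GetOrdersByDateRange '2007-01-01', '2007-01-31' WITH RECOMPILE;

-- A join hint pins the physical join type, overriding the optimizer's choice
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
INNER HASH JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;
```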
May 6, 2007 at 11:03 pm
You're on the money with your detective work.
However, I agree with Alex that you should have considered the WITH RECOMPILE option with your create procedure statement. That's probably in line with your intent.
I'm curious, did you call the sp_recompile on your stored procedure from within the stored procedure itself?
-Larry
May 7, 2007 at 12:01 am
Great!! Recently one of our stored procedures was behaving in much the same way as described here. I used sp_recompile, which made it much faster, but I didn't know the reason behind it!
A great article for explaining the behaviour of stored procedures!
May 7, 2007 at 3:55 am
Thank you for posting this useful article, Mike. I also had the same experience in the past with the same cause (sp_recompile).
Regards, Vincent
May 7, 2007 at 4:23 am
I am a novice and don't know much about the SQL server engine but Microsoft documentation claims that "Microsoft® SQL Server™ automatically recompiles stored procedures and triggers when it is advantageous to do so."
I found this claim at http://msdn2.microsoft.com/en-us/library/aa238892(SQL.80).aspx
Not sure what to make of it. Could some expert explain?
May 7, 2007 at 5:50 am
Valuable article. Whenever I ran into a situation like this, I always blamed out-of-date statistics, then updated statistics and recompiled procedures. Nice detective work; I have a better understanding now.
Lowell
May 7, 2007 at 6:33 am
This is good info, but I don't see how it matches the article title. Would anyone mind "join"-ing the dots for me?
May 7, 2007 at 6:43 am
Nice. I agree with the others: you should use the WITH RECOMPILE option. What's more, if you are using MS SQL 2005, you can apply RECOMPILE to a single statement rather than to the whole batch.
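A sketch of the statement-level form available in SQL Server 2005 (the procedure and table names are hypothetical):

```sql
CREATE PROCEDURE dbo.GetOrdersByDateRange
    @StartDate datetime,
    @EndDate   datetime
AS
    -- Only this statement is recompiled on each run; the rest of the
    -- procedure keeps its cached plan.
    SELECT OrderID, OrderDate, Total
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @StartDate AND @EndDate
    OPTION (RECOMPILE);
GO
```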
And some comments about the network layer in .NET: if you were having a problem at that layer, I would guess you would see a wait status on your spid of something like 'NETWORKIO'.
Best Regards,
Igor.
May 7, 2007 at 7:05 am
A little off topic. I have seen similar results (slow from .NET but fast in Query Analyzer) that seemed to be solved with "SET ARITHABORT ON". Maybe by doing that I am forcing a recompile, and it is the same problem. Of course, you need to be careful that the procedure doesn't have the potential for a real error that ARITHABORT would handle differently.
Hope that helps someone else. I would welcome any other feedback on how SQL Server and .NET are actually handling these differently at a code level. Maybe there is some other solution that is more appropriate.
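A minimal sketch of the workaround described above (the procedure name is hypothetical; one plausible explanation for the effect is that SQL Server caches plans per combination of SET options, so matching Query Analyzer's default of ARITHABORT ON selects a different cache entry):

```sql
-- Match the session setting Query Analyzer uses by default,
-- then run the problem procedure
SET ARITHABORT ON;
EXEC dbo.MySlowProcedure @SomeParam = 42;
```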
Enjoy,
Brian
May 7, 2007 at 7:08 am
Why did Query Analyzer always run faster? Doesn't QA use the Cache?
I have a similar problem except my stored procedure will run for months in the application then suddenly slow down to the point of timing out.
albanello
May 7, 2007 at 8:29 am
Both QA and the app use the cache, but a cached plan is reused only when exactly the same command text runs, down to the whitespace. This means that when you run from QA, you might force a recompile or get a different plan.
Frank,
We actually saw this at Andy's old company during TechEd 05. We asked Christian Kleinerman of the storage engine team and a few other MS guys about it at the time. They said they've occasionally seen a plan mistakenly flushed and then a bad one chosen under load, especially memory pressure. They didn't have a good way to fix it other than flushing the cache or restarting the server. That seemed to work for Andy, so that's what they did.
If you do that, one suggestion they had was to set up a startup script that runs some important queries before the app comes online, to build up the cache. As mentioned, be sure to use the exact same query text the app is expected to send.
More reading: http://blogs.msdn.com/sqlprogrammability/archive/2007/01/23/4-0-useful-queries-on-dmv-s-to-understand-plan-cache-behavior.aspx
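On SQL Server 2005, the DMVs covered in that link can show what is actually sitting in the plan cache; a sketch:

```sql
-- Requires SQL Server 2005 or later
SELECT cp.usecounts,     -- how many times this plan has been reused
       cp.cacheobjtype,
       cp.objtype,
       st.text           -- the exact cached statement text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
ORDER BY cp.usecounts DESC;
```

Comparing the `text` column against what the app sends makes it easy to spot two "identical" queries that differ only in whitespace and therefore get separate plans.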
May 7, 2007 at 9:47 am
Can someone post a short example of how to use sp_recompile or the WITH RECOMPILE option? I looked at the MS reference, but it wasn't clear enough.
Thanks.
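For reference, a minimal sketch of both (the object names are hypothetical):

```sql
-- sp_recompile: flag a procedure so its cached plan is discarded and
-- rebuilt the next time it runs
EXEC sp_recompile N'dbo.GetOrdersByDateRange';

-- Passing a table name instead marks every procedure and trigger that
-- references the table for recompilation on their next execution
EXEC sp_recompile N'dbo.Orders';

-- WITH RECOMPILE at the call site: this one execution gets a fresh plan,
-- and that plan is not cached for later calls
EXEC dbo.GetOrdersByDateRange '2007-01-01', '2007-01-31' WITH RECOMPILE;
```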
May 7, 2007 at 10:11 am
Actually, the fact that an SP reuses the cached plan is one of the oft-touted reasons to use SPs rather than dynamic SQL: it saves all that time compiling an execution plan.
I've seen cases where compiling the plan took much longer than the actual execution. If your SP breaks a task down to where it runs a whole lot of simple queries of a dynamic nature, then the recompile option might be helpful.
ATB, Charles Kincaid
May 7, 2007 at 10:55 am
If you can guarantee that a result set with a large date range will have a LOT of data, I would probably choose to evaluate the date range in the initial stored procedure and have it call one of two procedures.
If the date range is large, then call Procedure A without a recompile.
If the date range is small, then call Procedure B with a recompile (unless you can also guarantee that a small date range will always have a LITTLE data).
This way you save the time of the recompile because Procedure A should always have the same optimum plan.
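The branching approach described above could be sketched like this (all names and the 30-day threshold are hypothetical):

```sql
CREATE PROCEDURE dbo.GetOrders
    @StartDate datetime,
    @EndDate   datetime
AS
    IF DATEDIFF(day, @StartDate, @EndDate) > 30
        -- Large range: reuse the stable "big result set" plan
        EXEC dbo.GetOrders_Large @StartDate, @EndDate;
    ELSE
        -- Small range: pay for a recompile to get a range-appropriate plan
        EXEC dbo.GetOrders_Small @StartDate, @EndDate WITH RECOMPILE;
GO
```

Because each inner procedure only ever sees one class of parameter values, each one settles on a plan suited to its workload.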