August 8, 2007 at 4:12 pm
My team developed SQL code to populate a table for reporting purposes. The code is fairly typical of what we've produced to pull information out of our application: two INSERT INTO ... SELECT statements, each one extracting data from several joined tables, with a WHERE clause and a GROUP BY clause. The results of the script were what we expected, and the script took about 10 minutes to run.
We took that code and wrapped it in a stored procedure. When we executed the stored procedure, it took over 7 hours to run.
Both approaches place the same number of records in the result table. I pulled the code back out of the stored procedure and ran it again as a script, to make sure we hadn't made any fiddly little changes along the way. That script still runs in about 10 minutes.
The execution plans differ slightly in the specifics, but are otherwise quite similar.
Here are the differences I have been able to find:
1) When I run the code as a script, I get the following messages:
----------
(88655 row(s) affected)
(86190 row(s) affected)
----------
However, running as a stored procedure, I get:
----------
(88655 row(s) affected)
(37 row(s) affected)
(86190 row(s) affected)
(37 row(s) affected)
----------
2) For the script, the execution plan's row estimates are much more accurate than the stored procedure's. In fact, the stored procedure's query plan seems to show an estimated row count of 1 for every step.
Has anyone seen something like this before? Any advice as to where to look?
R David Francis
August 8, 2007 at 7:25 pm
Could be a couple of things, but let's go for the simple side, first.
Try adding the WITH RECOMPILE option to the stored proc... you may be fighting something bad that was cached.
Try DBCC FREEPROCCACHE... same reason.
If that fails, you'll need to post the code. Of course, the code may be part of the problem... do you see any missing or out-of-date statistics in the execution plan?
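For reference, the two suggestions above look roughly like this. The procedure name and parameters are made up for illustration, since the original code wasn't posted:

```sql
-- Hypothetical procedure; WITH RECOMPILE forces SQL Server to build a
-- fresh plan on every execution, so a bad cached plan can't be reused.
ALTER PROCEDURE dbo.usp_PopulateReport
    @StartDate datetime,
    @EndDate   datetime
WITH RECOMPILE
AS
BEGIN
    -- the original INSERT INTO ... SELECT statements would go here
    SELECT 1;  -- placeholder body for the sketch
END
GO

-- Or flush every cached plan on the instance.  This is heavy-handed on a
-- busy server, since all plans (not just the bad one) must be recompiled.
DBCC FREEPROCCACHE;
```

WITH RECOMPILE trades a small compile cost per call for never being stuck with a stale plan, which is often a fine deal for a long-running reporting proc.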
--Jeff Moden
Change is inevitable... Change for the better is not.
August 8, 2007 at 9:34 pm
Are you running it against the same database or against the copy in DEV?
_____________
Code for TallyGenerator
August 9, 2007 at 8:00 am
Parameter Sniffing, huh? Well, you learn something new every day!
Yes, I had taken the parameters for the stored procedure and declared them as variables when I ran this as a script.
Creating local variables, copying my parameter values into them, and then using the variables instead of the parameters in my code brought me back down to the 10-15 minute range (which is the expected duration of the procedure).
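For anyone else who hits this, the workaround described above looks something like the sketch below. The procedure, table, and column names are all hypothetical, since the original code wasn't posted:

```sql
CREATE PROCEDURE dbo.usp_PopulateReport  -- hypothetical names throughout
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    -- Copy the parameters into local variables.  The optimizer cannot
    -- "sniff" the values of local variables at compile time, so it falls
    -- back to average-density estimates instead of building a plan
    -- tailored to whatever parameter values the first caller happened
    -- to pass in.
    DECLARE @LocalStart datetime, @LocalEnd datetime;
    SET @LocalStart = @StartDate;
    SET @LocalEnd   = @EndDate;

    -- Reference only the local variables in the actual queries.
    INSERT INTO dbo.ReportTable (CustomerID, OrderTotal)
    SELECT o.CustomerID, SUM(o.Amount)
    FROM dbo.Orders o
    INNER JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
    WHERE o.OrderDate BETWEEN @LocalStart AND @LocalEnd
    GROUP BY o.CustomerID;
END
```

The trade-off is that the plan is no longer tuned to any particular parameter values, which usually helps when the values vary a lot from call to call, and hurts when one "typical" value dominates.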
It's kind of weird that I haven't run into this before. I've written hundreds of stored procedures, usually testing them out as scripts first, and I've never noticed this sort of thing; now I've got at least three procedures in one week that I suspect are running into it.
Anyone have any knowledge of what triggers this? I mean, why one procedure with parameters would be fine, and another would be slow?
R David Francis
August 9, 2007 at 8:05 am
Oh - and thanks to all who responded, and especially to Vladan, whose tip resolved my issue!
R David Francis
August 9, 2007 at 9:53 am
We had an issue here (with SQL 2005) where this happened. We declared variables inside the proc, assigned them the values passed into the procedure, and it was fast again... Go figure.