January 21, 2004 at 9:03 am
Hi,
We have code that runs quickly in Query Analyzer (QA), but the same code in a stored procedure runs terribly. I have also noticed that the execution plan chosen when the code runs inside the proc is not optimal, while the plan for the same code run in QA is.
I have run sp_recompile on the proc and DBCC DBREINDEX on the table.
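For reference, roughly what I ran (object names are placeholders):

    EXEC sp_recompile 'dbo.MyProc'          -- flag the proc for recompile on next execution
    DBCC DBREINDEX ('dbo.MyTable', '', 90)  -- rebuild all indexes on the table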
Looking for any insight.
Thanks
Mark
January 21, 2004 at 1:55 pm
That sounds like "parameter sniffing" with atypical values. Does your stored procedure have defaults (like null) for its parameters?
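For example, with made-up object names, here is how sniffing a NULL default can go wrong:

    -- Hypothetical procedure: the plan is compiled for whatever
    -- @CustomerID value is sniffed on the first execution.
    CREATE PROCEDURE dbo.GetOrders
        @CustomerID int = NULL
    AS
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID
    GO

    -- If the first call omits the parameter, the plan is built for NULL
    -- and then reused for every typical value passed later.
    EXEC dbo.GetOrders                      -- plan compiled for NULL
    EXEC dbo.GetOrders @CustomerID = 42     -- reuses the NULL plan

One workaround on SQL 2000 is to create the procedure WITH RECOMPILE, which builds a fresh plan on every call at the cost of the compile overhead.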
--Jonathan
January 21, 2004 at 1:57 pm
Also, does the SP create #temp tables that hold many rows or that you process heavily? Loading a lot of data into temp tables can force recompiles partway through the procedure.
Once you understand the BITs, all the pieces come together
January 22, 2004 at 8:06 am
We had the same problem. Are you on SQL 7? If so, you need to move to SQL 2000; most of these problems disappear on SQL 2000.
January 22, 2004 at 11:06 am
Alter your stored procedure and place constant values in your queries, or assign the parameters to local variables inside the procedure, to test whether the procedure compiles with a better plan.
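Something like this (made-up names again). Copying the parameter into a local variable hides the sniffed value, so the optimizer falls back to average-density statistics instead of a plan tuned to the first value passed:

    ALTER PROCEDURE dbo.GetOrders
        @CustomerID int = NULL
    AS
    -- The optimizer cannot sniff @LocalID, so the plan is built
    -- from general column statistics rather than the sniffed value.
    DECLARE @LocalID int
    SET @LocalID = @CustomerID

    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @LocalID
    GO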
January 26, 2004 at 9:15 am
Thanks for all the posts,
_____________________________________________
That sounds like "parameter sniffing" with atypical values. Does your stored procedure have defaults (like null) for its parameters?
______________________________________________
Yes, we are assigning default values to a few of the params, but we always pass in a proper value. There are two params that do have a default of NULL.
Can this affect the query? Come to think of it, it was one of those params that looked odd in the plan. Does anyone have more info on this?
Thank you!
January 26, 2004 at 9:17 am
We are also using SQL 2K AS, and the SP in question uses no temp tables.
Also, good idea on testing with constants.
Thank you