February 22, 2010 at 1:21 pm
I was giving a friend some advice about dynamic SQL and I want to make sure I was correct.
So, if I have a proc written with dynamic SQL, the optimized execution plan will be based on the actual generated SQL string; therefore, the only time that cached execution plan will be reused is when the proc call ends up with the EXACT same generated SQL string.
So, for example, aGetResults @pagenum = 2 and aGetResults @pagenum = 3 would not use the same execution plan if written with dynamic SQL.
However, if it was NOT written with dynamic SQL, then parameter sniffing would allow aGetResults @pagenum = 2 and aGetResults @pagenum = 3 to both use the same optimized execution plan. That means each plan would be shared by more variations of the proc call, so the cached plan would be reused more often, improving performance.
Are my assumptions correct? I'm usually close, but off by a little.
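To illustrate what I mean by the concatenated case (the table and column names here are made up for illustration): because the value is embedded in the string, each distinct value produces a distinct SQL text, and each distinct text gets its own cached plan.

```sql
-- Concatenated dynamic SQL: the value is part of the text,
-- so @pagenum = 2 and @pagenum = 3 produce two different strings
-- and therefore two separate cached plans.
DECLARE @pagenum INT = 2;
DECLARE @sql NVARCHAR(MAX);

SET @sql = N'SELECT * FROM dbo.Results WHERE PageNum = '
         + CAST(@pagenum AS NVARCHAR(10)) + N';';

EXECUTE (@sql);  -- ad-hoc plan, keyed on the exact statement text
```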
February 22, 2010 at 1:53 pm
If you use sp_executesql, the dynamic string can be set up so that all variables are parameters, and the parameters are also passed in through sp_executesql. As you add/drop lines in the creation of the dynamic SQL, and thus change the actual SQL statement itself, you will get a new plan. But as long as the string is the same and you just have different parameters, the plan will be re-used.
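Something like this (dbo.Results and its columns are invented for illustration): the statement text stays constant, so every call with that same text reuses one cached plan regardless of the parameter values.

```sql
DECLARE @sql NVARCHAR(MAX) = N'
    SELECT *
    FROM dbo.Results
    WHERE PageNum = @pagenum;';

-- Both calls share the same statement text,
-- so they share one cached plan.
EXECUTE sp_executesql @sql, N'@pagenum INT', @pagenum = 2;
EXECUTE sp_executesql @sql, N'@pagenum INT', @pagenum = 3;
```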
Wayne
Microsoft Certified Master: SQL Server 2008
Author - SQL Server T-SQL Recipes
February 22, 2010 at 2:03 pm
I agree with Wayne on this one: if you want to use the same plan, sp_executesql is the way to go. And if you want different plans based on certain different parameters (i.e. lookup by InvoiceDate and not InvoiceNum), then you have to dynamically build your string and pass the string to sp_executesql.
If a plan was already built for this EXACT query, then the plan will be used and fed the parameters you sent in; otherwise, it will create a new plan to match the exact string you have built in your dynamic stored procedure.
I used this a few times when users wanted to search on really different fields in the same table, but they never wanted to fill everything in. This performed much better, since you don't have to fill every parameter, just add them dynamically, and look for index seeks!
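Roughly like this (the table and parameters are hypothetical): only the filters the user actually supplied get appended to the string, and every variant still goes through sp_executesql, so each variant's plan can be reused.

```sql
CREATE PROCEDURE dbo.SearchInvoices
    @InvoiceNum  INT  = NULL,
    @InvoiceDate DATE = NULL
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX) = N'
        SELECT InvoiceNum, InvoiceDate, Amount
        FROM dbo.Invoices
        WHERE 1 = 1';

    -- Append only the predicates that were supplied;
    -- each combination yields its own reusable plan.
    IF @InvoiceNum IS NOT NULL
        SET @sql += N' AND InvoiceNum = @InvoiceNum';
    IF @InvoiceDate IS NOT NULL
        SET @sql += N' AND InvoiceDate = @InvoiceDate';

    EXECUTE sp_executesql @sql,
        N'@InvoiceNum INT, @InvoiceDate DATE',
        @InvoiceNum, @InvoiceDate;
END;
```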
Hope that helps,
Cheers,
J-F
February 22, 2010 at 2:40 pm
WayneS (2/22/2010)
If you use sp_executesql, the dynamic string can be set up so that all variables are parameters, and the parameters are also passed in through sp_executesql. As you add/drop lines in the creation of the dynamic SQL, and thus change the actual SQL statement itself, you will get a new plan. But as long as the string is the same and you just have different parameters, the plan will be re-used.
True, with an additional caveat: if the two sets of parameters yield vastly different numbers of rows, the plan will be invalidated and rebuilt as well. In other words, if only the parameter values are different, the plan will be EVALUATED and, if efficient, will be used.
That is, if @param1 = 1 would yield 50 rows and @param1 = 2 yields 100,000 rows, the plan may get tossed out and rebuilt to a better plan based on the higher cardinality.
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
February 22, 2010 at 3:22 pm
OK, thanks guys. I will have to play around with sp_executesql, pass in some params, and see what happens.
February 24, 2010 at 3:06 pm
If you use sp_executesql, the dynamic string can be set up so that all variables are parameters, and the parameters are also passed in through sp_executesql. As you add/drop lines in the creation of the dynamic SQL, and thus change the actual SQL statement itself, you will get a new plan. But as long as the string is the same and you just have different parameters, the plan will be re-used.
Wayne
==========================================================================
So what you're saying is that if the final dynamic SQL output string is
select fname
from customers
where lname = @lname
the optimizer will use the same optimized execution plan independent of the value of @lname. Yeah, that makes sense. I was unaware of such techniques.
thanks guys
February 27, 2010 at 4:05 am
Matt Miller (#4) (2/22/2010)
WayneS (2/22/2010)
If you use sp_executesql, the dynamic string can be set up so that all variables are parameters, and the parameters are also passed in through sp_executesql. As you add/drop lines in the creation of the dynamic SQL, and thus change the actual SQL statement itself, you will get a new plan. But as long as the string is the same and you just have different parameters, the plan will be re-used.
True, with an additional caveat: if the two sets of parameters yield vastly different numbers of rows, the plan will be invalidated and rebuilt as well. In other words, if only the parameter values are different, the plan will be EVALUATED and, if efficient, will be used.
In other words - if @param1=1 would yield 50 rows, and @param1=2 yields 100,000 rows, the plan may get tossed out and rebuilt to a better plan based on the higher cardinality.
Matt, that seems exactly backwards to me! 😛
Parameterized batches executed via sp_executesql are exactly that: parameterized. SQL Server builds a plan based on the parameter values passed on the first execution, and re-uses the plan regardless of parameter values from that point forward.
The plan may be recompiled for correctness reasons, when statistics go out of date, or if the recompilation threshold for any table present in the final plan is reached. There may be a few other edge-cases for recompilations, but those are the ones I recall.
Here's a simple reproduction script:
USE tempdb;
GO
IF OBJECT_ID(N'dbo.PlanTest', N'U') IS NOT NULL DROP TABLE dbo.PlanTest;
GO
CREATE TABLE dbo.PlanTest (row_id INTEGER IDENTITY PRIMARY KEY, data INTEGER NOT NULL);
GO
INSERT dbo.PlanTest (data)
SELECT TOP (250000)
CASE
-- 99.5% of values = 100
WHEN RAND(CHECKSUM(NEWID())) <= 0.995 THEN 100
ELSE 999
END
FROM master.sys.columns C1,
master.sys.columns C2,
master.sys.columns C3;
GO
-- Demonstration index
CREATE NONCLUSTERED INDEX nc1 ON dbo.PlanTest (data);
GO
-- Run these tests individually with Include Actual Execution Plan ON
-- Two index scans
SELECT COUNT_BIG(*)
FROM dbo.PlanTest PT
JOIN dbo.PlanTest PT2
ON PT2.data <> PT.data
WHERE PT.data = 100;
-- Scan on PT2, seek on PT
SELECT COUNT_BIG(*)
FROM dbo.PlanTest PT
JOIN dbo.PlanTest PT2
ON PT2.data <> PT.data
WHERE PT.data = 999;
-- Two scans
EXECUTE sp_executesql
@stmt = N'
SELECT COUNT_BIG(*)
FROM dbo.PlanTest PT
JOIN dbo.PlanTest PT2
ON PT2.data <> PT.data
WHERE PT.data = @p0;
',
@params = N'@p0 INTEGER',
@p0 = 100;
-- Still two scans
-- (Cached plan reused with no recompilation)
EXECUTE sp_executesql
@stmt = N'
SELECT COUNT_BIG(*)
FROM dbo.PlanTest PT
JOIN dbo.PlanTest PT2
ON PT2.data <> PT.data
WHERE PT.data = @p0;
',
@params = N'@p0 INTEGER',
@p0 = 999;
-- Force all plans referencing dbo.PlanTest to recompile
-- on their next execution
EXECUTE sp_recompile @objname = N'dbo.PlanTest';
-- Run that last sp_executesql one again
-- Recompilation with the very selective parameter value gives a scan on PT2, seek on PT
EXECUTE sp_executesql
@stmt = N'
SELECT COUNT_BIG(*)
FROM dbo.PlanTest PT
JOIN dbo.PlanTest PT2
ON PT2.data <> PT.data
WHERE PT.data = @p0;
',
@params = N'@p0 INTEGER',
@p0 = 999;
-- Now try the one with low selectivity
-- (Same plan used - one scan, one seek)
EXECUTE sp_executesql
@stmt = N'
SELECT COUNT_BIG(*)
FROM dbo.PlanTest PT
JOIN dbo.PlanTest PT2
ON PT2.data <> PT.data
WHERE PT.data = @p0;
',
@params = N'@p0 INTEGER',
@p0 = 100;
GO
DROP TABLE dbo.PlanTest;
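To watch the reuse happen while stepping through that script, you can check usecounts in the plan cache between executions (a sketch; sys.dm_exec_cached_plans and sys.dm_exec_sql_text are standard DMVs, but the exact output depends on your server):

```sql
-- usecounts increments each time the cached parameterized plan is reused
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE N'%dbo.PlanTest%'
  AND st.text NOT LIKE N'%dm_exec_cached_plans%';
```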
Anyone reading that wants to read about the fine details of compilation and plan caching should read:
Batch Compilation, Recompilation, and Plan Caching Issues in SQL Server 2005
Paul
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
February 27, 2010 at 4:13 am
I have a blog entry on dynamic sql and unbalanced data loads which seems appropriate to pimp here 😉
February 27, 2010 at 4:28 am
Dave Ballantyne (2/27/2010)
I have a blog entry on dynamic sql and unbalanced data loads which seems appropriate to pimp here 😉
Very nice. I seem to recall that there is some caveat/problem regarding the idea of getting different plans by changing the query text slightly, especially using commented text. I have a feeling that it relates to Forced Parameterization or Plan Templates, but can't immediately recall. Any insight on that?
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi
February 27, 2010 at 4:47 am
Paul White (2/27/2010)
Dave Ballantyne (2/27/2010)
I have a blog entry on dynamic sql and unbalanced data loads which seems appropriate to pimp here 😉
Very nice. I seem to recall that there is some caveat/problem regarding the idea of getting different plans by changing the query text slightly, especially using commented text. I have a feeling that it relates to Forced Parameterization or Plan Templates, but can't immediately recall. Any insight on that?
From what I have found, forced parameterization does, as you would expect, completely negate anything in that article. I do have a lump of code that I need to get into another blog entry for exactly this. I guess this has given me a reason to get on with it 🙂
February 27, 2010 at 5:21 am
Dave Ballantyne (2/27/2010)
From what I have found, forced parameterization does, as you would expect, completely negate anything in that article. I do have a lump of code that I need to get into another blog entry for exactly this. I guess this has given me a reason to get on with it 🙂
Thank you! I had a nag in the back of my mind about it, so many thanks for clearing that up. I look forward to the blog entry.
Paul
Paul White
SQLPerformance.com
SQLkiwi blog
@SQL_Kiwi