May 19, 2010 at 9:45 pm
Comments posted to this topic are about the item Use Dynamic SQL to Improve Query Performance
May 19, 2010 at 9:52 pm
Before you went through all of the rigmarole of creating a stored procedure that constructed dynamic sql, did you try a combination of adding "WITH RECOMPILE" or using OPTION(OPTIMIZE FOR UNKNOWN) in the select itself?
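Something along these lines is what I mean (a rough sketch only; the procedure name, parameter names and types are my assumptions, not the article's actual code):
-- Rough sketch of the two hints mentioned above; names and types are assumed.
CREATE PROCEDURE dbo.GetProducts_Hinted
    @Name  nvarchar(50) = NULL,
    @Color nvarchar(15) = NULL
WITH RECOMPILE   -- option 1: build a fresh plan for every call
AS
BEGIN
    SELECT [Name], [Color], [ListPrice], [ModifiedDate]
    FROM [Production].[Product]
    WHERE (@Name  IS NULL OR [Name]  LIKE @Name)
      AND (@Color IS NULL OR [Color] LIKE @Color)
    -- option 2 (instead of WITH RECOMPILE): keep one cached plan built from
    -- average statistics rather than the first sniffed parameter values
    OPTION (OPTIMIZE FOR UNKNOWN);
END;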
May 19, 2010 at 9:52 pm
Dynamic SQL should be a last resort - but sometimes it is unavoidable. Please have a look at this article regarding dynamic SQL: http://www.sommarskog.se/dynamic_sql.html
May 20, 2010 at 12:07 am
Maybe you should consider rewriting the original query with COALESCE on every parameter. That's how you would avoid a table scan if the parameter is null.
BR,
Vladimir
May 20, 2010 at 12:14 am
I second the sommarskog.se site (which gretchkosiy 2550 refers to above) - that is an excellent writeup which includes gotchas for SQL 2008 SP1 and dynamic SQL if I recall correctly.
Having had some even more insanely complex queries (many conditions) suffering because of the complete failure of the SQL query engine to short-circuit effectively, I can say that I have resorted to this approach after trying every other one (recompile, optimize for unknown, etc.) and I feel very confident saying that at least as of SQL 2008 SP1, dynamic parameterized SQL is THE way to go for proper index usage and performance.
It is a pain to debug, and it's awfully weird to have SQL generating SQL in a stored proc (in my opinion the stored proc compiler should have been smarter) just to build this kind of search query, but it just plain works. I've seen queries perform 10x faster as dynamic parameterized SQL vs. static short-circuiting style code - all because the right indexes could finally be used. I tried to contact Microsoft once to ask why the engine didn't recognize this use case, but I never heard back.
I'm certain that not all complex search-style WHERE-clause procs (which often require many OR conditions if statically coded and are therefore not easily SARG-able) need to be dynamic parameterized SQL. For example, if you're dealing with smaller tables - say 10,000 to 100,000 rows - I don't think it makes a huge difference. For larger tables or long lists of dynamic conditions, I think, as of today at least, it's a good way to go.
As always, test it for yourself, as every environment is different.
Thanks - good article!
May 20, 2010 at 12:21 am
Nice writeup.
Maybe I am a bit off topic, but I would never allow null values to pass through to the select statement in a procedure if the tables underneath do not allow nulls. This should be handled in the front-end application or by using non-null default values in the parameters.
May 20, 2010 at 12:23 am
chris-945648 (5/19/2010)
Before you went through all of the rigmarole of creating a stored procedure that constructed dynamic sql, did you try a combination of adding "WITH RECOMPILE" or using OPTION(OPTIMIZE FOR UNKNOWN) in the select itself?
And why would that make a difference? SQL Server still has to come up with an execution plan for all the predicates in the WHERE clause. I've seen enough procedures with multiple optional parameters, and developers trying to create "clever" methods of combining them into one ugly T-SQL statement, to know that dynamic SQL cannot be avoided. In SQL Server 2005 and SQL Server 2008 you can also prevent the problem of broken ownership chaining in dynamic SQL by signing the procedure with a certificate. Problem solved.
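For anyone who hasn't used it, the signing goes roughly like this (a sketch only; all object names and the password below are made up for illustration, not taken from the article):
-- Rough sketch of signing a dynamic-SQL procedure with a certificate.
CREATE CERTIFICATE SearchProcCert
    ENCRYPTION BY PASSWORD = 'Str0ng!Passw0rd'
    WITH SUBJECT = 'Signs search procedures that build dynamic SQL';
-- Sign the procedure that builds and executes the dynamic statement
-- (dbo.GetProducts is a hypothetical name).
ADD SIGNATURE TO dbo.GetProducts
    BY CERTIFICATE SearchProcCert WITH PASSWORD = 'Str0ng!Passw0rd';
-- Map the certificate to a database user and grant it the permissions the
-- dynamic SQL needs; callers of the proc then need no direct table access.
CREATE USER SearchProcCertUser FROM CERTIFICATE SearchProcCert;
GRANT SELECT ON [Production].[Product] TO SearchProcCertUser;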
May 20, 2010 at 12:26 am
Sometimes, dynamic queries are the way to go. But in this example it would be interesting to see the performance of the modified static query. It would look something like this:
SELECT [Name],
[Color],
[ListPrice],
[ModifiedDate]
FROM [Production].[Product]
WHERE [Name] LIKE COALESCE(@Name, [Name])
AND [Color] LIKE COALESCE(@Color, [Color])
AND [ListPrice] >= COALESCE(@ListPriceMin, [ListPrice])
AND [ListPrice] <= COALESCE(@ListPriceMax, [ListPrice])
AND [ModifiedDate] >= COALESCE(@ModifiedDateMin, [ModifiedDate])
AND [ModifiedDate] <= COALESCE(@ModifiedDateMax, [ModifiedDate])
I must say that I am not completely sure that this query is faster than dynamic SQL, but it seems like it should be.
BR
May 20, 2010 at 12:38 am
Very well written, a nice article.
Can you please shed some light on the execution plan of this SP when using dynamic SQL?
As far as I know, the execution plan won't be reused.
Please suggest.
May 20, 2010 at 12:42 am
Vladimir Basanovic (5/20/2010)
Maybe you should consider rewriting the original query with COALESCE on every parameter. That's how you would avoid a table scan if the parameter is null.
BR,
Vladimir
Is it Vladimir... 😎
Thanks for sharing the idea. I know it is a very simple thing, but I was not aware of it.
Although I used to write WHERE clauses using COALESCE, I did not know about the performance impact.
Really appreciate it.
May 20, 2010 at 12:44 am
What about this approach?
SELECT [Name],
[Color],
[ListPrice],
[ModifiedDate]
FROM [Production].[Product]
WHERE
(@Name IS NULL OR [Name] LIKE @Name)
AND (@Color IS NULL OR [Color] LIKE @Color)
AND (@ListPriceMin IS NULL OR [ListPrice] >= @ListPriceMin)
AND (@ListPriceMax IS NULL OR [ListPrice] <= @ListPriceMax)
AND (@ModifiedDateMin IS NULL OR [ModifiedDate] >= @ModifiedDateMin)
AND (@ModifiedDateMax IS NULL OR [ModifiedDate] <= @ModifiedDateMax)
By checking whether the parameter IS NULL first, logically the second condition after the OR should not need to be evaluated if the first one is TRUE.
May 20, 2010 at 1:02 am
mohd.nizamuddin (5/20/2010)
Very well written, a nice article. Can you please shed some light on the execution plan of this SP when using dynamic SQL?
As far as I know, the execution plan won't be reused.
Please suggest.
Execution plans will be cached when using dynamic SQL. With sp_executesql the plan is cached per distinct statement text, so SQL Server keeps a separate plan for each combination of parameters that actually ends up in the generated WHERE clause, and reuses that plan across different parameter values.
To test that this actually works I suggest you run Profiler and add SP:CacheHit, SP:CacheMiss and SP:CacheInsert events when you use sp_executesql.
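For example, the pattern being discussed looks roughly like this (a sketch with assumed names and types, written as if inside a proc that has @Name and @Color parameters; this is not the article's actual code):
-- Predicates are appended to the string only for parameters that are supplied,
-- but the values are passed to sp_executesql as parameters, so each distinct
-- string gets one cached plan that is reused across different values.
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT [Name], [Color], [ListPrice], [ModifiedDate]
             FROM [Production].[Product]
             WHERE 1 = 1';
IF @Name IS NOT NULL
    SET @sql = @sql + N' AND [Name] LIKE @Name';
IF @Color IS NOT NULL
    SET @sql = @sql + N' AND [Color] LIKE @Color';
EXEC sp_executesql
    @sql,
    N'@Name nvarchar(50), @Color nvarchar(15)',
    @Name = @Name,
    @Color = @Color;
Running that with different values but the same set of supplied parameters should show up as SP:CacheHit in Profiler rather than a new SP:CacheInsert.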
May 20, 2010 at 1:06 am
Nils Gustav Stråbø (5/20/2010)
mohd.nizamuddin (5/20/2010)
Very well written, a nice article. Can you please shed some light on the execution plan of this SP when using dynamic SQL?
As far as I know, the execution plan won't be reused.
Please suggest.
Execution plans will be cached when using dynamic SQL. With sp_executesql the plan is cached per distinct statement text, so SQL Server keeps a separate plan for each combination of parameters that actually ends up in the generated WHERE clause, and reuses that plan across different parameter values.
To test that this actually works I suggest you run Profiler and add SP:CacheHit, SP:CacheMiss and SP:CacheInsert events when you use sp_executesql.
Thanks for your prompt response.
I am just going through http://www.sommarskog.se/dynamic_sql.html and I got the idea.
🙂
May 20, 2010 at 1:06 am
Dynamic SQL is the last thing to do. I use and prefer the following approach:
SELECT [Name],
[Color],
[ListPrice],
[ModifiedDate]
FROM [Production].[Product]
WHERE ([Name] LIKE case when @Name is Null then [Name] else @Name end)
AND ([Color] LIKE case when @Color is Null then [Color] else @Color end)
AND ([ListPrice] >= case when IsNull(@ListPriceMin,0) = 0 then [ListPrice] else @ListPriceMin end)
AND ([ListPrice] <= case when IsNull(@ListPriceMax,0) = 0 then [ListPrice] else @ListPriceMax end)
AND (ModifiedDate >= case when @ModifiedDateMin is Null then ModifiedDate else @ModifiedDateMin end)
AND (ModifiedDate <= case when @ModifiedDateMax is Null then ModifiedDate else @ModifiedDateMax end)
May 20, 2010 at 1:19 am
Atif Sheikh (5/20/2010)
Dynamic SQL is the last thing to do
Dynamic SQL is the only efficient way (in my knowledge) to solve the problem with multiple optional parameters. Whether you use CASE, COALESCE, ISNULL or OR to try to avoid dynamic SQL, you will most likely (if not always) end up with index scans. I have never seen the methods mentioned above work from a performance perspective. Look at your execution plan the next time you try a MyCol = CASE WHEN @val IS NULL THEN MyCol ELSE @val END predicate. You will see an index scan, not a seek (if the column is indexed).
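A quick way to see the difference for yourself (a sketch only; the sample value is arbitrary and it assumes an index on [Name], which AdventureWorks happens to ship - check your own schema):
-- Compare the actual execution plans of these two statements.
DECLARE @val nvarchar(50);
SET @val = N'Blade';
-- 1) CASE-style predicate: not a search argument, so expect an index scan.
SELECT [Name]
FROM [Production].[Product]
WHERE [Name] = CASE WHEN @val IS NULL THEN [Name] ELSE @val END;
-- 2) Dynamic, parameterized statement that only includes the predicate when it
--    is needed: the plain [Name] = @val comparison can use an index seek.
IF @val IS NOT NULL
    EXEC sp_executesql
        N'SELECT [Name] FROM [Production].[Product] WHERE [Name] = @val',
        N'@val nvarchar(50)',
        @val = @val;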