September 22, 2007 at 12:41 pm
Comments posted to this topic are about the item How to Short-circuit Where Clauses
Paul Ibison
Paul.Ibison@replicationanswers.com
November 26, 2007 at 12:54 pm
Any reason you can't just write:
select * FROM TblTest
where
SupplierId = @Input
OR
@Input = -1
December 27, 2007 at 8:43 am
Both methods (using an OR and using the CASE block) fail to short-circuit.
In this simple example it doesn't matter, but in this nightmare query I've recently inherited, I've got something like this:
SELECT STUFF
FROM a bunch of Tables
WHERE (@InputA IS NULL
OR
EXISTS (SELECT 1 FROM a bunch of Tables WHERE SomCol = @InputA)
)
AND
(@InputB IS NULL
OR
EXISTS (SELECT 1 FROM a bunch of Tables WHERE SomCol = @InputB)
)
AND ETC...
SQL Server is evaluating each subquery even when the input parameters are NULL (and even after I rewrote the query to use CASE blocks in the WHERE clause, as this article suggests).
I rewrote the query using dynamic SQL (building the WHERE clause in a string), and that worked, but it's gonna be a nightmare to maintain and modify over time.
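For reference, the dynamic version can still take parameters. A minimal sketch of the idea (the table and column names are made up to match the pseudocode above), using sp_executesql so the values are never concatenated into the string:

```sql
-- Hypothetical sketch: append only the predicates whose parameters
-- are non-NULL, then execute the string with sp_executesql.
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT Stuff FROM dbo.SomeTables WHERE 1 = 1';

IF @InputA IS NOT NULL
    SET @sql = @sql + N' AND EXISTS (SELECT 1 FROM dbo.TableA WHERE SomCol = @InputA)';
IF @InputB IS NOT NULL
    SET @sql = @sql + N' AND EXISTS (SELECT 1 FROM dbo.TableB WHERE SomCol = @InputB)';

-- Parameters are passed separately, not embedded in the SQL text.
EXEC sp_executesql @sql,
    N'@InputA int, @InputB int',
    @InputA = @InputA, @InputB = @InputB;
```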
December 27, 2007 at 11:02 am
Just note that this method will almost certainly prevent index seeks from being used to find the affected records. It'll work, but it ain't likely to work fast.
It won't be noticeable on 6 rows. 600 000 is a very different story.
My usual suggestion for problems like this is either separate queries (if the number of parameters is low) or dynamic SQL (if the number of params is large)
Most of the other fancy tricks perform terribly on larger row sets.
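A rough sketch of the separate-queries option, assuming just two optional parameters on a hypothetical table1 (with more parameters the number of branches explodes, which is when dynamic SQL wins):

```sql
-- Illustrative only: one query per parameter combination, so each
-- branch gets its own plan and can seek on the relevant index.
IF @param1 IS NOT NULL AND @param2 IS NULL
    SELECT col1, col2 FROM dbo.table1 WHERE col1 = @param1;
ELSE IF @param1 IS NULL AND @param2 IS NOT NULL
    SELECT col1, col2 FROM dbo.table1 WHERE col2 = @param2;
ELSE IF @param1 IS NOT NULL AND @param2 IS NOT NULL
    SELECT col1, col2 FROM dbo.table1 WHERE col1 = @param1 AND col2 = @param2;
ELSE
    SELECT col1, col2 FROM dbo.table1;
```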
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 27, 2007 at 12:18 pm
Yep - sounds like dynamic SQL is my best bet; I've got lots of optional parameters and lots of rows.
Maybe in the future, Microsoft will improve their optimization algorithms for this type of situation...
Thanks for the advice!
December 27, 2007 at 12:44 pm
Any form of function on a column in the where clause prevents index seeks, even something like UPPER, LEFT or the like.
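For example (a made-up Customers table with an index on LastName), the same filter written both ways:

```sql
-- Non-sargable: LEFT() on the column must run for every row,
-- so the index on LastName can't be seeked - expect a scan.
SELECT CustomerID, LastName
FROM dbo.Customers
WHERE LEFT(LastName, 1) = 'S';

-- Sargable rewrite: the bare column compared to a prefix pattern
-- becomes a range predicate, and an index seek is possible.
SELECT CustomerID, LastName
FROM dbo.Customers
WHERE LastName LIKE 'S%';
```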
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 27, 2007 at 1:24 pm
It's a clever solution to complex lookups in small tables (very few rows), where indexes won't matter. On larger tables, it's going to bypass indexes and quite probably slow way down.
One solution I've used on moderately complex where statements is using Union All instead of Or. Makes it a bit more complex to maintain, but speeds things up quite nicely (assuming the tables have appropriate indexes on them).
select col1, col2, col3
from dbo.table1
where col1 = @param1_in and @param2_in is null
union all
select col1, col2, col3
from dbo.table1
where col2 = @param2_in and @param1_in is null
instead of:
select col1, col2, col3
from dbo.table1
where col1 = isnull(@param1_in, col1)
or col2 = isnull(@param2_in, col2)
or:
select col1, col2, col3
from dbo.table1
where col1 = @param1_in and @param1_in is not null
or col2 = @param2_in and @param2_in is not null
On a big table with decent indexes, the first solution is much faster than either of the latter two, or the solution proposed in the article, because it can use the indexes.
(If the parameters are not mutually exclusive, then use "Union" instead of "Union All", or you'll get duplicate rows. Unless this is part of a sub-query or other situation where duplicate rows don't matter. Union is slower than Union All, but it's still generally faster than skipping indexes by using Or in your Where clause.)
With multi-table selects, complex Where clauses and a dozen or so parameters with complex relations, the Union solution becomes much too complex to build and maintain, and Dynamic SQL is better.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
January 2, 2009 at 9:02 am
bbbrian007 (12/27/2007)
Yep - sounds like dynamic SQL is my best bet, I've got lots of optional parameters and lots of rows. Maybe in the future, Microsoft will improve their optimization algorithms for this type of situation...
Bear in mind that dynamic SQL is frowned upon for two reasons:
i) possibility of SQL Injection attacks
ii) Optimiser unable to cache query plans (as the SQL is prepared dynamically)
January 2, 2009 at 11:23 am
Preet (1/2/2009)
bbbrian007 (12/27/2007)
Yep - sounds like dynamic SQL is my best bet, I've got lots of optional parameters and lots of rows. Maybe in the future, Microsoft will improve their optimization algorithms for this type of situation...
Bear in mind that dynamic SQL is frowned upon for two reasons:
i) possibility of SQL Injection attacks
ii) Optimiser unable to cache query plans (as the SQL is prepared dynamically)
The first one can be avoided by using 'sp_executesql' and validating/passing parameters as needed.
The second one is no longer valid: as long as the dynamically created SQL text is identical (and that includes spaces, formatting, etc.), the query plan will be cached and available for reuse.
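A minimal sketch of both points together, reusing the table1/@param names from the earlier examples: because the parameter value travels outside the SQL text, the string never varies, so the plan is reusable and user input can't inject SQL.

```sql
-- The SQL text is constant; only the parameter value changes per call.
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT col1, col2 FROM dbo.table1 WHERE col1 = @p1';

EXEC sp_executesql @sql,
    N'@p1 int',        -- parameter declaration
    @p1 = @param1_in;  -- value bound at execution, never concatenated
```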
Jeffrey Williams
“We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”
― Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
January 2, 2009 at 3:56 pm
Good golly... I'm glad I'm not the only one that believes in the usage of well formed and safe dynamic SQL. 🙂
--Jeff Moden
Change is inevitable... Change for the better is not.