August 30, 2008 at 11:52 am
Hi JLSSCH,
These sample procedures were expecting that the caller will add the necessary wildcard pattern to the variable being passed. If you look at the code that calls the stored procedure, you will see calls like this.
EXECUTE ProductSearch6 NULL, NULL
EXECUTE ProductSearch6 'CA%', NULL
EXECUTE ProductSearch6 NULL, 'Black%'
EXECUTE ProductSearch6 'CA%', 'Black%'
See that the pattern is created by the caller. This will give the caller more control over how the matching is done.
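For readers who have not seen the article, the procedure body that these calls target is not reproduced in this thread. A minimal sketch of the pattern Jacob describes (the parameter names and the AdventureWorks Production.Product columns are assumptions; the article's actual ProductSearch6 may differ) would look like this:

```sql
-- Hypothetical sketch: the caller supplies the wildcard pattern,
-- so the procedure passes the parameter straight into LIKE,
-- or skips the filter entirely when the parameter is NULL.
CREATE PROCEDURE ProductSearch6
    @ProductNumber VARCHAR(25) = NULL,
    @Color         VARCHAR(15) = NULL
AS
SELECT ProductID, Name, ProductNumber, Color
FROM Production.Product
WHERE (@ProductNumber IS NULL OR ProductNumber LIKE @ProductNumber)
  AND (@Color IS NULL OR Color LIKE @Color);
```

Because the pattern arrives from the caller, the same procedure serves prefix searches ('CA%'), suffix searches ('%CA'), and exact matches ('CA-5965') without any change to the procedure itself.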
regards
Jacob
September 1, 2008 at 11:19 pm
Excellent one...:)
September 25, 2009 at 2:17 am
Self-generating code is not a good thing. Non-escaped string data will cause you pain, and maintaining the code will be terrible.
SELECT * FROM Production.Product
WHERE (@ProductNumber IS NULL OR ProductNumber LIKE @ProductNumber)

or, alternatively:

SELECT * FROM Production.Product
WHERE 1 = CASE
              WHEN @ProductNumber IS NULL THEN 1
              WHEN ProductNumber LIKE @ProductNumber THEN 1
              ELSE 0
          END
p.s. good article
September 25, 2009 at 3:31 am
The issue I have with procedures like these is that the developers at our place create one procedure for all eventualities, which is fine, BUT a lot of these optional WHERE clauses require a join to a table that is not needed unless that option is selected. As a result, we have queries with several LEFT JOINs that are unnecessary for certain selections, causing additional reads when they are not needed. This can cause chaos on very large tables.
Don't get me wrong, I don't mind this method, but only on small data sets. For anything large I recommend separate stored procedures, purely from a performance point of view. As stated earlier, this has additional administrative overhead in having to update, say, six procs instead of one, but if it means the process runs in less than half a second instead of 10s, I'm happy to do that!
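One middle ground between one do-everything proc and six separate procs is dynamic SQL that appends the join only when the corresponding option is used. A hedged sketch, assuming AdventureWorks tables and a hypothetical @SubcategoryName parameter:

```sql
-- Hypothetical sketch: the join to ProductSubcategory is added only
-- when the optional filter is supplied, so unfiltered calls never
-- touch that table at all.
DECLARE @sql NVARCHAR(MAX) = N'
SELECT p.ProductID, p.Name
FROM Production.Product AS p';

IF @SubcategoryName IS NOT NULL
    SET @sql += N'
INNER JOIN Production.ProductSubcategory AS s
        ON s.ProductSubcategoryID = p.ProductSubcategoryID
WHERE s.Name LIKE @SubcategoryName';

-- Passing the value as a parameter keeps the plan cacheable
-- and avoids string concatenation of user input.
EXECUTE sp_executesql @sql,
    N'@SubcategoryName NVARCHAR(50)',
    @SubcategoryName;
```

Each distinct combination of appended joins gets its own cached plan, so the unfiltered path is not penalized by joins it never needed.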
_________________________________________________________________________________
SQLGeordie
Web: Jarrin Consultancy
Blog: www.chrisjarrintaylor.co.uk
Twitter: @SQLGeordie
September 25, 2009 at 8:26 am
I really do not recommend using CASE in the WHERE clause. I have seen a lot of problems with index invalidation in SQL Server 2000 and 2005. If you need to use CASE in the WHERE clause and your tables are really small, you won't notice this problem, but if your tables are really big you will see full index scans all the time and a lot of read activity, and your performance will be affected.
September 25, 2009 at 8:40 am
I tried conditional WHERE clauses not too long ago and found a few things:
1) Instead of dynamic where conditions in a sproc, another option is to allow the DAL to generate dynamic SQL.
2) A main advantage of sp_executesql over EXECUTE is that it caches and reuses the query plan when you pass values as parameters.
3) A main downside of sp_ExecuteSql is that it causes issues with most DAL tooling support that relies on code inspection, such as LINQ. When you write ordinary SQL, LINQ and other tools can parse the SQL and create a type based on the fields that are being returned. When you use sp_ExecuteSql, the DAL has no easy way to know what's being returned. You may be able to tell your DAL what "type" is being returned or you can switch back to 1).
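To illustrate point 2, a minimal sketch of the two approaches (AdventureWorks assumed):

```sql
-- Parameterized dynamic SQL: one plan is compiled for the statement
-- text and reused across calls with different @Color values.
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT * FROM Production.Product WHERE Color LIKE @Color';
EXECUTE sp_executesql @sql, N'@Color VARCHAR(15)', @Color = 'Black%';

-- Concatenated EXECUTE: every distinct literal produces different
-- statement text, so each variation compiles its own plan
-- (and concatenating raw user input also invites SQL injection).
EXECUTE ('SELECT * FROM Production.Product WHERE Color LIKE ''Black%''');
```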
September 25, 2009 at 8:44 am
Learned a lot, very good article :-)
September 25, 2009 at 8:45 am
Excellent article! I especially want to thank you for this:
I have not done a performance comparison of all the different methods. Each of the methods mentioned above has its own place. There are times when a specific method is found to be better than other options. I would suggest that you check the different methods and compare the performance of each method in the context of the specific query that you are writing. I am expecting that some of the SQL Server experts around will jump in and share their views and experiences on this subject.
This made my day for two reasons: First, you recognized that environments and requirements differed, and offered a good overview of the different ways to solve a problem, rather than presuming to claim that you'd found "THE solution". Second, this led to a good discussion of the pros/cons of the different solutions, rather than the more tense discussions that tend to follow a single-solution article.
As a result, I now have a resource that covers the philosophy of conditional where clauses, as well as some indication of where different solutions might be useful. Thanks!
(And, thanks Steve, for republishing an excellent article to the newsletter that I and apparently others had missed first time around!)
September 25, 2009 at 9:16 am
This may sound odd and a bad choice for performance, but I often select all the records into a temp table and then conditionally delete the unwanted rows. This seems to perform well even with over a hundred thousand rows. (So basically: SELECT ... INTO the temp table with some permanent conditions in the WHERE clause; then a series of IFs follow that remove rows.) For complex queries, it is much easier to read and maintain that way.
I'm not saying that I know this is recommended, so any comments are welcome.
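A minimal sketch of the approach described above, assuming AdventureWorks and hypothetical @Color and @MaxListPrice parameters:

```sql
-- Stage everything matching the permanent conditions into a temp table.
SELECT *
INTO #Results
FROM Production.Product
WHERE SellEndDate IS NULL;          -- permanent condition

-- Then peel away rows for each optional filter that was supplied.
IF @Color IS NOT NULL
    DELETE FROM #Results WHERE Color NOT LIKE @Color;

IF @MaxListPrice IS NOT NULL
    DELETE FROM #Results WHERE ListPrice > @MaxListPrice;

SELECT * FROM #Results;
```

The trade-off is clear: every call pays for materializing the full base result set, but each IF/DELETE is a simple, separately optimized statement, which is what keeps the code readable.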
The greatest enemy of knowledge is not ignorance, it is the illusion of knowledge. - Stephen Hawking
September 25, 2009 at 9:36 am
markjvernon (9/25/2009)
Self-generating code is not a good thing. Non-escaped string data will cause you pain, and maintaining the code will be terrible.
Sorry, but I would have to disagree with this. Dynamic SQL is not a preferred solution, but it is often the best solution available.
In particular, queries like this:
SELECT * FROM Production.Product
WHERE (@ProductNumber IS NULL OR ProductNumber LIKE @ProductNumber)

or, alternatively:

SELECT * FROM Production.Product
WHERE 1 = CASE
              WHEN @ProductNumber IS NULL THEN 1
              WHEN ProductNumber LIKE @ProductNumber THEN 1
              ELSE 0
          END
perform terribly and cause far more problems than well-written Dynamic SQL ever would.
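For contrast, a hedged sketch of what "well-written Dynamic SQL" might look like here (parameter names assumed): predicates are appended only when the corresponding parameter is supplied, and values are passed as parameters rather than concatenated, so escaping never becomes an issue.

```sql
-- Build only the predicates that are actually needed for this call.
DECLARE @sql NVARCHAR(MAX) =
    N'SELECT * FROM Production.Product WHERE 1 = 1';

IF @ProductNumber IS NOT NULL
    SET @sql += N' AND ProductNumber LIKE @ProductNumber';
IF @Color IS NOT NULL
    SET @sql += N' AND Color LIKE @Color';

-- Values travel as parameters, so there is no string escaping to
-- get wrong, and each predicate combination caches its own plan.
EXECUTE sp_executesql @sql,
    N'@ProductNumber VARCHAR(25), @Color VARCHAR(15)',
    @ProductNumber, @Color;
```

The optimizer sees only the predicates that apply, so it can pick the right index for each combination instead of one compromise plan.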
EDIT: fixed typos.
-- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
Proactive Performance Solutions, Inc. "Performance is our middle name."
September 25, 2009 at 10:40 am
This was a great article, but typically when I start seeing needs like this, I start wondering if maybe a better solution would be to try something completely different. Once queries start getting into the realm of "decision support", maybe it's time to start investigating an alternative (e.g. Analysis Services) that has better support for what your users want to do.
It's not always practical or possible, given time and cost constraints, but it bears asking, I think.
September 25, 2009 at 10:48 am
I wanted to add that I have had some really serious performance headaches with SQL Server's query engine and its handling of SARGs (search arguments), particularly related to OR conditions. I'm not saying that solving that problem is an easy one for the query engine, but here's an example of something I often come across that has no "static SQL" solution producing an optimal query plan (maybe I should write this up as an article?):
@even_odd_filter - filters rows with an even or odd integer field (1 = odd, 0 = even)
@date_filter - F = use from date, T = use to date
@search_date - a date, either the from or to date
create procedure TestSarg
    @even_odd_filter bit,
    @date_filter char(1),
    @search_date datetime
as
select
    t.intfield,
    t.from_date,
    t.to_date
from t
where
    (
        (@even_odd_filter = 1 and t.intfield % 2 > 0)
        or (@even_odd_filter = 0 and t.intfield % 2 = 0)
    )
    and
    (
        (@date_filter = 'F' and t.from_date > @search_date)
        or (@date_filter = 'T' and t.to_date > @search_date)
    )
This proc is only a sample, but it shows what I feel is a good syntactical way of expressing static conditions. It does not run efficiently on larger data sets, even with proper indexes (usually choosing scans over seeks), due to the nested OR operators. It appears the SQL query processor cannot figure out which code path execution will follow and therefore always generates a sub-optimal plan for the procedure.
Anybody else see what I am getting at here, and have any ideas? I am aware I could change the conditions in certain ways, but I am looking for a more general answer to this question of multiple OR options, without using dynamic SQL.
Jon
September 25, 2009 at 10:55 am
dmbaker (9/25/2009)
This was a great article, but typically when I start seeing needs like this, I start wondering if maybe a better solution would be to try something completely different. Once queries start getting into the realm of "decision support", maybe it's time to start investigating an alternative (e.g. Analysis Services) that has better support for what your users want to do. It's not always practical or possible, given time and cost constraints, but it bears asking, I think.
That may be true, I wonder how the MDX query processor would handle these kinds of problems, conditional statements in the where clause?
September 25, 2009 at 10:56 am
Good article
Just a small piece of advice regarding performance: SQL Server supports short circuiting, which can be a MAJOR help if you know which kinds of filters will be used most, because you can arrange your WHERE clause so that the fewest conditions are evaluated.
Regards
Soki G.
September 25, 2009 at 11:01 am
Soki (9/25/2009)
Just a small piece of advice regarding performance: SQL Server supports short circuiting, which can be a MAJOR help if you know which kinds of filters will be used most, because you can arrange your WHERE clause so that the fewest conditions are evaluated.
Regards
Soki G.
http://www.pwnedgames.com
Soki,
It sure does seem to support short circuiting, but I have seen performance problems on large data sets with doing too much short circuiting (see my Post #793948 above). Do you have any similar experiences?