June 1, 2009 at 7:03 am
I have to say I prefer the CASE approach to code such as this. Also remember that proper indexing of tables and thoughtful arrangement of parameters according to most frequent usage probably offsets performance to some extent. Granted, I would not use this technique for heavily used, performance-intensive situations, but this is fairly straightforward coding in cases such as ad-hoc reporting, where performance is not as critical and versatility of code is important. And that assumes the code is written to avoid ad-hoc usage returning inordinately large sets. :-)
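For anyone unfamiliar with the construct under discussion, here's a minimal sketch of a catch-all search procedure using the optional-parameter pattern (table and column names are hypothetical):

```sql
-- Hypothetical "catch-all" search procedure: each parameter is optional,
-- and a NULL parameter disables that filter.
CREATE PROCEDURE dbo.SearchOrders
    @CustomerID int  = NULL,
    @OrderDate  date = NULL
AS
BEGIN
    SELECT OrderID, CustomerID, OrderDate
    FROM   dbo.Orders
    WHERE (@CustomerID IS NULL OR CustomerID = @CustomerID)
      AND (@OrderDate  IS NULL OR OrderDate  = @OrderDate);
END;
```

The appeal is that one procedure serves every combination of filters; the performance debate below is about what this does to plan caching and index usage.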
June 1, 2009 at 7:08 am
Anything we can do to encourage procs is a good thing. But with the movement towards object-abstraction tools such as NHibernate, Linq, and MyGenerate, I fear the battle against code-generated dynamic queries will only get more difficult. Some, like EntitySpace, encourage the use of stored procedures for CRUD operations, but Hibernate does not.
The more you are prepared, the less you need it.
June 1, 2009 at 7:32 am
Folks:
I prefer to do this kind of procedure the right way (true dynamic queries) even if the tables involved are small. That way, if the tables keep growing, nothing bad happens.
June 1, 2009 at 7:45 am
http://www.sommarskog.se/dynamic_sql.html
Erland Sommarskog made a comprehensive summary of dynamic SQL issues. Very thorough and very useful for quick reference. He's also illustrated differences between versions of SQL where relevant.
He also shows when to use/avoid isNull and coalesce -- an important consideration if the search field allows nulls or not.
June 1, 2009 at 7:49 am
Ridiculously off-topic, but... I've heard of 'A Boy Named Sue' but never a girl named Jimmy. :hehe:
June 1, 2009 at 7:49 am
I'm not a big fan of dynamic SQL, actually I hate it, but we try to present multiple points of view, and this is not a horrible way to code in places. It certainly doesn't fit some situations, and I know TheSQLGuru makes a living fixing this kind of code for a rather high price.
I wouldn't recommend this as a general technique for you to use in most situations, but it might make sense in some.
June 1, 2009 at 7:50 am
Where is the HTML mangled? I didn't see it in FF3.
June 1, 2009 at 8:46 am
With everyone getting into their religious entrenched positions on what way is the “best” way to handle this, I think we all need to take a step back and remember that as with just about everything, it depends. Yes, handling optional parameters using the @param IS NULL OR… method can cause poor performance, it depends on what type of system it is, how the tables are indexed, which index is the clustered index, and what the cardinality or granularity of the data is. In an OLTP system which only has indexes on a few key fields, or even in a reporting system on columns that have low degree of selectivity, will it really make a difference if you dynamically build the query string and then execute it?
June 1, 2009 at 8:49 am
Steve Jones - Editor (6/1/2009)
I'm not a big fan of dynamic SQL, actually I hate it, but we try to present multiple points of view, and this is not a horrible way to code in places. It certainly doesn't fit some situations, and I know TheSQLGuru makes a living fixing this kind of code for a rather high price.
I wouldn't recommend this as a general technique for you to use in most situations, but it might make sense in some.
I have used dynamic SQL to improve multi-variable, multi-table 'open-ended' search routines FOUR TO FIVE ORDERS OF MAGNITUDE for two different clients. Several others have had lesser improvements. Throw in the drastically increased concurrency due to much shorter duration locking and it becomes even more extraordinary. This truly is a situation where dynamic SQL can shine. I must say it was a certified b-tch to get it written correctly for some cases that had 10+ tables and 20+ input parameters though! YMMV and always guard against SQL Injection!!!!!!
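A minimal sketch of that approach, for the simple two-parameter case (table and column names hypothetical): build only the predicates that actually apply, and pass values through sp_executesql parameters rather than concatenating them into the string, which is the injection guard.

```sql
-- Hypothetical dynamic-SQL version of an optional-parameter search.
-- Only the filters for supplied parameters are appended, so each
-- combination of parameters gets its own cached, parameterized plan.
DECLARE @sql nvarchar(max) =
    N'SELECT OrderID, CustomerID, OrderDate FROM dbo.Orders WHERE 1 = 1';

IF @CustomerID IS NOT NULL
    SET @sql += N' AND CustomerID = @CustomerID';
IF @OrderDate IS NOT NULL
    SET @sql += N' AND OrderDate = @OrderDate';

-- Values travel as typed parameters, never as concatenated literals.
EXEC sys.sp_executesql
    @sql,
    N'@CustomerID int, @OrderDate date',
    @CustomerID = @CustomerID,
    @OrderDate  = @OrderDate;
```

With 10+ tables and 20+ parameters the string-building logic gets hairy, as noted above, but the optimizer sees a clean, specific query each time.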
Believe it or not Steve I have had not one but TWO hiring managers tell me in the last few months that my rate was significantly LESS than what they expected! But I sure am happy with what I make (and consider myself extremely fortunate - especially in this economy) and with the fact that I have more work thrown at me than I can handle. Add on the fact that I truly love what I do and it just doesn't get any better!! 😎
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 1, 2009 at 9:12 am
Chris Harshman (6/1/2009)
With everyone getting into their religious entrenched positions on what way is the “best” way to handle this, I think we all need to take a step back and remember that as with just about everything, it depends. Yes, handling optional parameters using the @param IS NULL OR… method can cause poor performance, it depends on what type of system it is, how the tables are indexed, which index is the clustered index, and what the cardinality or granularity of the data is. In an OLTP system which only has indexes on a few key fields, or even in a reporting system on columns that have low degree of selectivity, will it really make a difference if you dynamically build the query string and then execute it?
1) if you don't have indexes on fields that you are searching for then you have bigger problems than writing bad code.
2) If you are searching for non-specific fields (without other more restrictive filters) likewise
Time to be blunt here: People, please push the "I Believe" button here. This (@param IS NULL OR @param = field1) is a TRULY BAD CONSTRUCT. Not only do you wind up with many table scans (or maybe index scans) instead of index seeks, you also get hit by all kinds of query-caching badness: table scans when you should be doing seeks, or, believe it or not even worse, index seeks/bookmark lookups when you should be doing scans. Let's throw in the additional poor performance of joining to tables that aren't even required given the input parameters. Concurrency problems. CPU/IO/RAM hits. The list goes on.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
June 1, 2009 at 9:20 am
Chris Harshman (6/1/2009)
Yes, handling optional parameters using the @param IS NULL OR… method can cause poor performance, it depends on what type of system it is, how the tables are indexed, which index is the clustered index, and what the cardinality or granularity of the data is.
It doesn't depend on cardinality, granularity or where the indexes are. This kind of query construct does not perform well because the optimiser has to generate a plan that is safe for reuse. Because the plan has to be safe for reuse it cannot be optimal, seeing that the columns actually being filtered on differ radically between executions.
Small tables (< 100 rows) maybe you'll get away with it. But there's no guarantee that those small tables will still be small in a year or two.
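To make the reuse problem concrete (names hypothetical): the two calls below filter on completely different columns, yet they share one cached plan, so that plan must cover both and is optimal for neither. One commonly suggested mitigation, where the compile cost is acceptable, is OPTION (RECOMPILE), which trades a fresh compilation per execution for a plan built around the actual parameter values.

```sql
-- Both executions reuse the single cached plan for the procedure,
-- even though they need very different access paths.
EXEC dbo.SearchOrders @CustomerID = 42;                -- seek on CustomerID?
EXEC dbo.SearchOrders @OrderDate  = '2009-06-01';      -- seek on OrderDate?

-- Inside the procedure, forcing a per-execution plan avoids the
-- one-size-fits-all compromise, at the cost of a compile each call:
SELECT OrderID, CustomerID, OrderDate
FROM   dbo.Orders
WHERE (@CustomerID IS NULL OR CustomerID = @CustomerID)
  AND (@OrderDate  IS NULL OR OrderDate  = @OrderDate)
OPTION (RECOMPILE);
```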
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
June 1, 2009 at 9:23 am
I was misled by the article title; I was looking for a dynamic stored procedure. However, it's a good one!
June 1, 2009 at 9:28 am
Chris Harshman (6/1/2009)
With everyone getting into their religious entrenched positions ...
So far it isn't too religious, nobody has brought up parameter sniffing yet.
June 1, 2009 at 9:36 am
Noel McKinney (6/1/2009)
Chris Harshman (6/1/2009)
With everyone getting into their religious entrenched positions ...
So far it isn't too religious, nobody has brought up parameter sniffing yet.
Actually I did in my "Time to be blunt" post, although I didn't explicitly state the phrase. The part about doing index seek/bookmark lookup when you should be doing a table scan (or vice-versa) is due to parameter sniffing and plan caching.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service