Help needed in removing dynamic SQL

  • Sadly I'm limited in what changes I can make at the moment, but I've been given a dynamic query that performs badly (>15 seconds), which I think could be improved if the dynamic part were removed. (I'm sure the LIKE part doesn't help much either.)

    The main part of the query runs along the usual lines

    'SELECT Col1, Col2, Col3 FROM TABLE1'

    Then there's a WHERE clause which can be any of the columns fed it

    'WHERE ColX LIKE ''%foo%'' '

    Finally there's an ORDER BY clause which again can be any of the columns

    ' ORDER BY ColX'

    All 3 parts are simply thrown together @sql = @sql + @WhereClause + @OrderByClause and then EXEC'ed.

    Is there any way to re-write this using CASE statements, or even sp_executesql, which would lead to a cached execution plan and therefore possibly some SQL-injection protection?

  • Can you give a bit more info please? What are you trying to achieve with the CASE statement?

  • SELECT Col1, Col2, Col3, Col4 FROM TABLE1
    WHERE Col1 LIKE
        CASE WHEN @Col1 IS NULL THEN '%'
             ELSE @Col1
        END

    The trouble is I don't know which of the columns is going to be passed in from the second statement. It could be Col1 or Col2 or Col3 .......

    And I don't know if writing a

    WHERE Col1 LIKE '%' AND Col2 LIKE '%' AND Col3 Like '%Foo%' AND Col4 LIKE '%' clause is going to make things even slower.

    What would be nice is WHERE @ColumnName LIKE @ColumnAnswer

  • I would say your bigger problem is that starting the LIKE condition with a wildcard means the database engine cannot use an index. So, if you have a single table you are querying, you will get a table scan every time - dynamic SQL or not.

  • Hmmm a good point. And best of all, I've just counted the rows in the table.

    You. Don't. Want. To. Know.

    I could buy a UK bank with numbers like that.

    Perhaps I should suggest they use cursors... 😉 - Do they help with many, many, many thousands of rows???

    That one should (in theory) be a more straightforward fix. I'll get the code "updated" to remove the LIKE and % signs and confirm the results come back the same in test (ready for live)

    I'd still much, much rather exterminate that dynamic SQL run though. If anyone has any ideas on a re-write which removes the swathes of red text from QA I'd be most grateful.

  • If it is an either/or situation, you can do the logic before calling the select statement:

    IF @Col = 'FirstName'
    BEGIN
        SELECT * FROM MyTable WHERE FirstName LIKE @val + '%'
    END
    ELSE IF @Col = 'LastName'
    BEGIN
        SELECT * FROM MyTable WHERE LastName LIKE @val + '%'
    END

    If this is in a procedure, you may want to call individual procedures within the IF statement so you do not end up caching bad plans.

    Otherwise, you could handle the logic with CASE statements, but dynamic SQL is probably going to perform better.
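    The sub-procedure idea above could be sketched like this (the procedure, table and column names here are made up for illustration; the point is that each sub-procedure gets its own cached plan):

    ```sql
    CREATE PROCEDURE SearchByFirstName @val varchar(50) AS
        SELECT * FROM MyTable WHERE FirstName LIKE @val + '%';
    GO
    CREATE PROCEDURE SearchByLastName @val varchar(50) AS
        SELECT * FROM MyTable WHERE LastName LIKE @val + '%';
    GO
    -- Dispatcher: branches to the dedicated procedure so one search
    -- pattern's plan never gets reused for the other
    CREATE PROCEDURE SearchDispatch @Col sysname, @val varchar(50) AS
        IF @Col = 'FirstName'
            EXEC SearchByFirstName @val;
        ELSE IF @Col = 'LastName'
            EXEC SearchByLastName @val;
    GO
    ```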

  • FNS (10/13/2008)


    Perhaps I should suggest they use cursors... 😉 - Do they help with many, many, many thousands of rows???

    Umm, I'm going to assume that the smiley face means that you are kidding about using cursors here.

    I'd still much, much rather exterminate that dynamic SQL run though. If anyone has any ideas on a re-write which removes the swathes of red text from QA I'd be most grateful.

    I'm a little curious about your negative feelings towards Dynamic SQL. Granted, from a style standpoint, it does degrade the code quality a bit: say a 25-30% hit. However, style-wise it is a whole lot better than an extensive list of IF..ELSE branches with 90% identical code (have fun trying to maintain that correctly) and about the same as a lot of WHERE..AND clauses terminated with "OR @Parameter IS NULL" expressions (which also perform much more poorly than dynamic SQL).

    And injection is certainly a valid concern, but there are also ways to protect yourself from it and still be able to use Dynamic SQL.

    As for performance, I cannot say that I know of any cases where a performance problem was caused by some SQL code being dynamic. And in fact, in many cases, Dynamic SQL is the most performant solution available.

    Don't get me wrong: I don't think that Dynamic SQL should be used indiscriminately, only where it is needed and in an appropriate manner. However, Dynamic SQL does have many needful and appropriate uses.

    -- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
    Proactive Performance Solutions, Inc.
    "Performance is our middle name."

  • rbarryyoung (10/13/2008)


    And injection is certainly a valid concern, but there are also ways protect yourself from it and still be able to use Dynamic SQL.

    I'm looking at having to use Dynamic SQL on a .net website for a couple of the reports we're planning to put there. Do you have a link to a good article on injection protection?

    Seth Phelabaum

    Consistency is only a virtue if you're not a screwup. 😉

    Links: How to Post Sample Data :: Running Totals :: Tally Table :: Cross Tabs/Pivots :: String Concatenation

  • It might help if we could see the actual code you are working with else wise all we can do is guess at what may help you.

    😎

  • Garadin (10/13/2008)


    rbarryyoung (10/13/2008)


    And injection is certainly a valid concern, but there are also ways protect yourself from it and still be able to use Dynamic SQL.

    I'm looking at having to use Dynamic SQL on a .net website for a couple of the reports we're planning to put there. Do you have a link to a good article on injection protection?

    I wish I did, but I have not seen a really good one myself.

    For the purpose of discussion I like to classify different SQL code into levels according to the degree to which they incorporate user or client-supplied text into SQL statements:

    Level 0: No user text is used to construct dynamic SQL commands.

    Level 1: User text is only used to specify the values of columns, variables and expressions (i.e.: "something = client-text").

    Level 2: User text is used to specify Column Names.

    Level 3: User text is used to specify Table & View Names.

    Level 4: User text is used as expressions of a specific and limited type to be directly included in the SQL Command.

    Level 5: User text is used as SQL commands or is directly incorporated into SQL commands.

    Level 0 may have a lot of dynamic SQL in it but is perfectly safe, because it never includes user text in a SQL command.

    Level 1 is not safe when done directly, but can be made safe by transforming it into a parameterized call to sp_executesql that is then Level 0 Dynamic SQL.

    Level 1 is as high as I have seen any article go, and I have seen many articles incorrectly assert (or imply) that this technique is applicable to and safe for all Dynamic SQL (it isn't).
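    A minimal sketch of that Level 1 → Level 0 transformation (the table, column and variable names here are placeholders, not from the original code):

    ```sql
    DECLARE @UserText varchar(100)
    SET @UserText = 'foo'  -- pretend this came from the client

    -- Unsafe (Level 1): user text spliced directly into the command
    -- SET @sql = 'SELECT Col1 FROM Table1 WHERE Col1 LIKE ''' + @UserText + ''''

    -- Safe (Level 0): user text reaches the command only as a parameter value
    DECLARE @sql nvarchar(2000)
    SET @sql = N'SELECT Col1, Col2, Col3 FROM Table1 WHERE Col1 LIKE @pattern'
    EXEC sp_executesql @sql, N'@pattern varchar(100)', @pattern = @UserText
    ```

    Because the command text is now constant, it also gets a reusable cached plan.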

    The trick to making Levels 2 and 3 safe is called "Keying": instead of using the user text directly in creating the SQL string, it is used as a "Key" into a table of acceptable/allowable strings. For instance, if the user text is supposed to specify a column name from a specific table, then you do something like this:

    Select @SafeColumnName = COLUMN_NAME
    From INFORMATION_SCHEMA.COLUMNS
    Where TABLE_NAME = 'mySpecificTable'
    And COLUMN_NAME = @UsersColumnText

    From this point you can transform the Dynamic SQL to Level 1 or 0, because the dynamic SQL text no longer includes any direct user-supplied text.
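    Once a column name has survived the keying lookup, it can be spliced into the command; wrapping it in QUOTENAME() is a cheap belt-and-braces addition. A sketch, with illustrative names not taken from the original code:

    ```sql
    IF @SafeColumnName IS NOT NULL  -- the lookup matched a real column
    BEGIN
        DECLARE @sql nvarchar(2000)
        SET @sql = N'SELECT ' + QUOTENAME(@SafeColumnName) +
                   N' FROM mySpecificTable WHERE ' + QUOTENAME(@SafeColumnName) +
                   N' LIKE @pattern'
        -- @UserValue is the (hypothetical) user-supplied search value
        EXEC sp_executesql @sql, N'@pattern varchar(100)', @pattern = @UserValue
    END
    ```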

    I have seen some (very few) responses in SQL support forums that discuss Level 2 techniques. I have never seen or heard anything higher on the Internet (though of course, the Internet is a very big place).

    Level 4 can be validated only by parsing the user text to ensure that it stays strictly within the bounds of what it is supposed to be expressing. Level 4 is distinct from Level 5 because validating Level 4 is still reasonably possible (if nonetheless extravagant).

    Level 5 can only be validated through a full SQL parser. Obviously unreasonable for anyone other than the vendor to attempt.

    And all of this should be firewalled behind special-purpose roles/schemas/db user accounts whose rights and permissions have been designed for this specific purpose.

    This list is not 100% inclusive as there are some unusual schemes that do not quite fit into one of these levels and in particular, there is arguably enough room between Level 3 and Level 4 that you could squeeze in one or two more meaningful levels. Hmm, maybe I should write an article...:hehe:


  • rbarryyoung (10/13/2008)


    Hmm, maybe I should write an article...:hehe:

    I was kinda hoping you'd come to that conclusion :P.

    We'll likely be generating a dynamic column list for the final select, as well as an order by, and possibly even some exclusion criteria via dynamic SQL. (Basically customizing how a report will be displayed) I had considered parsing the lists to ensure all columns actually existed before running anything, but it seemed like it'd add quite a bit of extra overhead, and was kinda hoping for an easier way.


  • rbarryyoung (10/13/2008)


    Umm, I'm going to assume that the smiley face means that you are kidding about using cursors here.

    Correct - I have enough problems with performance when some genius decides that 1.5 million row tables are ideal cursor material. Then I have to watch as SQL crawls through the table, row by tortuous row. It's even better when there's an ntext field in the row.

    I'm a little curious about your negative feelings towards Dynamic SQL.

    Personally I think on the whole it's done out of laziness and sloppy coding practices. I know that sometimes it's an invaluable tool but it shouldn't be used first without thought for the alternatives. So if I can exterminate it, I do.

    Someone else wanted more specific detail on the issue:

    CREATE PROCEDURE FindUserDetails(
        @WhereClause varchar(2000),
        @OrderByClause varchar(50)
    )
    AS
    declare @sql varchar(2000)
    set @sql = 'SELECT UserID, CustomerID, FirstName, LastName, EMailAddress, CompanyName, CompanyID, CompanyEmailAddress, CompanyPhone, CompanyLocation
                FROM CustomersDetails
                JOIN CompanyDetails ON CompanyID = UserCompanyID ' + @WhereClause + ' ' + @OrderByClause
    EXEC(@sql)
    GO

    So the call would be:

    EXEC FindUserDetails 'WHERE FirstName LIKE ''%Foo%''', 'ORDER BY CompanyName'

    My problem is that the @WhereClause (and @OrderByClause) can have any of those columns listed, as it's generated on a webserver over which I have no control. Now I can tinker within the stored proc, but I can't change what it's being fed. I might be on a hiding to nothing here, but I thought I'd ask in case anyone has a suggestion.

  • FNS


    Personally I think on the whole it's done out of laziness and sloppy coding practices. I know that sometimes it's an invaluable tool but it shouldn't be used first without thought for the alternatives. So if I can exterminate it, I do.

    So Jaded! Like anything else, Dynamic SQL is a tool. It can be used for good or for harm. There are some things that are infinitely more difficult, or simply impossible without it, and there are times when it's completely extraneous and easily replaced. I'm guessing you've been exposed to a lot of the latter.


  • My take on this is that it is inappropriate for any front-end to build your WHERE clause for you. A procedure can easily be written to take a parameter for each item that could potentially be in your WHERE clause and then to build a dynamic WHERE clause based upon what is passed in. That would allow you to easily verify what is being asked for and to clean up anything that violates any rules (i.e. prohibit a leading '%').

    Even if you couldn't disallow leading '%'s, having the individual parameters passed in could allow you to include all the parts of the WHERE clause except for those that could have those nasty LIKE's. You could load those results into a temp table (hopefully a much smaller set of data than the original set). Then you could perform the LIKE on your temp table to filter it down from there.
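    A sketch of that parameter-per-column approach (table and column names borrowed from the procedure posted earlier; the parameter set is abbreviated for illustration):

    ```sql
    CREATE PROCEDURE FindUserDetails2
        @FirstName varchar(50) = NULL,
        @LastName  varchar(50) = NULL,
        @OrderBy   varchar(30) = 'LastName'
    AS
    DECLARE @sql nvarchar(2000), @where nvarchar(500)
    SET @where = N''
    -- Build the WHERE clause only from known columns; the user's values
    -- are never spliced in, only passed as parameters below
    IF @FirstName IS NOT NULL
        SET @where = @where + N' AND FirstName LIKE @FirstName + ''%'''
    IF @LastName IS NOT NULL
        SET @where = @where + N' AND LastName LIKE @LastName + ''%'''
    SET @sql = N'SELECT UserID, CustomerID, FirstName, LastName
                 FROM CustomersDetails
                 JOIN CompanyDetails ON CompanyID = UserCompanyID
                 WHERE 1 = 1' + @where
             + CASE WHEN @OrderBy = 'FirstName' THEN N' ORDER BY FirstName'
                    ELSE N' ORDER BY LastName' END
    EXEC sp_executesql @sql,
         N'@FirstName varchar(50), @LastName varchar(50)',
         @FirstName = @FirstName, @LastName = @LastName
    GO
    ```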

    Just my 2 cents,

    Scott

  • Dynamic SQL is not my preference but not always the cause of performance issues.

    The type of SQL produced would be a likely cause of performance problems, as you have little control over what is generated. And if you are not in control of what is generated, how can you ensure you have the best indexes in place?

    Also, if the SQL generated is continuously changing, you will lose the benefits of cached query plans.

    If you want to really deal with this you need to provide examples of the generated SQL (best and worst case scenarios).

    If there are only a few examples of parameter variations there is rarely an excuse for not using a well thought out fixed SQL statement.

    SELECT UserID, CustomerID, FirstName, LastName, EMailAddress, CompanyName, CompanyID, CompanyEmailAddress, CompanyPhone, CompanyLocation
    FROM CustomersDetails
    JOIN CompanyDetails ON CompanyID = UserCompanyID
    WHERE (@Firstname = '' OR Firstname LIKE @Firstname + '%')
    ORDER BY
        CASE @orderby WHEN 'fname' THEN FirstName
             ELSE LastName
        END DESC

    Also, security-wise, dynamic SQL is not safe, even with the best will of developers validating parameters.
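    One caveat with the catch-all form above: a single cached plan has to serve every parameter combination, which is part of why that pattern can perform poorly. If your SQL Server version supports it, a RECOMPILE hint trades plan reuse for a plan tailored to each call (a sketch, using the same illustrative query):

    ```sql
    SELECT UserID, CustomerID, FirstName, LastName
    FROM CustomersDetails
    JOIN CompanyDetails ON CompanyID = UserCompanyID
    WHERE (@Firstname = '' OR Firstname LIKE @Firstname + '%')
    ORDER BY CASE @orderby WHEN 'fname' THEN FirstName ELSE LastName END
    OPTION (RECOMPILE)  -- compile a fresh plan for this call's parameters
    ```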
