August 12, 2008 at 9:36 am
Oops, almost forgot: Dynamic execution also provides Turing-completeness.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
August 12, 2008 at 9:37 am
Thanks folks 🙂 glad to hear I'm not way off-base!
sages be damned.
August 12, 2008 at 9:59 am
regarding comment:
"view vs stored proc....you still need to wrap the view up in a stored proc in order to use dynamic selection criteria."
Wrong - use a table-valued function. Much tidier, and as MS says, "a tvf is essentially a parameterized view".
And just in case you think this to be sage advice, I also use SQLCLR as much as possible -:w00t: - much to the shock and horror of almost every DBA I ever meet.
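To make the "parameterized view" point concrete, here's a minimal sketch of an inline TVF (table and column names are hypothetical, just for illustration):

```sql
-- Hypothetical: an inline (single-statement) TVF acting as a parameterized view.
CREATE FUNCTION dbo.fn_OrdersByCustomer (@CustomerID int)
RETURNS TABLE
AS
RETURN
(
    SELECT OrderID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID
);
GO

-- Used exactly like a view, but with the selection criteria pushed in as a parameter:
SELECT OrderID, OrderDate
FROM dbo.fn_OrdersByCustomer(42)
WHERE OrderDate >= '20080101';
```

No wrapper stored proc is needed; the filter travels in as the parameter.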
August 12, 2008 at 11:40 am
riix (8/12/2008)
why not add in diagrams and foreign key constraints? surely there's a perf penalty to consider when every fk is constantly yet needlessly checked by the dbms when the CUD (who cares about R) is done by stored procs and the db is secure against external changes.
The optimiser can use the foreign keys to generate more efficient execution plans in some cases. It's more information about the data for the optimiser and often more information means more optimal queries.
More on that: http://blogs.msdn.com/conor_cunningham_msft/archive/2008/08/07/foreign-keys-are-our-friends.aspx
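A small sketch of one way the optimiser can exploit a foreign key (table names are hypothetical). If the FK is trusted and the column is NOT NULL, a join to the parent table can be eliminated entirely when no parent columns are actually used:

```sql
-- Hypothetical schema: Orders.CustomerID has a trusted, NOT NULL foreign key
-- referencing Customers.CustomerID.
SELECT o.OrderID
FROM dbo.Orders AS o
INNER JOIN dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;
-- The FK guarantees every order has a matching customer, so the optimiser
-- can produce a plan that touches only dbo.Orders (join elimination).
```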
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 12, 2008 at 11:49 am
ryan.leuty (8/12/2008)
Is it better [faster, more reliable, more secure] to send a complex statement to SQL via the web app, or use a sproc?
Faster? Maybe. It depends. SQL caches execution plans for ad hoc SQL as well as for stored procs, but not as efficiently and with less chance of reuse, unless you take pains to parametrise those queries properly.
More reliable? Not sure what you're getting at here.
More secure? For sure. Google "SQL injection" and take a look. Admittedly, you can get much of the same benefit by parametrising the SQL statements rather than concatenating them together, and stored procs don't help at all if you're concatenating queries together inside them.
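To illustrate the parametrisation point, a quick sketch (object names are hypothetical). The first form is injectable; the second treats the input purely as data and gives the cached plan a chance of reuse:

```sql
-- Vulnerable: user input concatenated straight into the statement.
DECLARE @Name nvarchar(100);
SET @Name = N'O''Brien';  -- imagine this came from a web form
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT CustomerID FROM dbo.Customers WHERE LastName = '''
         + @Name + N'''';
EXEC (@sql);  -- breaks (or worse) on crafted input

-- Safer: parameterised with sp_executesql; input is data, plan is reusable.
EXEC sp_executesql
    N'SELECT CustomerID FROM dbo.Customers WHERE LastName = @LastName',
    N'@LastName nvarchar(100)',
    @LastName = @Name;
```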
The other advantage of stored procs is modularity. It means that your DBAs can optimise the stored procs, or even change the underlying table definitions and the procs, and they don't have to touch your web code at all.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 12, 2008 at 11:54 am
riix (8/12/2008)
regarding comment:"view vs stored proc....you still need to wrap the view up in a stored proc in order to use dynamic selection criteria."
Wrong - use a table-valued function. Much tidier, and as MS says, "a tvf is essentially a parameterized view".
Just watch the performance implications. A single-statement tvf is usually fine; the multi-statement ones tend not to perform well if they are returning lots of rows and are part of other queries.
I also use SQLCLR as much as possible -:w00t:
Why?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 12, 2008 at 12:36 pm
In asking for an example of someplace where cursors were a preferred solution, did the reference to TPC benchmarks not count? They don't get those programmers from the local staffing agency, you know ... those folks know their stuff.
But the classic use case where a cursor is defensible is the long-running, massive update that needs to (for any number of legitimate reasons) run in the background as unobtrusively as possible. You can use a T-SQL server-side cursor, which amounts to building a record set and then POPping through it iteratively; or you can build a WHILE loop that does a series of singleton FETCHes instead of popping results off of an opened cursor; or (in really simple cases) you can do your DML repetitively with a ROWCOUNT or TOP limit in effect. In this kind of use case, the cursor's locking can be made totally independent of the table data. That in and of itself can have huge, beneficial concurrency implications.
The reason I'm not going to give you specific examples is that I work for a great and very well-known Internet-centric company, and I am legally prohibited from doing so. But I assure you that use cases such as the one I'm describing above do exist for us, and that cursors are sometimes an admirable solution for them.
As others have noted, you can't draw any conclusions just from the fact that you can rewrite something one way or the other and see performance gains. For very hard problems involving lots of data and different concurrency requirements, I frequently code, optimize, and test both with and without cursors. I am pretty sure I can optimize the code either way reasonably well, and the DML "inside the loop" is usually the same anyway. Based on our results, I'd have to say that people who wear the "cursors are bad" bumper sticker are just assuming a particular class of use cases, because our testing shows otherwise. Like it or not, that's the way I see it.
Cheers,
Chris
August 12, 2008 at 12:40 pm
this thread is great because it proves the editor's point exactly!
there are times that you should use the 'no-no's!
August 12, 2008 at 1:01 pm
The thing is, most people on the thread, just about everyone on the thread in fact, who suggested that using T-SQL cursors is a bad thing didn't say "THOU SHALT NEVER USE A CURSOR." They just said it really depends on the use, and that most of the uses to which cursors are currently applied are inappropriate. I think that's true. How can that be read as a "no-no"? It's a "possibly-possibly".
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
August 12, 2008 at 1:05 pm
that's pretty much my point and the editor's. they are not no-no's.
August 12, 2008 at 1:10 pm
chrisleonard (8/12/2008)
In asking for an example of someplace where cursors were a preferred solution, did the reference to TPC benchmarks not count?
No, because you did not (and still have not) provided any actual examples. Just waving your hand in that general direction is not a demonstration of your point.
But the classic use case where a cursor is defensible is the long-running, massive update that needs to (for any number of legitimate reasons) run in the background as unobtrusively as possible. You can use a T-SQL server side cursor which amounts to building a record set and then POPping through it iteratively; or you can build a WHILE loop that does a series of singleton FETCHes instead of popping results off of an opened cursor
A WHILE loop is the preferred way to do this, not cursors: both because cursors want to make temp tables behind your back and hold onto them, and because WHILE loops offer the possibility of controlling the chunk size in ways that cursors do not. Not that I normally recommend WHILE loops either; normally they are just as bad as cursors, but in this case there is a difference and WHILE is preferred.
; or (in really simple cases) you can do your DML repetitively with a ROWCOUNT or TOP limit in effect.
Sure, but there is nothing about doing DML repetitively in SQL 2005 that requires a cursor.
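A sketch of the batched-DML approach being described, with a controllable chunk size (table and predicate are hypothetical):

```sql
-- Hypothetical batched update: small, repeated transactions instead of one
-- massive statement or a cursor, so locks stay short-lived.
DECLARE @BatchSize int;
SET @BatchSize = 5000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@BatchSize) dbo.BigTable
    SET Processed = 1
    WHERE Processed = 0;

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to do

    -- Optional throttle so the background job stays unobtrusive.
    WAITFOR DELAY '00:00:01';
END
```

The chunk size and delay are the knobs a cursor doesn't give you.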
In this kind of use case, the cursor's locking can be made to be totally independent of the table data. That in and of itself can have huge, beneficial, concurrency implications.
Sorry, I do not follow you on this. Sure would be easier with an example.
The reason I'm not going to give you specific examples is that I work for a great and very well-known Internet-centric company, and I am legally prohibited from doing so.
I work for hundreds of companies and I am restrained by so many NDA's that I could wallpaper my house in them. Nonetheless, I still find that when I offer opinions, I can create examples to support them without violating any of my other obligations. The key is to create definitions and data that have nothing to do with specific instances so much as general principles.
[font="Times New Roman"]-- RBarryYoung[/font], [font="Times New Roman"] (302)375-0451[/font] blog: MovingSQL.com, Twitter: @RBarryYoung[font="Arial Black"]
Proactive Performance Solutions, Inc. [/font][font="Verdana"] "Performance is our middle name."[/font]
August 12, 2008 at 1:18 pm
Just watch the performance implications. A single-statement tvf is usually fine; the multi-statement ones tend not to perform well if they are returning lots of rows and are part of other queries
Hmm .. if I have a situation where a multi-statement tvf is *required*, then it's most likely not something that I could've done in a view. Also, anything could be "part of other queries", so this is noise. Finally, the entire idea of having a stored proc around just to "dynamically filter" the output of a view is as silly as COBOL.
Above quote is kinda what Ryan and I discussed earlier - sage-based FUD (fear, uncertainty, doubt) generation.
(Ryan - close your eyes!).
And why SQLCLR? Cuz it works? Cuz in our DBs it performs more than adequately for our business needs? Cuz it's easier to create/manage? Cuz I put everything (sprocs, trigs, funcs, etc.) into a source-code repository? Cuz I can version everything along with my apps? Cuz I can even use cobol.net to create stored procs?:D
August 12, 2008 at 1:31 pm
riix (8/12/2008)
Hmm .. if I have a situation where a multi-statement tvf is *required*, then it's most likely not something that I could've done in a view.
Indeed. I have, however, seen people in the past creating a 'multi-statement' tvf with only a single statement in it. They are two different forms of functions, but sometimes they get confused.
Also, anything could be "part of other queries" so this is noise.
Not really. What I meant is that a query of the form
SELECT <column list> FROM dbo.fn_ComplexMultiStatementFunction(@SomeParameters)
isn't an issue or a concern. When said function gets included in a more complex query and joined in to other tables, performance problems sometimes show. This is because the multi-statement tvf doesn't have statistics available for the data returned. Hence the optimiser doesn't know how many rows will be returned or what the distribution of values in those rows is. That cardinality inaccuracy can result in non-optimal query plans being created and slower-than-expected queries.
Does that explain some of the reason behind the statement?
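To make the distinction concrete, a sketch of the two forms (object names are hypothetical). The inline form is expanded into the outer query like a view, so the optimiser sees the base tables' statistics; the multi-statement form fills a table variable, which carries no statistics at all:

```sql
-- Inline (single-statement) TVF: expanded into the calling query, so the
-- optimiser sees the underlying tables and their statistics.
CREATE FUNCTION dbo.fn_ActiveOrders_Inline (@CustomerID int)
RETURNS TABLE
AS
RETURN
(
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID AND Status = 'Active'
);
GO

-- Multi-statement TVF: results land in a table variable first. No statistics,
-- so the optimiser guesses a tiny row count when this is joined to other tables.
CREATE FUNCTION dbo.fn_ActiveOrders_Multi (@CustomerID int)
RETURNS @Result TABLE (OrderID int, OrderDate datetime)
AS
BEGIN
    INSERT @Result (OrderID, OrderDate)
    SELECT OrderID, OrderDate
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID AND Status = 'Active';
    RETURN;
END;
GO
```

Standalone, both behave fine; it's the join into a larger query where the missing statistics on the multi-statement form can wreck the plan.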
Above quote is kinda what Ryan and I discussed earlier - sage-based FUD (fear, uncertainty, doubt) generation.
I'm sorry the statement didn't meet your standards of approval. What would you prefer in the future?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
August 12, 2008 at 1:43 pm
Gail, I can tell you do softer style martial arts because your response is much more a redirect of his energies than mine would have been. Very nice.
riix, try this, execute a query involving a JOIN to a view, a single-statement UDF and a multi-statement UDF where each returns a couple of hundred rows. Measure the responses and look at the execution plans. Then tell me how much FUD is being generated around the multi-statement UDF. Jeez.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
August 12, 2008 at 1:46 pm
Grant Fritchey (8/12/2008)
riix, try this, execute a query involving a JOIN to a view, a single-statement UDF and a multi-statement UDF where each returns a couple of hundred rows. Measure the responses and look at the execution plans. Then tell me how much FUD is being generated around the multi-statement UDF. Jeez.
Make that a couple hundred thousand, to make it a valid test. Anything's fast with a hundred rows.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability