August 13, 2008 at 7:22 am
Good debate and glad to see it's not degenerating.
I have used cursors and temp tables for one-off stuff. Honestly, for some fairly small, one-time tasks it's been faster for me to just loop through things. I hate the sp_MSforeach procedures, so I usually roll my own for small stuff like that.
However, I'd tend to agree with Barry that you don't need to do this most of the time. I don't know that I'd go as far as to say never, which was the point of the editorial, but you can get around it if you look for alternatives and work at it.
My last comment would be to say that I've been putting together solutions for well over a decade with SQL Server and never had issues putting my objects in source control.
August 13, 2008 at 9:38 am
Tom Garth (8/12/2008)
GSquared - You are right about not leaving loose ends. I was referring to scripts that were on their way to being procs. I use the GOTOs to skip already tested parts, or not yet ready for prime-time pieces. But along with various print and select statements that I may be using, they have to come out before release to production.
I use block comment markers for that. It's much easier for me to visually notice that half the code is green than to notice a GOTO buried in the midst of paragraphs of code.
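A minimal sketch of what I mean, with hypothetical table and column names:

```sql
-- Debug-only statements fenced off with block comment markers instead of GOTO:
/*
PRINT 'Row count before update:';
SELECT COUNT(*) FROM dbo.Orders;   -- hypothetical table, for testing only
*/

-- The production code stays uncommented:
UPDATE dbo.Orders
SET Status = 'Processed'
WHERE Status = 'Pending';
```

In an editor with syntax coloring, the commented-out test block shows up as a solid green region, which is hard to miss before release.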
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
August 13, 2008 at 9:49 am
rbarryyoung (8/12/2008)
Wow. This obtuse defense of cursors and RBAR has more hand-waving than a Michael Jackson concert. Practically every other post here has statements to the effect that "sometimes cursors are the best solution", starting with the original article itself, and yet none of these claims has been backed up with an example, or a pointer to a cursor-based SQL procedure that is the best solution for SQL Server in its current release. Not one.
Example I: I had a database at a prior job where daily notifications needed to be e-mailed to clients based on data in certain tables. Each client would get customized, individual data, based on what they had requested. After reviewing and testing a number of potential solutions, I decided on using SQL 2005 Database Mail. Since sp_send_dbmail can't be called on a whole table at once, I had a cursor step through the query and execute that proc one row at a time.
I tested inserting data into the queue tables directly, without calling the proc, but that caused more problems than it solved in this case.
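A minimal sketch of that pattern, with hypothetical table and column names (sp_send_dbmail and its parameters are real; the profile name is an assumption):

```sql
-- sp_send_dbmail sends one message per call, so a cursor steps through
-- the notification rows and calls the proc once per client.
DECLARE @Email varchar(255), @Body varchar(max);

DECLARE CurMail CURSOR LOCAL FAST_FORWARD FOR
    SELECT EmailAddress, NotificationText   -- hypothetical columns
    FROM dbo.ClientNotifications            -- hypothetical table
    WHERE SendDate = CAST(GETDATE() AS date);

OPEN CurMail;
FETCH NEXT FROM CurMail INTO @Email, @Body;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Notifications',    -- assumed mail profile
        @recipients   = @Email,
        @subject      = 'Daily Notification',
        @body         = @Body;

    FETCH NEXT FROM CurMail INTO @Email, @Body;
END;

CLOSE CurMail;
DEALLOCATE CurMail;
```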
Example II: The IS dept here regularly needs up-to-the-minute copies of production databases copied to a testing server, to test new code on current data.
I wrote a proc that takes a source database name and a target database name as input parameters. It copies the most recent full backup from the production server to the test server, restores it, and renames it as the target database. Then it checks whether there's been a diff backup since the last full backup and does the same thing with that. Then it checks for log backups since the last diff backup, restores those one at a time, and follows with a final restore to bring the database into a usable state.
I use a cursor to step through the log backups, since the Restore command doesn't work on a set of backup files.
(The whole thing also uses a large chunk of dynamic SQL.)
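The log-restore loop can be sketched roughly like this. The msdb history tables (backupset, backupmediafamily) are real; the variable values and database names are placeholders, and the real proc also handles the full and diff restores:

```sql
DECLARE @SourceDB sysname  = N'Production';      -- hypothetical names
DECLARE @TargetDB sysname  = N'ProductionTest';
DECLARE @LastDiffDate datetime = '2008-08-13';   -- taken from the diff restore step
DECLARE @LogFile nvarchar(260), @sql nvarchar(max);

-- Each log backup must be restored individually, in order, so a cursor
-- walks the backup history.
DECLARE CurLogs CURSOR LOCAL FAST_FORWARD FOR
    SELECT bmf.physical_device_name
    FROM msdb.dbo.backupset bs
    JOIN msdb.dbo.backupmediafamily bmf
      ON bs.media_set_id = bmf.media_set_id
    WHERE bs.database_name = @SourceDB
      AND bs.type = 'L'                          -- log backups only
      AND bs.backup_start_date > @LastDiffDate
    ORDER BY bs.backup_start_date;

OPEN CurLogs;
FETCH NEXT FROM CurLogs INTO @LogFile;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'RESTORE LOG ' + QUOTENAME(@TargetDB)
             + N' FROM DISK = N''' + @LogFile + N''' WITH NORECOVERY;';
    EXEC (@sql);
    FETCH NEXT FROM CurLogs INTO @LogFile;
END;

CLOSE CurLogs;
DEALLOCATE CurLogs;

-- Final restore to bring the target database online:
SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@TargetDB) + N' WITH RECOVERY;';
EXEC (@sql);
```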
In both of those examples, a cursor seems to me to be the best solution. Do you disagree? What would you use to accomplish those goals?
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
August 13, 2008 at 9:57 am
riix (8/12/2008)
Grant, you seem more stressed than Gail; the point of discussion was Gail's earlier statement: "Just watch the performance implications. A single-statement TVF is usually fine; the multi-statement ones tend not to perform well if they are returning lots of rows and are part of other queries"
and this was in answer to my earlier point that a TVF is preferable to using the combination of a view and a controlling stored procedure to "dynamically filter" the view.
That was the context of the discussion.
Now, do you really feel that running a view that returns zillions of huge rows into a stored procedure that then selects just a few of them (and only a few fields from those) is more effective than writing a TVF that returns just what is needed?
(Emphasis added)
Actually, both do the same thing, just in different ways.
When a proc calls a view, it doesn't run as a nested sequence. It doesn't run the whole view, then run the proc on top of it. It just runs the proc as if the view's select were written into the proc.
To test this, create a view that selects the complete contents of a table. Wrap it in a proc that selects just one row, based on the clustered index (PK if possible). Run the proc. Then run the select directly. Make sure to check stats while you do both.
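A sketch of that test, with hypothetical objects (assuming SomeTable has a clustered PK on ID):

```sql
-- A view over the whole table, no filter:
CREATE VIEW dbo.vAllRows AS
    SELECT * FROM dbo.SomeTable;
GO

-- A proc that filters the view down to one row on the clustered PK:
CREATE PROC dbo.GetOneRow (@ID int)
AS
    SELECT * FROM dbo.vAllRows WHERE ID = @ID;
GO

SET STATISTICS IO ON;

EXEC dbo.GetOneRow @ID = 42;                 -- through the view
SELECT * FROM dbo.SomeTable WHERE ID = 42;   -- direct against the table

-- Expectation per the post: both show the same logical reads, because the
-- optimizer expands the view's SELECT into the proc's query plan.
```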
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
August 14, 2008 at 9:32 am
The more doors we close to certain technologies, the more limited the toolset we have for solving problems. I know there are at least three ways to do most everything, but a few nanoseconds slower over 1,000 executions costs far less than the extra days spent learning to do it in a way that is "perfect".
As I have said before, find a way to do it and do it. Something that gets the right answer a little slower is still right. With internal server speeds running well past the speed of thought, it is time to realize that some little technique does not matter.
And 'views' being a problem? If you have even the most simplistic view of BI you have to laugh at people who think views are "wrong".
Miles...
Not all gray hairs are Dinosaurs!
August 14, 2008 at 1:02 pm
Miles Neale (8/14/2008)
The more doors we close to certain technologies, the more limited the toolset we have for solving problems. I know there are at least three ways to do most everything, but a few nanoseconds slower over 1,000 executions costs far less than the extra days spent learning to do it in a way that is "perfect". As I have said before, find a way to do it and do it. Something that gets the right answer a little slower is still right. With internal server speeds running well past the speed of thought, it is time to realize that some little technique does not matter.
And 'views' being a problem? If you have even the most simplistic view of BI you have to laugh at people who think views are "wrong".
Miles...
The problem with the philosophy espoused here is that you usually end up paying for the shortcuts later. Writing a query for maximum performance, even if it takes 30 seconds longer to write, means you won't have to spend a year rewriting the database later, when all of the slow, unstable, deadlock-prone code in it is causing the company to lose money.
The speed issues with cursors are the least of the problems they cause. Lock contention is much more important. That's one example of why "good enough" sometimes isn't.
The other fallacy here is that slow code is somehow faster to write. Which of these takes longer to type:
update dbo.Table1
set Col1 = Table2.Col2
from dbo.Table2
where Table1.Col3 = Table2.Col1
or
declare CurUpd cursor local fast_forward for
    select Col2, Col1
    from dbo.Table2
declare @Col2 int, @Col1 int
open CurUpd
fetch next from CurUpd into @Col2, @Col1
while @@fetch_status = 0
begin
    update dbo.Table1
    set Col1 = @Col2
    where Col3 = @Col1
    fetch next from CurUpd into @Col2, @Col1
end
close CurUpd
deallocate CurUpd
On the point of views: the problem with many views is that they overdo what's needed. They are heavily misused by lazy database devs who don't understand what they are doing to the server.
Here's an example:
create view Tables1and2and3
as
select * -- Assume a dozen or more columns from each table.
from Table1
full outer join Table2
on Table1.Col3 = Table2.ColA
full outer join Table3
on Table2.ColX1 = Table3.ColB;
go
create proc MyProc
(@Parameter_in int)
as
select Col1, ColA
from Tables1and2and3
where Col2 = @Parameter_in
The view selects every column from all three tables, and does nothing to them. The proc then uses the view to select two of those columns, based on a Where criterion on one other column. In the proc, Table3 isn't needed, but the join to it is still going to be done because of the view.
I've seen this, and much worse, hundreds of times. Views that select from a dozen tables, including hundreds of columns, all to make writing four or five procs a little easier. But the procs will only include two or three of the columns, from two of the tables, that have a direct PK-FK relationship.
It's a shortcut for the developer, at the cost of killing performance.
I've taken dozens of procs, removed the view from them, written in the actual tables needed, and reduced run times from multiple seconds to fractions of a second.
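For the Tables1and2and3 example above, the de-viewed proc might look like this. The join type is kept as written in the view; the point is that Table3, with all its columns, drops out of the query entirely:

```sql
-- MyProc rewritten against the base tables, with no view in the middle.
CREATE PROC MyProcRewritten    -- hypothetical name, to avoid clashing with the original
    (@Parameter_in int)
AS
SELECT Table1.Col1, Table2.ColA
FROM Table1
FULL OUTER JOIN Table2
    ON Table1.Col3 = Table2.ColA
WHERE Table1.Col2 = @Parameter_in;
```

The optimizer now only has two tables and two columns to deal with, instead of three tables and every column in all of them.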
Can views be correct? Definitely yes. The problem is not the few views that are used correctly. The problem is the multitudes of views that break things.
You mention saving a few nanoseconds per thousand runs. That's not the situation anyone here is dealing with. I'm having to fix procs that take minutes to run, and by applying a few standard practices, getting rid of cursors and views, fixing WHERE clauses so they don't call scalar UDFs inline, etc., taking them down to sub-second run times.
If we were obsessing over a few thousandths of a nanosecond, that would be different. But what we're talking about here is crashed servers, applications that time out, deadlocks losing valuable data, etc.
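The UDF-in-the-WHERE-clause problem looks something like this (hypothetical table, column, and function names):

```sql
-- Slow: the scalar UDF runs once per row and hides the column from the
-- optimizer, so an index on OrderDate can't be seeked.
SELECT OrderID
FROM dbo.Orders
WHERE dbo.fn_TruncateToDate(OrderDate) = '2008-08-13';

-- Faster: rewrite the predicate as a sargable range on the bare column,
-- so an index on OrderDate can be used.
SELECT OrderID
FROM dbo.Orders
WHERE OrderDate >= '2008-08-13'
  AND OrderDate <  '2008-08-14';
```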
Claiming otherwise is just an attempt to set up a straw man.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon