December 28, 2007 at 3:59 pm
I'm not sure about that, Jeff.
I have come across situations where the firehose cursor is faster than the loop.
I pinned it down to the loop running many separate queries, whereas the firehose cursor grabs a lock and blasts through the records, happily blocking other users while it does so.
Works fine on small datasets, but as you say, if there is a set-based way of achieving the same result, use the set-based way.
December 28, 2007 at 4:26 pm
What would you have the cursor or While loop do in a test? I think they'll come out the same, but I'm not sure, so, you know me, I'll do a test. I just need to know what the code should do, because I'm so set-based oriented I'm not sure I could come up with a decent test for a cursor vs. While loop example...
--Jeff Moden
Change is inevitable... Change for the better is not.
December 29, 2007 at 2:45 am
The problem with code like this is that the max variable length will hit you when you least expect it. There is also the bigger problem of a delimiter or separator appearing in the data, which will cause the stored procedure to throw up. Users are good at this!
I have one example, currently with 177,000 rows in the inserted table, which uses code like this:
DECLARE @MaxRowCount int,
        @RowNumber int
/*
Create a table to hold the data from the cursor's select statement.
You could check for there being only one row to process here and skip the table creation code. Experience has shown it's not worth the grief.
*/
CREATE TABLE #SomeTableName (RowNumber int IDENTITY(1,1), ..........)

--Insert the rows from the cursor's select statement
INSERT INTO #SomeTableName (............)
SELECT .......... FROM ..........

SELECT @MaxRowCount = MAX(RowNumber) FROM #SomeTableName
SET @RowNumber = 1

WHILE @RowNumber <= @MaxRowCount
BEGIN
    --do your processing
    SELECT ................. FROM #SomeTableName WHERE RowNumber = @RowNumber
    SET @RowNumber = @RowNumber + 1
END
Surprisingly, code like this really flies. Don't use table variables, or you will be back at square one!
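For anyone who wants to try that pattern, here's a minimal, self-contained sketch of it; all table and column names below are made up for illustration:

DECLARE @MaxRowCount int,
        @RowNumber int,
        @CustomerID int

--Work table with an IDENTITY column to number the rows
CREATE TABLE #Work (RowNumber int IDENTITY(1,1) PRIMARY KEY, CustomerID int NOT NULL)

INSERT INTO #Work (CustomerID)
SELECT CustomerID FROM dbo.Customers --hypothetical source table

SELECT @MaxRowCount = MAX(RowNumber) FROM #Work
SET @RowNumber = 1

WHILE @RowNumber <= @MaxRowCount
BEGIN
    SELECT @CustomerID = CustomerID FROM #Work WHERE RowNumber = @RowNumber
    --...do your per-row processing with @CustomerID here...
    SET @RowNumber = @RowNumber + 1
END

DROP TABLE #Work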
December 29, 2007 at 8:29 am
John Beggs (12/8/2006)
All of this seems like a lot of work to replace a cursor where a row by row process is needed.
Why is it that people don't seem to understand that cursors are only bad if you use them where a set-based solution could have been used?
What's more, while I have no issue with using a table variable or temp table when needed, I challenge you to make either of them outperform a properly formed cursor. FAST_FORWARD, anyone?
I absolutely agree... my problem with most people's code is that they make it so you MUST use RBAR. Instead of working out a set-based solution, they'll have some RBAR GUI code lying around that's written to handle precisely 1 row. Then they'll use that same single-row proc for a batch process of thousands of rows, instead of having a "sister" proc that will handle all of the rows in the batch.
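To make the "sister" proc idea concrete, here's a hypothetical sketch; the proc names, table, and columns are all made up:

--Single-row version: fine for a GUI that really does handle one row at a time.
CREATE PROCEDURE dbo.SetCustomerStatus
    @CustomerID int,
    @Status char(1)
AS
    UPDATE dbo.Customers
    SET Status = @Status
    WHERE CustomerID = @CustomerID
GO

--Set-based "sister" version: updates the whole batch in one pass
--(here driven by a hypothetical staging table) instead of calling the proc per row.
CREATE PROCEDURE dbo.SetCustomerStatusBatch
AS
    UPDATE c
    SET c.Status = b.Status
    FROM dbo.Customers c
    JOIN dbo.CustomerStatusBatch b ON b.CustomerID = c.CustomerID
GO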
For control loops where each loop processes sets of rows instead of RBAR, I see no problem with using a Fast_Forward cursor or a Temp/While Loop. Even then, though, people still seem to think that certain things just aren't possible using set-based code and they inappropriately revert to ISAM/RBAR thinking.
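For completeness, a minimal sketch of that FAST_FORWARD control-loop style; the source table and the processing step are placeholders:

DECLARE @CustomerID int

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT CustomerID FROM dbo.Customers --hypothetical source

OPEN cur
FETCH NEXT FROM cur INTO @CustomerID

WHILE @@FETCH_STATUS = 0
BEGIN
    --...process a set of rows keyed by @CustomerID here...
    FETCH NEXT FROM cur INTO @CustomerID
END

CLOSE cur
DEALLOCATE cur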
--Jeff Moden
Change is inevitable... Change for the better is not.
December 30, 2007 at 7:09 pm
I agree with Jeff - set-based is certainly best. The idea of using Service Broker in the background is pretty good. Often the RBAR stuff, if absolutely necessary, can be done later, in which case Service Broker (which I've never actually used), or populating a batch table with a batch ID and the record keys and having a timed SQL Server Agent job process the rows, is good enough.
Often such things arise on the server because it's "easier" than writing a little service or app to do the processing, particularly if it's a simple, small (and will stay that way) job that runs once a day. RBAR in a trigger, though, sounds like it could happen a LOT, and thus a better set-based approach is warranted... I would be interested in seeing the performance metrics of such code in a trigger, comparing cursors calling a stored proc vs. building the dynamic SQL and calling a stored proc. The proc would have to do something nasty, like write a text file to disk :-). The proc could be called 1000 times due to 1000 rows being inserted into the table on which the trigger is based. We could pretend the file being written to disk has the arbitrary ID as its filename, with the contents of the rest of the table row within it, for some other system to collect...
I'm too lazy today, though, to code it up :-)
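In the meantime, here's a hypothetical sketch of the batch-table idea, which keeps the trigger itself set-based and leaves the slow per-row work (like writing files) to a scheduled job; all object names are made up:

--Queue table populated by the trigger, drained later by a SQL Server Agent job.
CREATE TABLE dbo.FileExportQueue (
    QueueID int IDENTITY(1,1) PRIMARY KEY,
    RowKey int NOT NULL, --key of the row that fired the trigger
    QueuedAt datetime NOT NULL DEFAULT GETDATE(),
    Processed bit NOT NULL DEFAULT 0
)
GO

CREATE TRIGGER trg_SomeTable_Insert ON dbo.SomeTable
AFTER INSERT
AS
    --One set-based insert, no matter how many rows came in.
    INSERT INTO dbo.FileExportQueue (RowKey)
    SELECT RowKey FROM inserted
GO

The Agent job would then pick up rows WHERE Processed = 0, write the files, and flag them as processed.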
January 2, 2008 at 3:40 am
I have to say, this 'solution' is actually worse than having a cursor. It would have the same performance implications (because the reason cursors are 'bad' is that row-based processing is slow).
Therefore it's just another way to cause the same problem, and it's a badly researched article. The reason it's worse than a cursor is the 8000-character limit, which will cause large insert batches to fail.
Nice.
Atlantis Interactive - SQL Server Tools
My blog
Why I wrote a sql query analyzer clone
January 2, 2008 at 6:06 am
I agree with Matt Whitfield.
The while loop concept is easier than using a cursor, and this method is good conceptually, but it is not practically feasible when you are dealing with VLDBs.
Regards
Shashi Kant Chauhan
January 2, 2008 at 9:34 pm
The real point here is that you should not be using any form of RBAR in a trigger... no matter how you do it, calling a RBAR proc from a trigger is an insane thing to do... the proc should be rewritten to handle sets of data instead of the slothful agony of single row processing. :-)
--Jeff Moden
Change is inevitable... Change for the better is not.
April 29, 2008 at 1:02 am
Hi,
Tell me a way to retrieve records row by row without using the cursor concept. I don't like fetching records using a cursor. What's the best solution, please?
April 29, 2008 at 7:02 am
Poornima,
I think the basic concept is to avoid doing row by row operations when you can -- you should reflect on your need to do this. But, if you want a row by row operation and you don't like cursors, you can just dump your query result into a temporary table with an AUTOID column and increment a counter in your loop, taking a new record from your temporary table each time. You can even fancy it up with a CTE and ROW_NUMBER, if you're using SQL Server 2005.
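For example, a minimal sketch of the 2005 flavour with a CTE and ROW_NUMBER (the table and columns here are made up):

--Number the rows, stage them, then walk them with a counter.
;WITH Numbered AS (
    SELECT RowNumber = ROW_NUMBER() OVER (ORDER BY CustomerID),
           CustomerName
    FROM dbo.Customers --hypothetical source table
)
SELECT RowNumber, CustomerName
INTO #Rows
FROM Numbered

DECLARE @RowNumber int, @MaxRow int, @CustomerName varchar(50)
SELECT @MaxRow = MAX(RowNumber) FROM #Rows
SET @RowNumber = 1

WHILE @RowNumber <= @MaxRow
BEGIN
    SELECT @CustomerName = CustomerName FROM #Rows WHERE RowNumber = @RowNumber
    --...process @CustomerName here...
    SET @RowNumber = @RowNumber + 1
END

DROP TABLE #Rows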
Ion
April 29, 2008 at 11:38 pm
Hi Ion,
I got the answer to my query through this forum. Shaalini had already posted a thread about this, and Jeff had given the answer. Your site is very useful, and the immediate replies make us learn more. Thanks a lot...
April 29, 2008 at 11:46 pm
Ummm, then why did you post that you wanted to process rows one at a time? We've all said that's a bad thing... what is it that you're trying to do?
--Jeff Moden
Change is inevitable... Change for the better is not.
April 30, 2008 at 12:04 am
Hi,
Yes, I need a row by row process only.
For example:
Consider the table stud:

rno   name    dept
1     ram     cse
2     radha   ece

Consider another table studSubj:

sdno   rno   mark1   sub
578    1     80      Maths
579    1     98      Physics

Here I want to duplicate the rows of stud, and then, using the newly generated rno, I have to create the same records in the studSubj table. That is, in the stud table we have rno 1, and for rno 1 we have 2 records in studSubj. If I generate duplicate records, they will get some new rno like 4 or 5. For rno 4, I need to have the same records as for rno 1.
This is my entire need.
April 30, 2008 at 9:10 am
See the following URL... almost identical situation...
http://www.sqlservercentral.com/Forums/Topic491969-149-1.aspx#bm492576
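For reference, a minimal sketch of one set-based way to do that kind of parent/child duplication (not necessarily what the linked thread does): MERGE with an OUTPUT clause, which can see source columns and so can map old keys to new identity values. It assumes stud.rno is an IDENTITY column and needs SQL Server 2008 or later:

DECLARE @Map TABLE (old_rno int, new_rno int)

--The ON 1 = 0 condition never matches, so every source row is inserted,
--and OUTPUT captures the old rno next to the newly generated one.
MERGE INTO stud AS tgt
USING (SELECT rno, name, dept FROM stud WHERE rno = 1) AS src
    ON 1 = 0
WHEN NOT MATCHED THEN
    INSERT (name, dept) VALUES (src.name, src.dept)
OUTPUT src.rno, inserted.rno INTO @Map (old_rno, new_rno);

--Duplicate the child rows, pointing them at the new parent keys.
INSERT INTO studSubj (rno, mark1, sub)
SELECT m.new_rno, ss.mark1, ss.sub
FROM studSubj ss
JOIN @Map m ON m.old_rno = ss.rno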
--Jeff Moden
Change is inevitable... Change for the better is not.
May 1, 2008 at 11:03 pm
Hi Jeff,
Thanks a lot for your reply. I learnt a lot from your site. A very good and useful forum...