November 28, 2007 at 9:22 am
Hello, I'm using the following T-SQL to insert qualifying records into the table ProspectResearch. There are slightly more than 13,000 records to be added in my test data. It is currently taking about 10 minutes to insert 300 records and I'm not sure why it is so slow. There are no triggers on the receiving table. Any suggestions to speed this up? Thanks much.
WHILE Exists
(SELECT ConsID as AccountID FROM #tmpFinancial
where #tmpFinancial.PFIType='Real Estate Value'
and ConsID not in(Select AccountID from ProspectResearch where ResearchCategoryCode='3001' and Company_Foundation='CMC Added'))
BEGIN
DECLARE @AccountID int
DECLARE @ResearchID int
DECLARE rs INSENSITIVE CURSOR FOR
(SELECT ConsID as AccountID FROM #tmpFinancial
where #tmpFinancial.PFIType='Real Estate Value'
and ConsID not in(Select AccountID from ProspectResearch where ResearchCategoryCode='3001' and Company_Foundation='CMC Added'))
DECLARE rs2 INSENSITIVE CURSOR FOR
SELECT max(ResearchID)+1 from ProspectResearch
OPEN rs
OPEN rs2
FETCH NEXT FROM rs INTO @AccountID
FETCH NEXT FROM rs2 INTO @ResearchID
WHILE @@fetch_status = 0
BEGIN
INSERT ProspectResearch(
Accountid,
ResearchID,
ResearchCategoryCode,
ResearchTypeCode,
ResearchSourceCode,
AddUser,
ChangeUser,
Company_Foundation)
VALUES(@AccountID,
@ResearchID,
'3001',
'3001',
'3001',
'CMC',
'CMC',
'CMC Added')
FETCH NEXT FROM rs INTO @AccountID
FETCH NEXT FROM rs2 INTO @ResearchID
END
CLOSE rs
DEALLOCATE rs
CLOSE rs2
DEALLOCATE rs2
END
GO
November 28, 2007 at 9:30 am
A couple of observations.
1. Cursors are slow.
2. The check on @@FETCH_STATUS: shouldn't this be done after the first fetch? I think the fetch loop will never end.
Allen
November 28, 2007 at 9:44 am
I am confused... The below is taken directly from SQL Books on-line & it looks like I'm doing what they are saying.
Is there something else I could use besides a cursor? We used to do these in Access & I'm trying to move them all to SQL so all of this is new to me.
USE pubs
GO
DECLARE authors_cursor CURSOR FOR
SELECT au_lname FROM authors
WHERE au_lname LIKE "B%"
ORDER BY au_lname
OPEN authors_cursor
-- Perform the first fetch.
FETCH NEXT FROM authors_cursor
-- Check @@FETCH_STATUS to see if there are any more rows to fetch.
WHILE @@FETCH_STATUS = 0
BEGIN
-- This is executed as long as the previous fetch succeeds.
FETCH NEXT FROM authors_cursor
END
CLOSE authors_cursor
DEALLOCATE authors_cursor
GO
November 28, 2007 at 9:50 am
Why use a cursor at all? Cursors usually destroy performance. The fact that you're running 13,000 queries against ProspectResearch doesn't help either.
Using an actual set-based option, this should be done in a second or two (all 13,000 rows). My test runs in under 1 sec.
Try this on:
--just to know how fast it is
declare @g datetime
select @g=getdate()
declare @nextID as int
select @nextid=max(rid)+1 from testpivot
--set up the default values
create table #tempresearch(
researchID int identity(1,1),
accountid int,
ResearchCategoryCode varchar(10) default '3001',
ResearchTypeCode varchar(10) default '3001',
ResearchSourceCode varchar(10) default '3001',
AddUser varchar(10) default 'CMC',
ChangeUser varchar(10) default 'CMC',
Company_Foundation varchar(10) default 'CMC Added')
--seed the researchID identity to the correct starting value
dbcc checkident ('#tempresearch', reseed, @nextid)
--populate the temp table
insert #tempresearch(accountid)
select top 13000 N from dbo.tally
--move the data into the real table
INSERT ProspectResearch(
Accountid,
ResearchID,
ResearchCategoryCode,
ResearchTypeCode,
ResearchSourceCode,
AddUser,
ChangeUser,
Company_Foundation)
select
Accountid,
ResearchID,
ResearchCategoryCode,
ResearchTypeCode,
ResearchSourceCode,
AddUser,
ChangeUser,
Company_Foundation
from #tempresearch
--clean up
drop table #tempresearch
--check performance
select datediff(ms,@g,getdate())
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 28, 2007 at 9:52 am
Hi dbaltazar
From BOL
Returns the status of the last cursor FETCH statement issued against any cursor currently opened by the connection.
Your code does a FETCH (the second FETCH, on rs2) that looks like it will never run out of data, so it will always return @@FETCH_STATUS = 0.
I think I'm right on that - hopefully if not someone else will jump in.
Regarding another way of doing it, I would use:
INSERT INTO ... SELECT FROM ...
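Applied to the tables from the original post, that suggestion might look roughly like this. This is only a sketch: the column names are taken from the first post above, and ROW_NUMBER() assumes SQL Server 2005 or later.

```sql
-- Set-based replacement for the cursor loop (sketch; names from the
-- original post, ROW_NUMBER() requires SQL Server 2005+)
DECLARE @nextID int
SELECT @nextID = ISNULL(MAX(ResearchID), 0) FROM ProspectResearch

INSERT ProspectResearch
    (AccountID, ResearchID, ResearchCategoryCode, ResearchTypeCode,
     ResearchSourceCode, AddUser, ChangeUser, Company_Foundation)
SELECT
    f.ConsID,
    @nextID + ROW_NUMBER() OVER (ORDER BY f.ConsID),  -- key increments per row
    '3001', '3001', '3001', 'CMC', 'CMC', 'CMC Added'
FROM #tmpFinancial f
WHERE f.PFIType = 'Real Estate Value'
  AND f.ConsID NOT IN
      (SELECT AccountID FROM ProspectResearch
       WHERE ResearchCategoryCode = '3001'
         AND Company_Foundation = 'CMC Added')
```

One pass over #tmpFinancial, one INSERT, no per-row fetches.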
November 28, 2007 at 10:07 am
Matt beat me to it... using a cursor for this is a surefire way to make code run slow. Lose the cursor and do it in the set-based fashion that he proposed.
Replace the "Cursor" help in Books Online with a single page that says, "Cursors = RBAR on steroids." 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
November 28, 2007 at 2:07 pm
🙂 This works now! Thanks Matt & everyone else. I've been testing it, I understand it and it runs quickly now. Thanks again!!!
November 28, 2007 at 2:13 pm
Not trying to be a smart guy here...
What did you learn by this?
--Jeff Moden
Change is inevitable... Change for the better is not.
November 28, 2007 at 2:17 pm
I learned cursors stink ... and I learned the hard way what is meant by RBAR. :laugh:
Actually, though, this is really going to help. In the 2 weeks I've been trying to do this I couldn't figure out how to increment my key and add data from another table at the same time. That was the best I could come up with. Didn't have a problem with speed until this week. My background is FoxPro & Access, and as a company we're trying to utilize SQL more, so I'm figuring it out as I go. This is just wonderful!
November 28, 2007 at 3:05 pm
Perfect. Thanks for the feedback.
Just to be sure... you know what the "Tally" table is and how to build one?
--Jeff Moden
Change is inevitable... Change for the better is not.
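For reference, one common way to build a Tally table is a sketch along these lines; the row count and source tables are arbitrary choices, not anything prescribed in this thread.

```sql
-- Build a Tally (numbers) table: 1 through 11000 (sizing is arbitrary).
-- Cross-joining two system tables just provides enough source rows.
SELECT TOP 11000 IDENTITY(int, 1, 1) AS N
INTO dbo.Tally
FROM master.dbo.syscolumns sc1
CROSS JOIN master.dbo.syscolumns sc2

-- Clustered PK makes range scans on N fast
ALTER TABLE dbo.Tally
    ADD CONSTRAINT PK_Tally_N PRIMARY KEY CLUSTERED (N)
```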
November 28, 2007 at 3:16 pm
dbaltazar (11/28/2007)
🙂 This works now! Thanks Matt & everyone else. I've been testing it, I understand it and it runs quickly now. Thanks again!!!
You're welcome...Happy to help!
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
November 29, 2007 at 2:39 pm
Yep I know what a tally table is. I've been reading the forums in my spare time (what there is of that). 🙂
November 29, 2007 at 2:40 pm
So let me ask a follow up question. Under what circumstance would a cursor be good? Small data set?
November 29, 2007 at 4:28 pm
dbaltazar (11/29/2007)
So let me ask a follow up question. Under what circumstance would a cursor be good? Small data set?
When you need to send emails.
Or upload/report data from/to a file.
If you have multiple e-mail addresses or file names in the list, you should process them one by one.
Cursors are perfect for it.
_____________
Code for TallyGenerator
November 29, 2007 at 7:45 pm
dbaltazar (11/29/2007)
So let me ask a follow up question. Under what circumstance would a cursor be good? Small data set?
Another possible reason is if you have to write code to "step through" all the databases in an Instance. They're good for "control loops" that control set-based operations. When they are used for RBAR ops, they just drag the world down for performance.
Even then, I don't use a cursor... I'll use a Temp table with an Identity column and a counter/loop. Same idea as a fire-hose cursor but more "fun".
--Jeff Moden
Change is inevitable... Change for the better is not.
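The temp-table-and-counter pattern described in that last post can be sketched roughly as follows; the database-list example and the PRINT placeholder are illustrative only.

```sql
-- Control loop without a cursor: temp table with an IDENTITY column,
-- stepped through by a counter (sketch; the per-row "work" is a PRINT)
CREATE TABLE #DBs (RowNum int IDENTITY(1,1), DBName sysname)

INSERT #DBs (DBName)
SELECT name FROM master.dbo.sysdatabases

DECLARE @i int, @max int, @db sysname
SELECT @i = 1, @max = MAX(RowNum) FROM #DBs

WHILE @i <= @max
BEGIN
    SELECT @db = DBName FROM #DBs WHERE RowNum = @i
    PRINT @db          -- placeholder for the real per-database work
    SET @i = @i + 1
END

DROP TABLE #DBs
```

Same idea as a fire-hose cursor, but with no cursor declarations to manage.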