February 20, 2004 at 6:45 am
Hi,
The following procedure has been running for 14 hours. Does this seem reasonable? All I want to do is update the columns where LINK_ID matches in each table. It should be straightforward, but it is taking far too long - is my database hung? Any suggestions to make these queries run faster?
The Navtech table has 23M rows and NavtechQ4_2003 about 21M. The other tables are smaller: navtechlnk has about 0.7M rows and navmaxfips about 1.6M.
The LINK_ID columns are indexed for each table.
I am running this procedure from Query Analyzer.
The database is in simple recovery mode.
The server has 2GB of memory, and SQL Server is using about 1.5GB of it.
update sde.conman.navtechQ4_2003 set modified = '2/18/04', username = 'INIT'
print 'updating from navtechlnk'
update t set
nhs = t1.nhs,
strahnet = t1.strahnet,
priority = t1.priority,
ppp1 = t1.ppp1,
ppp2 = t1.ppp2,
ppp3 = t1.ppp3,
ppp4 = t1.ppp4,
ppp5 = t1.ppp5,
ppp6 = t1.ppp6,
ppp7 = t1.ppp7,
isp_found = 1
from sde.conman.navtechq4_2003 t
inner join sde.MIKE.navtechlnk t1 on (t1.link_id = t.link_id)
print 'Done updating from navtechlnk'
print 'updating stfips from navtech...'
update t set
stfips = t1.stfips
from sde.conman.navtechq4_2003 t
inner join sde.conman.navtech t1 on (t1.link_id = t.link_id)
print 'Done updating stfips from navtech...'
print 'updating stfips from navmaxfips...'
update t set
stfips = t1.stfips
from sde.conman.navtechq4_2003 t
inner join sde.conman.navmaxfips t1 on (t1.link_id = t.link_id)
February 20, 2004 at 10:30 am
Even in simple recovery mode, each update is logged as one long transaction. Once it completes, the next checkpoint will truncate the log, but until then every row change is being written to it.
I'd break the update into batches of maybe 100 or 1,000 rows at a time, driven by a loop such as WHILE EXISTS (SELECT * FROM ...). That might help; I've had loops like this run in a fraction of the time of the single big update.
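To make the batching idea concrete, here is a sketch of one way to do it for the first update, using the table names from the post above. It assumes isp_found starts at 0 on every row, so that column doubles as the "already processed" marker:

```sql
-- Process 1000 rows per statement; in simple recovery the log
-- space used by each batch can be reused after the next checkpoint.
set rowcount 1000

while exists (
    select *
    from sde.conman.navtechq4_2003 t
    inner join sde.MIKE.navtechlnk t1 on t1.link_id = t.link_id
    where t.isp_found = 0
)
begin
    update t set
        nhs = t1.nhs,
        strahnet = t1.strahnet,
        -- (remaining columns as in the original statement)
        isp_found = 1              -- marks the row as done
    from sde.conman.navtechq4_2003 t
    inner join sde.MIKE.navtechlnk t1 on t1.link_id = t.link_id
    where t.isp_found = 0          -- skip rows already processed
end

set rowcount 0                     -- restore the default
```

Checking @@ROWCOUNT after the UPDATE and breaking when it returns 0 would also work, and avoids re-running the EXISTS scan on each pass.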
February 22, 2004 at 3:18 pm
Hi Steve,
I am still having trouble with this query. I tried a number of different ways to use the WHILE loop, but it still runs for over 10 hours with only 100K records updated.
Could you elaborate on your solution with some additional code?
I tried the following. ObjectID is a unique key, 1..N, for each row. On a smaller test set this works, but against the large tables it still drags on overnight without finishing.
declare @upper int
declare @lower int
set @upper = 1000
set @lower = 1
print current_timestamp
WHILE (SELECT count(*) from sde.conman.navtechq4_2003 where objectid >= @lower and objectid <= @upper) > 0
BEGIN
update t set
stfips = t1.stfips
from sde.conman.navtechq4_2003 t
inner join sde.conman.navtech t1 on (t1.link_id = t.link_id)
where t.objectid >= @lower and t.objectid <= @upper  -- <= keeps objectid = @upper in this batch
set @lower = @upper + 1
set @upper = @upper + 1000
END
print current_timestamp
GO
Thanks for your help. This forum has been great to me over the last few months.
-Mike
February 22, 2004 at 6:37 pm
February 22, 2004 at 7:38 pm
My trouble seems to be how to do it 1000 rows at a time.
Does the SET ROWCOUNT go inside the loop, or is the loop no longer necessary?
I put SET ROWCOUNT 1000 in my loop and it seems to run a lot faster on my test table of 1M records. It takes about 3 minutes instead of 20.
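For reference, the usual shape of that pattern on SQL Server 2000 sets ROWCOUNT once, before the loop, and uses @@ROWCOUNT to decide when to stop. This sketch assumes rows that still need processing can be recognized by stfips being null, and that the source table's stfips values are themselves non-null - neither of which may hold for this data:

```sql
set rowcount 1000                  -- caps every following statement at 1000 rows

while 1 = 1
begin
    update t set
        stfips = t1.stfips
    from sde.conman.navtechq4_2003 t
    inner join sde.conman.navtech t1 on t1.link_id = t.link_id
    where t.stfips is null         -- only rows not yet updated

    if @@rowcount = 0 break        -- the last pass found nothing left
end

set rowcount 0                     -- restore the default
```

Without a predicate that excludes already-updated rows, the loop would keep hitting the same 1000 rows forever.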
Thanks for your help!
-Mike
February 22, 2004 at 8:56 pm
February 22, 2004 at 9:22 pm
The statement you provided updates only 1000 rows of my table.
I am new to SQL, so I am not sure what else I need to include.
I do appreciate your help.
-Mike
February 22, 2004 at 9:33 pm
February 22, 2004 at 9:54 pm
I've got a WHERE clause that I increment in a loop to select ranges of the primary key:
where t1.objectid >= @lower and t1.objectid <= @upper
The problem is that my table is so large (23M records) that the query still runs too slowly.
With 100K records your query takes 7 seconds. With 1M records it takes about 3 minutes. With 2M records it is only about 30% complete after 11 minutes. At that rate the 23M-row table won't finish in a reasonable time - I'd even take overnight as good enough.
Any additional thoughts?
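One way to keep the cost per batch flat at 23M rows is to walk the objectid ranges directly instead of re-counting a window on every pass. This is a sketch, assuming objectid is indexed and runs roughly 1..N (gaps are harmless here - they just make some batches smaller, and they never end the loop early):

```sql
declare @lower int, @batch int, @max int
set @lower = 0
set @batch = 10000
select @max = max(objectid) from sde.conman.navtechq4_2003

while @lower < @max
begin
    -- Each pass touches one key range; with an index on objectid
    -- this is a seek, so the cost per batch stays flat as @lower grows.
    update t set
        stfips = t1.stfips
    from sde.conman.navtechq4_2003 t
    inner join sde.conman.navtech t1 on t1.link_id = t.link_id
    where t.objectid > @lower
      and t.objectid <= @lower + @batch

    set @lower = @lower + @batch
end
```

Because the loop is bounded by max(objectid) computed once up front, every row is visited exactly once and each batch commits as its own small transaction.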
February 22, 2004 at 10:14 pm