December 10, 2009 at 4:49 pm
Hey forumites, if you have a simple UPDATE statement in a stored procedure that updates millions of rows at once, and you want to separate the updates into batches of, say, 10,000 at a time to prevent locking, how do you go about this?
December 10, 2009 at 6:45 pm
I wrote the article Deleting Large Number of Records. Take a look at the code for SQL Server 2005; you should be able to use that as a starting point for a batched UPDATE as well. Part of it will be determined by your WHERE clause: as the updates occur, will the previous batch(es) be excluded from the update?
Hope this helps.
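For anyone following along, the pattern from that article adapted for an UPDATE might look roughly like this. This is a minimal sketch only: dbo.MyTable and SomeColumn are placeholder names, and it assumes the WHERE clause excludes rows that have already been updated, so each pass naturally picks up the next batch.

```sql
-- sketch of a batched UPDATE (SQL Server 2005+, which supports UPDATE TOP);
-- table/column names are placeholders, not from the article
declare @batchsize int
set @batchsize = 10000

while 1 = 1
begin
    update top (@batchsize) dbo.MyTable
    set SomeColumn = 'new value'
    where SomeColumn <> 'new value'  -- already-updated rows drop out of the WHERE

    -- stop when a pass updates nothing
    if @@ROWCOUNT = 0
        break
end
```

Because the WHERE clause excludes already-updated rows, the loop terminates on its own once every qualifying row has been touched.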
December 11, 2009 at 9:56 am
Thanks Lynn for your response to my question. To answer your question: yes, the previous updates will be excluded from the later updates. I will read your article now and might ask follow-up questions if need be. Thanks again for your time.
December 11, 2009 at 10:46 am
Thank you for the feedback. We'll try to answer any questions you may have.
December 11, 2009 at 11:01 am
Hey Lynn, love the article. I am assuming that replacing the DELETE statement with my UPDATE statement will suffice in my case. I am also not concerned about the transaction logs at this point, so would it be safe to disregard that part of the code? Another question: while I'm updating the large table, it will still be accessible to other processes, right?
December 11, 2009 at 11:28 am
The only blocking you should see would be the result of attempting to access rows that are being updated in the current batch, as long as the locks haven't escalated to a table lock. Doing it in batches, however, should allow the locks to release and let other processes access the table while the next batch is getting ready to run. If you aren't going to take t-log backups between updates, realize that the t-log will grow, and the growth could be excessive; managing t-log growth is part of the reason the t-log backups are in that process. Without the t-log backup, you may want to put a short delay in its place.
And finally, yes, replace the DELETE with an UPDATE statement.
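To illustrate that last point, here is a rough sketch of the loop with both options shown: a log backup between batches (for the full recovery model) and a short delay in its place. All names here (MyDb, the backup path, dbo.MyTable, SomeColumn) are placeholders, not anything from the article.

```sql
-- sketch only; database, path, table, and column names are placeholders
while 1 = 1
begin
    update top (10000) dbo.MyTable
    set SomeColumn = 'new value'
    where SomeColumn <> 'new value'

    if @@ROWCOUNT = 0
        break

    -- full recovery model: a log backup between batches keeps the t-log
    -- from growing excessively during the run
    -- backup log MyDb to disk = N'D:\backups\MyDb_log.trn'

    -- if you skip the log backup, a short pause still lets the batch's
    -- locks release so other sessions can get at the table
    waitfor delay '00:00:02'
end
```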
December 11, 2009 at 12:27 pm
Hey Lynn, Thank you very much for your help and advice. It is very much appreciated. Happy Holidays btw !
December 11, 2009 at 12:46 pm
Merry Christmas and Happy New Year to you as well. May the holidays be good to you.
December 14, 2009 at 8:52 am
The code below will allow you to perform your update in chunks of whatever size you want. Correct me if I'm wrong, but I suppose the idea is to have your update touch one batch of rows at a time while taking row locks via the ROWLOCK hint. Updates across massive numbers of records are very dangerous, especially when you're updating, say, order records for a busy website whose end users are constantly trying to write to the records you're updating. Locking is inevitable if you don't find a way to do the updates in chunks. I learned this one the hard way.
create table #loopingTable
(
    id int identity(1,1)
    , custId int
)

declare @rowId int
set @rowId = 1

declare @numRecsToProcess int
set @numRecsToProcess = 100 -- number of records to process per batch

-- these are the records you plan on updating
insert into #loopingTable (custId)
select distinct custId
from customer

while exists ( select 1 from #loopingTable where id >= @rowId )
begin
    update c
    set custname = 'whatever'
    from customer c with (rowlock) -- try to minimize locking
    join #loopingTable tt
        on tt.custId = c.custId
    where tt.id between @rowId and @rowId + @numRecsToProcess - 1 -- this is the throttle
    -- the "- 1" matters: BETWEEN is inclusive, so without it the boundary
    -- row would be updated in two consecutive batches

    set @rowId = @rowId + @numRecsToProcess
end

drop table #loopingTable