September 30, 2009 at 8:57 pm
I have a table which has 19 million rows and is 2.6TB in size.
I need to update a varbinary(max) column to NULL for certain criteria (documenttypeid IN (4, 10, 11, 12, 13)).
Any recommendations on how best to do this? It should be done in batches so as not to take extended locks.
cheers.
October 1, 2009 at 2:37 am
You could use something similar to this:
DECLARE @rwcnt int
DECLARE @BATCHSIZE int

SET @rwcnt = -1
SET @BATCHSIZE = 100

WHILE @rwcnt <> 0
BEGIN
    SET ROWCOUNT @BATCHSIZE

    UPDATE VeryLargeTable
    SET varbinaryColumn = NULL
    WHERE documenttypeid IN (4, 10, 11, 12, 13)
        AND varbinaryColumn IS NOT NULL

    SET @rwcnt = @@ROWCOUNT

    SET ROWCOUNT 0
END
If the recovery model is full, I recommend taking log backups between iterations to keep the log from filling, or switching the database to simple recovery.
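On SQL Server 2005 and later, SET ROWCOUNT is deprecated for data modification statements, so a sketch of the same batch loop using UPDATE TOP might be preferable (same table and column names assumed):

```sql
DECLARE @rwcnt int
SET @rwcnt = -1

WHILE @rwcnt <> 0
BEGIN
    -- TOP limits the batch size without relying on SET ROWCOUNT
    UPDATE TOP (100) VeryLargeTable
    SET varbinaryColumn = NULL
    WHERE documenttypeid IN (4, 10, 11, 12, 13)
        AND varbinaryColumn IS NOT NULL

    SET @rwcnt = @@ROWCOUNT

    -- In simple recovery, a checkpoint lets log space be reused between batches
    CHECKPOINT
END
```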
Hope this helps
Gianluca
-- Gianluca Sartori
October 1, 2009 at 1:25 pm
cheers, thanks.
That's what I had too, but in a test I did, the last bit, where it's checking for rows that are not null, seemed to take too long.
October 1, 2009 at 1:42 pm
Do you have any indexes on the tables?
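If you're on SQL Server 2008, a filtered index covering exactly the rows the batch loop still needs to touch can make that NOT NULL check cheap. A sketch, assuming the table and column names from the earlier posts:

```sql
-- Filtered index (SQL Server 2008+): only rows still needing the update
-- are indexed, so each batch seeks straight to the remaining work instead
-- of scanning the whole 19-million-row table. Rows drop out of the index
-- as their varbinaryColumn is set to NULL.
CREATE NONCLUSTERED INDEX IX_VeryLargeTable_Pending
ON VeryLargeTable (documenttypeid)
WHERE documenttypeid IN (4, 10, 11, 12, 13)
  AND varbinaryColumn IS NOT NULL
```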
October 1, 2009 at 3:00 pm
BrentMc (10/1/2009)
cheers, thanks. That's what I had too but it's the last bit where it's checking for rows that are not null that seems to take too long in a test I did.
Try this instead...
AND varbinaryColumn > ''
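The `> ''` comparison implicitly converts the empty string to 0x, so it skips NULLs and zero-length values alike. An arguably more explicit way to write the same predicate uses DATALENGTH:

```sql
-- DATALENGTH returns NULL for NULL values and 0 for a zero-length 0x,
-- so this filters out both, just like the > '' comparison above
AND DATALENGTH(varbinaryColumn) > 0
```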
--Jeff Moden
Change is inevitable... Change for the better is not.
October 12, 2009 at 1:42 am
This is a case where a cursor might be handy.
DECLARE @b varbinary(max)

DECLARE curYak CURSOR FOR
SELECT VarBinaryColumn
FROM VeryLargeTable
WHERE DocumentTypeID IN (4, 10, 11, 12, 13)
    AND VarBinaryColumn IS NOT NULL
FOR UPDATE OF VarBinaryColumn

OPEN curYak

-- A default cursor is forward-only, so only FETCH NEXT is allowed;
-- fetching INTO a variable keeps the rows from streaming to the client
FETCH NEXT FROM curYak INTO @b

WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE VeryLargeTable
    SET VarBinaryColumn = NULL
    WHERE CURRENT OF curYak

    FETCH NEXT FROM curYak INTO @b
END

CLOSE curYak
DEALLOCATE curYak
N 56°04'39.16"
E 12°55'05.25"