September 19, 2011 at 2:26 pm
If you have space available on the server you're performing this on, sometimes it's easier to simply create a new table, select the data that you want to keep into the new table from the old, create the indexes you need, then truncate/blow away your old table.
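A minimal sketch of that copy-and-swap approach. All object names (dbo.BigTable, KeepFlag, Id) are illustrative placeholders, not from the original post:

```sql
-- Copy only the rows you want to keep into a new table.
SELECT *
INTO dbo.BigTable_New
FROM dbo.BigTable
WHERE KeepFlag = 1;          -- your "rows to keep" condition

-- Recreate the indexes you need on the new table.
CREATE CLUSTERED INDEX IX_BigTable_New_Id ON dbo.BigTable_New (Id);

-- Swap: drop the old table and rename the new one into place.
DROP TABLE dbo.BigTable;
EXEC sp_rename 'dbo.BigTable_New', 'BigTable';
```

SELECT ... INTO is minimally logged under the SIMPLE or BULK_LOGGED recovery model, which is what makes this cheaper than logging every deleted row.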
Other than that you're stuck with a WHILE loop - you could also implement a CHECKPOINT every x number of times through the looping to help keep the transaction log growth in check...
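The WHILE loop plus CHECKPOINT idea might look something like this sketch (table and condition names are placeholders; the batch size is arbitrary):

```sql
DECLARE @batch int;
SET @batch = 1;

WHILE @batch > 0
BEGIN
    DELETE TOP (5000)
    FROM dbo.BigTable
    WHERE SomeCondition = 1;   -- illustrative condition

    SET @batch = @@ROWCOUNT;

    -- Under the SIMPLE recovery model, a CHECKPOINT between batches
    -- lets log space be reused instead of growing for the whole delete.
    CHECKPOINT;
END
```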
______________________________________________________________________________
Never argue with an idiot; they'll drag you down to their level and beat you with experience.
September 20, 2011 at 6:28 am
tim.hulse (9/19/2011)
GSquared (9/19/2011)
tim.hulse (9/19/2011)
kokila.kondasamy (3/23/2011)
Hi, kindly find the sample code below... hope this helps. If there is a better way of performing the delete operation, kindly let us know.
DECLARE @count int;
SET @count = 0;

WHILE @count <= 2000
BEGIN
    DELETE FROM tablename
    WHERE id IN (SELECT TOP 200 id FROM tablename WHERE condition);

    SET @count = @count + @@ROWCOUNT;
    PRINT @count;

    WAITFOR DELAY '00:00:05';
END
Regards
kokila K
Great advice, did exactly what I needed, thanks!
That will delete a specific number of records. 2000 in the sample given here.
SELECT 1;
WHILE @@ROWCOUNT > 0
    DELETE TOP (1000) FROM dbo.MyTable WHERE (my where clause);
Something like that will keep deleting, 1000 at a time (which allows for reduced lock time), until there are no records that match the Where clause.
I'm sure you're right, but it looks to me like it would delete 200*2000 rows - deleting 200 on every loop. Then again, as my wife points out, I'm usually wrong...
Nope. @count is initialized with a value of 0, and then the @@rowcount value is added to it in each iteration of the loop. First iteration @count = 200 (or less if the number of rows affected is less than that). Second iteration @count = 200 + @@rowcount, and so on.
So, it will either increment by @@rowcount each time until it passes 2000 (10 iterations), or, if there are fewer than 2000 matching rows, @@rowcount eventually returns 0, @count stops increasing, and the loop never exits.
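A variant of the original loop that also exits as soon as a batch deletes nothing would avoid that forever-loop case. This is a sketch only; tablename and condition are the same placeholders as above:

```sql
DECLARE @count int, @rows int;
SET @count = 0;
SET @rows = 1;

WHILE @count <= 2000 AND @rows > 0
BEGIN
    DELETE FROM tablename
    WHERE id IN (SELECT TOP 200 id FROM tablename WHERE condition);

    SET @rows = @@ROWCOUNT;   -- 0 when nothing left to delete
    SET @count = @count + @rows;
    PRINT @count;
END
```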
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
September 20, 2011 at 9:08 am
GSquared (9/20/2011)
You're right, I think that I've just proven your tagline true! It's never good to speed-read code, I guess.
October 17, 2013 at 11:25 am
You can use DBCC TRACEON (610) to enable minimal logging for bulk operations, but there is no way to avoid logging completely.
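One narrow exception worth noting: if you happen to be removing every row in the table, TRUNCATE TABLE deallocates pages rather than logging individual row deletes, so it is far cheaper on the log. A sketch, with a placeholder table name:

```sql
-- Only valid when no rows need to be kept and no foreign key
-- references the table; logs page deallocations, not per-row deletes.
TRUNCATE TABLE dbo.BigTable;
```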