November 9, 2015 at 6:43 am
I have a 3rd-party database with an interface table that has 5 image data fields in it, about 240K rows total. I need to delete about 195K rows. Even with no one on the system, deletes of 300 rows can take over an hour. One approach I've considered: take a database backup, create a table of only the surviving records, then truncate the production table and insert back from that new table. I'm a little fearful of how long that may also take. I can't control the bad design of this table (it also has 6 GUID-defined fields). Has anyone tried an approach with this type of table?
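Roughly, the copy-and-truncate idea would look like this. All names are placeholders (XXX stands for the interface table, KeepCriteria for whatever identifies the surviving rows), and TRUNCATE only works if no foreign keys reference the table; try it on a restored backup first:

```sql
-- Copy the ~45K survivors out (minimally logged under SIMPLE/BULK_LOGGED recovery)
SELECT *
INTO   dbo.XXX_Keep
FROM   dbo.XXX
WHERE  KeepCriteria = 1;          -- placeholder: whatever marks rows to keep

-- Fast: deallocates pages rather than logging row-by-row deletes
TRUNCATE TABLE dbo.XXX;

-- TABLOCK hint allows minimal logging on the insert back
INSERT INTO dbo.XXX WITH (TABLOCK)
SELECT * FROM dbo.XXX_Keep;

DROP TABLE dbo.XXX_Keep;
```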
November 9, 2015 at 6:57 am
You want to do it in batches, in a loop. Since 300 rows takes an hour, you definitely don't want your batch size that big. Try somewhere between 25 and 50, testing manually. Needless to say, get (and save) a list of IDs so you only delete the ones in the list. When you find a size that completes in no more than 15 seconds, use that as your batch size. You want a loop like this for your deletes:
DELETE TOP (50) b
FROM XXX b
INNER JOIN list a ON b.[key] = a.[key];

WHILE @@ROWCOUNT > 0
BEGIN
    WAITFOR DELAY '00:00:15';   -- 15-second breather between batches

    DELETE TOP (50) b
    FROM XXX b
    INNER JOIN list a ON b.[key] = a.[key];
END
That way the users don't get hammered and neither does the log file. Big deletes are time consuming because the server has to be able to undo them if needed, so it makes a big log file thingy, to use the technical term. This way makes a lot of little thingies instead of one gigantic thingy.
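For the "save a list of IDs" step, one way to build that list table (names here are placeholders; DeleteCriteria stands for whatever identifies the ~195K rows to remove, and [key] is assumed to be a single-column key):

```sql
-- Capture the victims' keys once, up front
SELECT [key]
INTO   dbo.list                  -- the "list" table the delete loop joins to
FROM   dbo.XXX
WHERE  DeleteCriteria = 1;       -- placeholder condition

-- Index it so each batched delete's join is cheap
CREATE UNIQUE CLUSTERED INDEX IX_list ON dbo.list ([key]);
```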
November 9, 2015 at 7:01 am
Oops, need a commit in there...
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRAN;

    DELETE TOP (50) b
    FROM XXX b
    INNER JOIN list a ON b.[key] = a.[key];

    SET @rows = @@ROWCOUNT;     -- capture before COMMIT, which resets @@ROWCOUNT to 0

    COMMIT;

    IF @rows > 0
        WAITFOR DELAY '00:00:15';
END