March 21, 2006 at 12:06 pm
Any ideas on the best way to delete old records from a table that is heavily used (lots of inserts and selects)?
This table holds log information (everything done in the application is logged to this table, and the application will not run if it cannot read and write this log).
A plain DELETE statement takes a long time even when the DB is idle (more than an hour to delete 3 million rows).
If the DB is in use, the application hangs (probably because of locks and/or high disk I/O).
The table has approximately 30 million rows.
Maybe some query hint would help? Or ... ?
March 21, 2006 at 12:24 pm
If it's just a log table, why not use "Truncate Table" at an off-peak time? If it's hard for you to access the system during off-peak hours, create a scheduled job to truncate the table. To get a realistic estimate of how long the truncate will take in your environment with 30 million rows, create a test table with a similar number of sample rows and truncate that. That will give you a better idea of how long you need to lock the table while truncating the data.
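A minimal sketch of the suggestion above, assuming a log table named YourLogTable (a hypothetical name). Note that TRUNCATE TABLE takes no WHERE clause, so it removes every row, not just old ones:

```sql
-- TRUNCATE deallocates whole data pages rather than logging each row,
-- so it is fast and minimally logged -- but it empties the ENTIRE table.
-- Only suitable when none of the log entries need to be kept.
TRUNCATE TABLE YourLogTable;
```

Because it takes a schema-modification lock, this is still best scheduled for an off-peak window, as suggested above.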
Bhusahn
March 21, 2006 at 12:31 pm
What is the criterion for deletion? A date/time column older than a certain time period?
Try deleting in smaller batches using the deletion criteria and using SET ROWCOUNT
-- Delete the next 10,000 qualifying rows
SET ROWCOUNT 10000
DELETE FROM YourTable
WHERE YourDateColumn < @DeletionCriteria
SET ROWCOUNT 0  -- reset so later statements in the session are unaffected
Run that as a scheduled job every minute and you'll get rid of 600,000 rows per hour, with each delete only locking the table for a short duration. If locking/contention is still an issue, tweak the rowcount and run frequency.
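The same batching idea can also run as a single self-contained loop. This is a sketch, not the poster's exact script: it assumes SQL Server 2005 or later (DELETE TOP replaces the deprecated SET ROWCOUNT approach), a hypothetical table YourTable with a datetime column YourDateColumn, and a 30-day retention window:

```sql
-- Batched delete loop: removes old rows 10,000 at a time so each
-- statement holds locks only briefly and the transaction log stays small.
DECLARE @cutoff datetime;
SET @cutoff = DATEADD(day, -30, GETDATE());  -- keep the last 30 days (assumption)

WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM YourTable
    WHERE YourDateColumn < @cutoff;

    IF @@ROWCOUNT = 0 BREAK;      -- nothing left older than the cutoff

    WAITFOR DELAY '00:00:01';     -- brief pause lets other sessions get locks in
END
```

An index on YourDateColumn matters here: without one, each batch scans the table to find qualifying rows, which reintroduces the long locks the batching is meant to avoid.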