September 16, 2013 at 7:53 am
Are there any recommendations for indexing a table with a very high churn rate?
I have a table (in an inherited DB) that is basically used for processing temporary data.
There is a service constantly running that populates this table (200,000-400,000 inserts per day).
There are a number of services running that retrieve and use the data. They flag the data as processed. (200,000-400,000 updates per day).
There is also an additional service that deletes old processed data from the table. (200,000-400,000 deletes per day).
There are indexes on the table but because of the high data churn rate, they are always completely fragmented. They are probably causing more harm than good.
So are there any recommendations for indexing on tables with such high insert, update, delete and read activity?
September 16, 2013 at 8:58 am
The most important thing I would suggest is monitoring how the statistics are maintained. Most likely you're relying on the automatic update alone. Depending on the type of data and how it's distributed in the indexes, that sampled auto-update may be grossly inadequate. You may need to run your own full-scan statistics update on a regular basis.
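A rough sketch of what that monitoring and manual update might look like. The table name dbo.WorkQueue is just a placeholder for your queue table, and sys.dm_db_stats_properties assumes a reasonably recent SQL Server build:

```sql
-- dbo.WorkQueue is a hypothetical name; substitute your own table.
-- Check when each statistic was last updated and how many rows have
-- been modified since (modification_counter) to see if the automatic
-- update is keeping up with the churn.
SELECT s.name AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.WorkQueue');

-- Force a full-scan update instead of relying on the sampled auto-update.
UPDATE STATISTICS dbo.WorkQueue WITH FULLSCAN;
```

You could schedule the FULLSCAN update in a job that runs between the heavy insert/delete windows, if the workload has any quiet periods.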
As to the fragmentation of the indexes, unless you're running Enterprise Edition and can do an online rebuild of the indexes, I don't think you have much choice there. Even a fragmented index can still be helpful, and fragmentation isn't always a big deal. Measure performance before and after a defrag to see exactly how much impact you're actually suffering.
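A minimal sketch of measuring fragmentation first and only then defragmenting (again assuming a placeholder table name dbo.WorkQueue):

```sql
-- Measure fragmentation before deciding whether a defrag is worth it.
SELECT i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID(N'dbo.WorkQueue'),
         NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id;

-- REORGANIZE is always an online operation, on any edition.
ALTER INDEX ALL ON dbo.WorkQueue REORGANIZE;

-- An online rebuild requires Enterprise Edition:
-- ALTER INDEX ALL ON dbo.WorkQueue REBUILD WITH (ONLINE = ON);
```

Note that on a very small or constantly churning table the fragmentation numbers may be misleading, which is why the before-and-after performance measurement matters more than the percentage itself.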
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
September 17, 2013 at 1:53 am
Thanks for the feedback. I will try that.
Viewing 3 posts - 1 through 3 (of 3 total)