July 22, 2003 at 11:31 pm
We have 60 GB tables and we deleted almost 10 GB from them. Since then the queries have been running slowly. Because it's an online system, we ran DBCC INDEXDEFRAG on the tables and also UPDATE STATISTICS WITH FULLSCAN. Still the queries are slow. The queries are dynamic queries from a web application. Any ideas what else can be done?
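For reference, the maintenance we ran was along these lines (the database, table, and index names here are only placeholders):

-- Defragment the index online, then refresh statistics with a full scan
DBCC INDEXDEFRAG (MyDatabase, 'dbo.BigTable', 'IX_BigTable_Main')
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN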
July 23, 2003 at 6:18 am
Can you check whether the dynamic queries use the correct indexes to find the records that need to be deleted? Do you delete them in a single batch, or split the deletion into many small batches that each delete a portion of the records at a time? Can you replace the dynamic queries with stored procedures?
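If more big deletes are coming, something along these lines (SQL Server 2000 style, with a purely hypothetical table and filter) keeps each batch small so the log and the locks stay manageable:

-- Delete in batches of 10,000 rows until nothing is left to delete
SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable WHERE ArchiveDate < '20030101'
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0  -- reset so later statements are not capped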
July 23, 2003 at 9:45 am
Just to clarify: the deletes have already been done with no issues. After the deletes, the dynamic SQL queries from the web frontend have started to run slowly. We cannot change them to stored procedures for now.
July 23, 2003 at 11:10 am
Run a sample of the dynamic SQL queries that come through the front end in Query Analyzer. Look at the execution plan and see what's taking the longest to execute.
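For example, wrap the sample query like this in Query Analyzer (the query here is just a placeholder) and turn on Show Execution Plan from the Query menu to see the graphical plan:

SET STATISTICS IO ON
SET STATISTICS TIME ON

-- paste one of the dynamic queries here, e.g.
SELECT OrderID, OrderTotal FROM dbo.BigTable WHERE CustomerID = 42

SET STATISTICS IO OFF
SET STATISTICS TIME OFF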
July 25, 2003 at 9:37 pm
If you do a massive delete, you may want to reindex the table, or at least mark it for recompile so the cached plans get rebuilt.
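A reindex and a forced recompile would look something like this (the table name is a placeholder):

-- Check fragmentation, rebuild all indexes on the table, then flag cached plans for recompile
DBCC SHOWCONTIG ('dbo.BigTable')
DBCC DBREINDEX ('dbo.BigTable')
EXEC sp_recompile 'dbo.BigTable'

Keep in mind that DBCC DBREINDEX takes locks on the table, so on an online system it is best run off-hours.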
July 30, 2003 at 7:21 am
Yes, it sounds as if you need to run UPDATE STATISTICS or reindex. The execution plans that were created for the queries are now seriously out of date.
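One quick way to see how stale the statistics really are is STATS_DATE (the table name here is just an example):

-- When were statistics last updated for each index on the table?
SELECT name AS index_name, STATS_DATE(id, indid) AS stats_last_updated
FROM sysindexes
WHERE id = OBJECT_ID('dbo.BigTable')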
Good luck
Dr. Peter Venkman: Generally you don't see that kind of behavior in a major appliance.
Patrick
Speak of the wolf, and you will see his tail