August 18, 2005 at 10:56 am
I have a database with over 900 tables. I want to monitor and defragment all the tables and indexes where necessary, and run the process as a scheduled job every Sunday.
To do this I am planning to write a stored procedure that runs DBCC SHOWCONTIG, saves the result in a table, scans the result table for Scan Density, and rebuilds any index where Scan Density < 80%.
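For the rebuild step of a plan like this, the SQL Server 2000-era command is DBCC DBREINDEX. A minimal sketch, with a placeholder table and index name:

```sql
-- Hypothetical example: rebuild one fragmented index identified by the
-- SHOWCONTIG analysis. Table and index names are placeholders.
-- The third argument is the fill factor to use for the rebuild.
DBCC DBREINDEX ('dbo.Orders', 'ix_orders_custid', 90)
```

Passing an empty string for the index name rebuilds all indexes on the table, which is convenient when the whole table scores below the threshold.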
The first problem is how to save the DBCC SHOWCONTIG output into a table. In Query Analyzer, this DBCC output appears on the Messages tab, not in the Results grid, so "INSERT INTO result_table EXEC ..." does not work.
The next question is the best way to rebuild a fragmented table that has no clustered index. One can drop and re-create the table, or create a clustered index and then drop it. Which is better?
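The create-then-drop approach can be sketched as follows; the table and column names are placeholders, not from the original post:

```sql
-- Hypothetical example: compact a heap (a table with no clustered index)
-- by building a temporary clustered index, which physically reorders and
-- compacts the data pages, then dropping it to return to a heap.
CREATE CLUSTERED INDEX ix_defrag_tmp ON dbo.MyHeapTable (SomeColumn)
DROP INDEX dbo.MyHeapTable.ix_defrag_tmp
```

One caveat worth weighing: both creating and dropping a clustered index force every nonclustered index on the table to be rebuilt, so on a heavily indexed table this pays that cost twice.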
Thanks
August 18, 2005 at 11:07 pm
While your idea is valid, I think you might not be considering the *big* picture.
Read the following article, this is good info:
http://www.quest-pipelines.com/newsletter-v5/0204_B.htm
There are two statements he makes that you may want to consider when doing your index analysis:
Although you have 900 tables in your database, I doubt every one of those tables has to be monitored so closely. In fact I would venture to say that you could list your most highly used tables on a piece of paper. As a database administrator, you should be very aware of your "problematic" or highly used tables. Once you understand your tables you can build them to fulfill your needs.
August 18, 2005 at 11:43 pm
While Dan's comments are quite legitimate, you can use the WITH TABLERESULTS option on the DBCC SHOWCONTIG command to return the results in table format. This makes it easy to store the results in a table and analyze them as you will.
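Putting that together, a sketch of the capture step might look like the following; the temp-table column list follows the TABLERESULTS output documented in SQL Server 2000 Books Online, and the 80% threshold is the one from the original question:

```sql
-- Work table matching the columns returned by
-- DBCC SHOWCONTIG ... WITH TABLERESULTS.
CREATE TABLE #frag (
    ObjectName           sysname,
    ObjectId             int,
    IndexName            sysname NULL,
    IndexId              int,
    [Level]              int,
    Pages                int,
    [Rows]               int,
    MinimumRecordSize    int,
    MaximumRecordSize    int,
    AverageRecordSize    int,
    ForwardedRecords     int,
    Extents              int,
    ExtentSwitches       int,
    AverageFreeBytes     numeric(10,2),
    AveragePageDensity   numeric(10,2),
    ScanDensity          numeric(10,2),
    BestCount            int,
    ActualCount          int,
    LogicalFragmentation numeric(10,2),
    ExtentFragmentation  numeric(10,2)
)

-- TABLERESULTS routes the output to a result set, so INSERT ... EXEC works.
INSERT INTO #frag
EXEC ('DBCC SHOWCONTIG WITH TABLERESULTS, ALL_INDEXES, NO_INFOMSGS')

-- Rebuild candidates: Scan Density below 80%.
SELECT ObjectName, IndexName, ScanDensity
FROM   #frag
WHERE  ScanDensity < 80.0
```

From there the stored procedure can cursor over the result set and issue the rebuilds.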
--------------------
Colt 45 - the original point and click interface
August 19, 2005 at 12:07 am
Thank you Phil, I learned something today