May 2, 2006 at 11:52 am
I have a lot of experience with OLTP data models and index tuning, but not much with data warehouses. I am looking at an existing database that seems to be having some performance problems, and in my analysis I found that the overall database size is 146 GB: 26 GB of data and 120 GB of indexes. To me that sounds like a lot. Anyone else think so? And if so, what would be some recommendations to clean these up? I have already run some scripts that find duplicate indexes as well as some overlapping ones, but the developers seem to think there will be a massive performance implosion if any of the indexes are removed.
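Something along these lines can break the 146 GB down per index so the largest offenders stand out. This is only a sketch, assuming SQL Server 2005 catalog views; the join on sys.allocation_units is simplified and ignores LOB allocations:

[code]
-- Rough per-index space breakdown (SQL Server 2005+ catalog views).
-- Simplified: joins only the in-row/row-overflow allocation units.
SELECT
    OBJECT_NAME(i.object_id)      AS table_name,
    i.name                        AS index_name,
    i.type_desc                   AS index_type,
    SUM(a.used_pages) * 8 / 1024  AS used_mb
FROM sys.indexes AS i
JOIN sys.partitions AS p
    ON p.object_id = i.object_id
   AND p.index_id  = i.index_id
JOIN sys.allocation_units AS a
    ON a.container_id = p.partition_id
GROUP BY i.object_id, i.name, i.type_desc
ORDER BY used_mb DESC;
[/code]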
May 2, 2006 at 11:59 am
Look at the query plans. Run some index tuning traces.
That's truly the only way to know what is causing the performance hit.
Have a look at this: http://blogs.msdn.com/sqlcat/archive/2006/02/13/531339.aspx
Duplicate indexes can cause some issues due to maintenance overhead. They can also cause the optimizer to choose the "wrong" index at times.
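As a rough starting point (assuming SQL Server 2005), sys.dm_db_index_usage_stats can show which nonclustered indexes have seen little or no read activity. A sketch along these lines; note the DMV resets on restart, so treat the output as a hint, not proof that an index is safe to drop:

[code]
-- Nonclustered indexes with the least read activity since the last restart.
-- user_updates shows the maintenance cost you pay even when reads are zero.
SELECT
    OBJECT_NAME(i.object_id) AS table_name,
    i.name                   AS index_name,
    us.user_seeks, us.user_scans, us.user_lookups, us.user_updates
FROM sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS us
    ON us.object_id   = i.object_id
   AND us.index_id    = i.index_id
   AND us.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
  AND i.type_desc = 'NONCLUSTERED'
ORDER BY ISNULL(us.user_seeks, 0) + ISNULL(us.user_scans, 0) + ISNULL(us.user_lookups, 0);
[/code]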
May 3, 2006 at 3:56 am
You may have indexed views? Covering or overlapping indexes may be present - it's not always possible to have one index cover everything in a table, and for performance reasons there may be any number of apparently redundant indexes. As Pam says, you'll really have to isolate the queries that are causing problems and seek to tune them. It's probably a silly question, but are you updating stats/rebuilding indexes in the DW after loads and so on? Failure to perform housekeeping can steadily degrade performance. See the sketch below.
I still sort of agree that that ratio of data to indexes does seem a little strange - unless you're mixing clustered indexes into your figures.
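If that housekeeping isn't in place yet, a minimal post-load sketch might look like this (SQL Server 2005 syntax; dbo.FactSales is just a placeholder table name):

[code]
-- Full offline rebuild of all indexes on the table
-- (ONLINE = ON requires Enterprise Edition).
ALTER INDEX ALL ON dbo.FactSales REBUILD;

-- Or, lighter-weight maintenance: reorganize, then refresh statistics
-- with a full scan so the optimizer sees the post-load distribution.
ALTER INDEX ALL ON dbo.FactSales REORGANIZE;
UPDATE STATISTICS dbo.FactSales WITH FULLSCAN;
[/code]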
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/