July 20, 2011 at 1:35 pm
Hi everyone, I was looking into different tools to optimize some of our high-performance servers and was trying to find out if anyone had implemented a way to apply a Pareto analysis (also known as the 80/20 rule) against a database. In a nutshell: identify which tables/processes account for 70% or 80% of my workload/resources. I know the principles of how to apply it, but my question is what to measure.
The easiest way is to rank tables by number of rows or by size, and find out at which point (and after how many tables) I reach a cumulative 75%, for example. Another option is to add up the user reads and writes (and scans?) per table, rank them from largest to smallest, and find out how many tables it takes to hit a cumulative 75% of my resources.
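For the size version, something like this rough sketch is what I have in mind (SQL 2008 doesn't have a running-total window function, hence the correlated subquery; tied sizes land in the same cumulative bucket):

;WITH TableSize AS (
    -- reserved pages per user table; needs VIEW DATABASE STATE permission
    SELECT  t.name AS TableName,
            SUM(ps.reserved_page_count) AS ReservedPages
    FROM    sys.dm_db_partition_stats AS ps
    JOIN    sys.tables AS t ON t.object_id = ps.object_id
    GROUP BY t.name
)
SELECT  a.TableName,
        a.ReservedPages,
        -- running share of the grand total, largest tables first
        100.0 * (SELECT SUM(b.ReservedPages)
                 FROM TableSize AS b
                 WHERE b.ReservedPages >= a.ReservedPages)
              / (SELECT SUM(ReservedPages) FROM TableSize) AS CumulativePct
FROM    TableSize AS a
ORDER BY a.ReservedPages DESC;

Reading down the CumulativePct column, wherever it crosses 75% is the Pareto cutoff.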
Being on SQL 2008, there are obviously LOTS of metrics/DMVs available, so I'm trying to simplify it: get the most-used tables and most-called procedures optimized to hell first, then work my way down to the less critical objects.
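For the procedure side, a similar rough cut against sys.dm_exec_procedure_stats is what I'm picturing (it's new in 2008 and only covers plans still in cache, so it's a point-in-time view):

;WITH ProcCpu AS (
    SELECT  OBJECT_NAME(object_id) AS ProcName,
            SUM(total_worker_time) AS TotalCpu   -- CPU time in microseconds
    FROM    sys.dm_exec_procedure_stats
    WHERE   database_id = DB_ID()
    GROUP BY object_id
)
SELECT  a.ProcName,
        a.TotalCpu,
        100.0 * (SELECT SUM(b.TotalCpu)
                 FROM ProcCpu AS b
                 WHERE b.TotalCpu >= a.TotalCpu)
              / (SELECT SUM(TotalCpu) FROM ProcCpu) AS CumulativePct
FROM    ProcCpu AS a
ORDER BY a.TotalCpu DESC;

Swapping total_worker_time for total_logical_reads would give the same ranking by I/O instead of CPU.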
Thanks!
Gaby
________________________________________________________________
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
July 20, 2011 at 1:44 pm
This is what I do; it applies the same base principles:
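Roughly this kind of thing (a quick sketch, ranking tables by user reads + writes out of sys.dm_db_index_usage_stats; note those counters reset at every service restart, so make sure the instance has been up for a representative stretch):

;WITH TableActivity AS (
    -- tables with zero activity since the last restart won't appear at all,
    -- which is itself useful information
    SELECT  OBJECT_NAME(us.object_id) AS TableName,
            SUM(us.user_seeks + us.user_scans + us.user_lookups) AS Reads,
            SUM(us.user_updates) AS Writes
    FROM    sys.dm_db_index_usage_stats AS us
    WHERE   us.database_id = DB_ID()
    GROUP BY us.object_id
)
SELECT  a.TableName, a.Reads, a.Writes,
        100.0 * (SELECT SUM(b.Reads + b.Writes)
                 FROM TableActivity AS b
                 WHERE b.Reads + b.Writes >= a.Reads + a.Writes)
              / (SELECT SUM(Reads + Writes) FROM TableActivity) AS CumulativePct
FROM    TableActivity AS a
ORDER BY a.Reads + a.Writes DESC;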
July 21, 2011 at 8:12 am
Thanks very much! Quite useful. 🙂
Gaby
________________________________________________________________
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein