September 3, 2013 at 8:21 am
I took a database and used SQL Profiler to trace activity while a program executed a complete cycle through the application, just as a user would (it's like one BIG macro that mimics a user). I fed that trace into Database Tuning Advisor, and the results said, "if I add these indexes and statistics to the DB, then I will get 'X' percentage of improvement." So I did. I reran the above scenario and saw an improvement, but nowhere near what Database Tuning Advisor said I would. I ran the second test through the DTA and it promised "Y" percentage of improvement if I added some more indexes and statistics. So I did. On the third test, the numbers looked terrible: durations 3, 4, 5 times higher, reads 20% higher on some tables, and CPU 3, 4, 5 times higher than in any previous test. However, this is where it gets REALLY strange. The same computer-controlled program that had run every earlier test in about 5 hours completed the exact same work in 3 hours and 30 minutes with all the new indexes and stats.
So here are my questions:
1. How can adding indexes and statistics make these numbers "appear" to be worse, when DTA suggested the queries in the test needed these changes?
2. Should I even care about #1? (Am I looking at an absolute sign of worsening, or is there a "good reason" for increased reads — like I gave the DB a better way to read data, so it reads more of it, faster?)
3. How can durations 3, 4, 5 times longer than any other test produce a faster overall completion time?
(I understand #3 is a loaded question, since you are not aware of the variables within the network.)
4. What else can I do to evaluate these numbers and get a REAL evaluation of the results?
September 3, 2013 at 8:45 am
There isn't an "everything runs faster" set of changes.
When you adjust indexes, you potentially change execution plans as well. That means some queries may run faster, while others may pick poor indexes and run slower. Assuming statistics are up to date, you should be getting good plans, but with parameter sniffing and skewed data distributions there's no blanket way of doing this for all your queries and indexes at once.
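One quick way to test whether parameter sniffing is part of the picture is to force a fresh plan per execution and compare. This is only a sketch: the procedure, table, and column names are invented for illustration.

```sql
-- Hypothetical procedure against a skewed column: 'Shipped' might match
-- millions of rows while 'Cancelled' matches a handful. The plan cached
-- for the first call gets reused for the second, even if it fits badly.
CREATE PROCEDURE dbo.GetOrdersByStatus
    @Status varchar(20)
AS
BEGIN
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders                 -- illustrative table
    WHERE OrderStatus = @Status
    OPTION (RECOMPILE);             -- build a plan for THIS parameter value
END;
GO

-- If the query runs consistently with OPTION (RECOMPILE) but erratically
-- without it, sniffing is likely involved. RECOMPILE trades plan-cache
-- reuse for per-call compile CPU, so it is a diagnostic first, a fix second.
EXEC dbo.GetOrdersByStatus @Status = 'Shipped';
```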
The DTA should be taken with a grain of salt. It doesn't always pick the most efficient set of indexes, though it does a reasonable job. The DBA still needs to weigh the value of adding another index, making it an INCLUDE or filtered index instead, or ignoring the recommendation entirely. Don't forget that every index you add impacts your insert/update/delete operations, and can slow them down.
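You can see that read-versus-write trade-off directly in the index usage DMV. A rough sketch (counters reset on instance restart, so judge them over a representative workload period):

```sql
-- Compare how often each index in the current database is read
-- (seeks + scans + lookups) versus maintained on writes (user_updates).
-- Indexes that are heavily written but rarely read are candidates to drop,
-- including ones the DTA suggested.
SELECT  OBJECT_NAME(s.object_id)                         AS table_name,
        i.name                                           AS index_name,
        s.user_seeks + s.user_scans + s.user_lookups     AS reads,
        s.user_updates                                   AS writes
FROM    sys.dm_db_index_usage_stats AS s
JOIN    sys.indexes AS i
        ON  i.object_id = s.object_id
        AND i.index_id  = s.index_id
WHERE   s.database_id = DB_ID()
ORDER BY writes DESC;
```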
The way to do this is to make changes slowly and test. You don't care what x% the DTA says; it's a guess, not necessarily anything else.
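For that testing, measuring one query at a time yourself beats trusting DTA's projected percentage. A minimal harness, assuming you run it on a dev/test server (the buffer flush is too disruptive for production):

```sql
-- Capture per-table logical reads and CPU/elapsed time for a single query,
-- before and after an index change, and compare the two runs.
SET STATISTICS IO ON;      -- reports logical/physical reads per table
SET STATISTICS TIME ON;    -- reports parse/compile and execution times

DBCC DROPCLEANBUFFERS;     -- optional: cold-cache run; test servers only

-- ... the query under test goes here ...

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```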
September 3, 2013 at 9:45 am
Thank you for the advice. I made some other discoveries in the database that also point to performance problems (foreign keys and CHECK constraints marked not trusted, etc.).
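For anyone following along, untrusted constraints can be found and revalidated like this; the constraint and table names in the ALTER are placeholders:

```sql
-- Untrusted constraints (often left behind by bulk loads or by re-enabling
-- a constraint WITH NOCHECK) can't be used by the optimizer to skip work.
SELECT name FROM sys.foreign_keys      WHERE is_not_trusted = 1;
SELECT name FROM sys.check_constraints WHERE is_not_trusted = 1;

-- Revalidate one. The doubled CHECK is intentional: WITH CHECK re-verifies
-- the existing rows, CHECK CONSTRAINT re-enables the constraint.
-- ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT FK_Orders_Customers;
```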
I have some work to do!!!
September 3, 2013 at 9:59 am
You are welcome, but the important thing to note is that this is an ongoing, evolutionary process. It's not set-it-and-forget-it, and it's not a single session.
Go slow, change things, and test regularly.