December 10, 2007 at 9:40 am
All,
I have two questions...
Is there any reason why, given two 'identical' databases in terms of design, schema structure and indexes etc. (one on SQL 2005, the other on 2000, although they are not on identical hardware), some queries that perform relatively well on the 2000 version have major performance issues in 2005? All indexes exist on 2005 and statistics have been updated etc.
Which leads to my next question...
I have run DTA against the 2005 database, and it has made recommendations that it estimates would improve performance by 63%, which sounds good to begin with. BUT how reliable is this tool? Does it make accurate recommendations? Basically, how should I treat the results: embrace them all, or treat them cautiously?
Your advice please
Gethyn Ellis
www.gethynellis.com
December 10, 2007 at 10:14 am
The tool makes its recommendations in isolation from everything else going on. It only tells you what it would use, if it existed, for the particular statement(s) you selected.
I usually review each recommendation individually (when it comes to indexes).
Obviously, if it recommends 17 new indexes for a particular table and that table has a very high transaction rate, you will speed up your selects but hurt your inserts/updates/deletes.
So it is a good tool for recommendations, but the ultimate decision on whether those recommendations are good for the system as a whole has to be made by you.
December 10, 2007 at 10:16 am
Also, as far as the query differences go: the SQL 2000 optimizer is very different from SQL 2005's.
SQL 2005 usually picks a better execution plan.
However, SQL 2000 processed table variables much faster in certain situations (large amounts of data).
Check the execution plans between the two to see if there are differences in table scans / seeks etc.
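A minimal sketch for grabbing the text plans on both servers, assuming you swap in one of your own slow statements for the placeholder query below:
[code]
-- Run this on both the SQL 2000 and SQL 2005 servers and diff the output.
-- SET SHOWPLAN_TEXT returns the estimated plan without executing the query.
SET SHOWPLAN_TEXT ON;
GO

-- Placeholder query: substitute one of the statements that slowed down.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 12345;
GO

SET SHOWPLAN_TEXT OFF;
GO
[/code]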
December 10, 2007 at 10:39 am
The DTA is not bad, but it depends on the workload you run through it. Like any tool, it's examining the queries and making guesses. You want to be sure that the indexes recommended make sense for the queries (cost and volume of each being run) on your system.
I'd start by perhaps adding one or two of the recommendations that seem to best fit, according to what you know about the system. Run some of your queries against the tables after a change and see if they run faster. Be sure that you've captured a baseline first so you know what your queries currently cost.
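A minimal baseline sketch, assuming you just want logical reads and timings per query (the query below is a placeholder, not anything from your system):
[code]
-- Capture logical reads and CPU/elapsed time before and after each index change.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Placeholder for one of the queries you're tuning.
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 12345;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
[/code]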
I tend to agree with what Joel has written above.
December 10, 2007 at 10:43 am
The DTA is a tool like any other - you get to understand what a tool does by using it. If it suggests index xx will aid performance by 63%, add the index and see if it really does! This is the only way you'll learn. Yes, queries do react differently between versions; in some cases whole applications go bad when ported to SQL 2005 (even more ironic when it's an MS app!!! but I'll not mention names here).
This topic was covered way back in 2004 - the optimiser rewrite means some queries need re-optimising and some query approaches just don't work. Re-use of plans is a particular issue, and using plan guides to force recompilation is an oft-used workaround.
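Roughly, that workaround looks like this on 2005; the guide name and statement text below are made up for illustration, and the @stmt text has to match the submitted batch character for character:
[code]
-- Attach an OPTION (RECOMPILE) hint to a problem statement without
-- touching the application code. Names and statement text are illustrative.
EXEC sp_create_plan_guide
    @name            = N'ForceRecompile_OrderLookup',
    @stmt            = N'SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = 12345;',
    @type            = N'SQL',
    @module_or_batch = NULL,   -- NULL: the batch text defaults to @stmt
    @params          = NULL,
    @hints           = N'OPTION (RECOMPILE)';
[/code]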
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
December 10, 2007 at 10:47 am
Just to toss in a contrarian view point, I've never had DTA recommend a single index that was worth the trouble it took to run a series of scripts through the tool. In fact, one time, it recommended a set of indexes that were clearly going to be a problem. Unfortunately we were on a call with MS at the time, so we had to accept the scripts suggested, implement them, and then document the performance problems that they caused before we could rip them back out.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 10, 2007 at 11:22 am
You pays your money and takes your chances! I rarely use the DTA, although I'll sometimes run a horrendous query through it: lots of tables, derived tables, joins, sub-selects etc. When it doesn't crash it can be useful. I usually use the query plan and work from there - I find I can better the DTA around 50% of the time (for complex queries, not simple ones). You can also use the missing index DMVs, btw.
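If you've not looked at them yet, something like this is a starting point; the cost-times-seeks ranking is just one rough way to sort the suggestions, not gospel:
[code]
-- SQL 2005 missing index DMVs: suggestions the optimizer recorded at compile time.
-- The counters reset on instance restart, so check uptime before trusting them.
SELECT TOP 20
    d.statement AS table_name,
    d.equality_columns,
    d.inequality_columns,
    d.included_columns,
    s.user_seeks,
    s.avg_total_user_cost * s.avg_user_impact * s.user_seeks AS rough_benefit
FROM sys.dm_db_missing_index_details AS d
JOIN sys.dm_db_missing_index_groups AS g
    ON g.index_handle = d.index_handle
JOIN sys.dm_db_missing_index_group_stats AS s
    ON s.group_handle = g.index_group_handle
WHERE d.database_id = DB_ID()
ORDER BY rough_benefit DESC;
[/code]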
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
December 10, 2007 at 12:04 pm
If you know little to nothing about indexing, the DTA may be a good place to start. It's better than the Index Tuning Wizard was (but that's not saying much).
Personally, I prefer to optimise by hand. If anything goes wrong, I only have myself to blame.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 10, 2007 at 12:34 pm
I love the missing index DMV's. They're not a panacea either, but they sure can help you spot something simple quickly.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 10, 2007 at 12:58 pm
Agreed. They, along with the index usage stats, are among the first things I check when tuning.
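Something along these lines, for instance (a rough sketch - adjust the filters to taste):
[code]
-- Which indexes in the current database are actually being read, and which
-- are only being maintained? These counters also clear on instance restart.
SELECT
    OBJECT_NAME(i.object_id) AS table_name,
    i.name                   AS index_name,
    u.user_seeks,
    u.user_scans,
    u.user_lookups,
    u.user_updates
FROM sys.indexes AS i
LEFT JOIN sys.dm_db_index_usage_stats AS u
       ON u.object_id   = i.object_id
      AND u.index_id    = i.index_id
      AND u.database_id = DB_ID()
WHERE OBJECTPROPERTY(i.object_id, 'IsUserTable') = 1
ORDER BY u.user_updates DESC;
[/code]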
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 10, 2007 at 4:30 pm
agreed, agreed !!
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
December 16, 2007 at 11:09 pm
Well, good night to everyone; this is going to be my inaugural post on this site.
First of all, I totally agree that the DTA has seldom helped me in the past. Most of the time the applications I have installed suffer a slowdown, especially if the tables handle data-intensive operations like inserts and updates.
Normally those tools will try to guess which indexes will help, but like everything else, indexes take resources, and sometimes we can't afford them.
Now, to address what you were asking about SQL2k and SQL2005.
It took them that long to get the new version out because they rewrote a lot of code. It's not only that you now have SQL Studio Express, but also how tables, indexes and queries are handled.
If you are experiencing a difference in performance, you can set the database to emulate SQL2k when you install it. That should take care of the problem for starters, but then you should evaluate which queries are hurting performance and perhaps try to re-write them.
Especially now that SQL 2008 is going to be released, because the gap will widen even more.
Remember that SQL 2000 had no encryption, while SQL 2005 has encryption keys built into it, so it sometimes has to encrypt and decrypt; that is allegedly covered by the increase in RAM, even though we are still stuck at 4 CPUs.
In short, the DTA is probably not going to help you see what is going on, but with a good baseline and a review of the queries you are using, you might be able to fine-tune your application.
Sven
December 17, 2007 at 1:28 am
sven.blackwood (12/16/2007)
If you are experiencing a difference in performance, you can set the database to emulate SQL2k when you install it. That should take care of the problem for starters, but then you should evaluate which queries are hurting performance and perhaps try to re-write them.
If you're talking about the compatibility mode, setting that to 80 does not change the optimiser to SQL 2000 behaviour. It just prevents the new features from working.
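For reference, this is all the setting amounts to on a 2005 instance (the database name below is a placeholder):
[code]
-- Check the current compatibility level (80 = SQL 2000, 90 = SQL 2005).
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'YourDatabase';   -- placeholder name

-- Change it on a SQL 2005 instance. This gates newer T-SQL behaviour;
-- it does not bring back the SQL 2000 query optimiser.
EXEC sp_dbcmptlevel @dbname = N'YourDatabase', @new_cmptlevel = 80;
[/code]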
Remember that SQL 2000 had no encryption, while SQL 2005 has encryption keys built into it, so it sometimes has to encrypt and decrypt
Not sure what you mean here. 2005 does have encryption capabilities, but they aren't used automatically. If you want encryption, you have to specify it yourself.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 17, 2007 at 10:37 am
I agree, but in my experience I have seen a rise in performance just by changing the compatibility level.
I also agree that encryption/decryption is optional, but the original poster didn't mention whether it was being used.
I for one, the moment I got my first Windows Vista Ultimate, turned on the BitLocker function, among other things, to see how it worked in case a customer needed help... who knows, maybe the original poster also enabled encryption and didn't think twice about it.
I know that when I turned on the Windows 98 DoubleSpace disk thing, I never thought about it again, even when my performance was being seriously crippled. Yes, I removed that feature and performance tripled, and yes, I know Windows 98 was probably around when some of these posters were watching Sesame Street. All I'm saying is that sometimes the solution is hiding right under your nose.
Sven
December 18, 2007 at 8:28 am
I'm with Grant and Sven. The DTA is a three-legged dog, especially in the hands of the untrained. And the trained can do much better without it. You couldn't imagine the things I have seen people do with it and its recommendations!!
I will repeat my oft-mentioned mantra that performance tuning is something you should be mentored into. That is far and away the most effective way to become good at the subject. And your company will reap immediate and lasting benefits of the professional's work as well.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service