February 16, 2006 at 6:13 pm
I wrote an editorial this week on the Java Guy, a performance specialist who helps people tweak, tune, and understand their Java applications. If you didn't care for the editorial, I still recommend you read the interview linked in it, as it's rather amazing.
One of the items referenced constantly in the interview is DTrace from Sun, which allows detailed analysis of the inner workings of the Solaris OS. That got me thinking (wishing is more like it), so the poll this week is:
What type of Performance Analysis Tool Would You Most Like To See on SQL Server?
We have Profiler, which allows detailed views into the events that occur, but it doesn't always help with troubleshooting. So I'm wondering what tool, real or imaginary, would help you determine what the issues are and what your changes mean.
I'll start it off with one wish. I'd like a tool that gives detailed feedback about the paths the optimizer takes and the reason each is chosen. Knowing why a particular index is chosen in some circumstances and not others would really help explain some of the behavior I see with various queries. In line with that, the ability to save a plan and force it to be used later would be a boon in many cases.
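Something like the second half of that wish is starting to show up in 2005, if I understand it right: you can capture a plan as XML and force it later with the USE PLAN hint. A rough sketch (the table and query are made up, and the captured XML has to be pasted in by hand):

-- Capture the actual plan as XML (SQL Server 2005)
SET STATISTICS XML ON;
SELECT OrderID, OrderDate FROM dbo.Orders WHERE CustomerID = 42;
SET STATISTICS XML OFF;

-- Later, force that same plan by pasting the captured XML into a USE PLAN hint
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 42
OPTION (USE PLAN N'<ShowPlanXML ...> paste the captured plan XML here </ShowPlanXML>');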
Steve Jones
February 17, 2006 at 1:35 am
I would like a tool that predicts the overall benefit to a system from improving a constituent part.
I've read a piece on queueing theory which used the example of the queue for the ladies' loo being much longer than the one for the gents'. Apparently women take twice as long, but the maths behind queueing theory tells us that doubling the number of cubicles will not halve the queue. In fact you have to provide some obscure multiple to gain the same benefit.
This tells me that if I improve a stored procedure it may actually have profound benefits beyond the context of improving the procedure itself.
I would also like a better alert system.
February 17, 2006 at 5:37 am
We've been using Quest Stealth Collector to get aggregate information out of Profiler data on the fly and with history. If there were a way to make this tool more robust AND combine it with detailed Profiler data so that, from the aggregate, we could drill down to specific details (say, a sampling of parameters used in a stored proc so that we could replicate its behavior, just as we can when gathering Profiler data directly), we'd buy it.
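In the meantime, one rough way to build that kind of aggregate by hand is to load a server-side trace file with fn_trace_gettable and group it; something like this (the .trc path is just an example):

SELECT CAST(TextData AS nvarchar(4000)) AS QueryText,
       COUNT(*)      AS Executions,
       SUM(Duration) AS TotalDuration,
       SUM(Reads)    AS TotalReads
FROM ::fn_trace_gettable(N'C:\Traces\MyTrace.trc', DEFAULT)
WHERE EventClass IN (10, 12)  -- RPC:Completed and SQL:BatchCompleted
GROUP BY CAST(TextData AS nvarchar(4000))
ORDER BY TotalDuration DESC;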
I like the idea of having more information on why the query optimizer made certain choices as an option from a query plan. Not only would it help in troubleshooting, but it would be a great educational tool and lead to better database designs. With more information, we can make better choices early in the process as regards PK, FK, constraints & indexes.
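For what it's worth, the 2005 XML plans already expose a sliver of this if you dig into them. The attribute names below are from memory, so check Books Online:

SET STATISTICS XML ON;
SELECT CustomerID, COUNT(*) FROM dbo.Orders GROUP BY CustomerID;  -- dbo.Orders is just a placeholder
SET STATISTICS XML OFF;
-- In the returned plan XML, attributes such as StatementOptmLevel and
-- StatementOptmEarlyAbortReason, plus estimated vs. actual rows per operator,
-- give at least a hint of what the optimizer decided and why it stopped looking.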
That's my 1.5 cents anyway.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 17, 2006 at 6:02 am
From a developer/architect point of view, I think SQL Server 2005 probably has the necessary tools (to some extent) -- they're just too hard to access or toggle on and off at will. It would be great to be able to rapidly turn tracing on and off at any level, from an individual procedure to an entire database, and to focus on even a single connection at will. My coders often simply don't know how to take advantage of the tools that are there.
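To be fair, you can get part of the way there today with a server-side trace scoped to a single SPID, though it is hardly something a coder would toggle at will. A sketch, with event and column IDs quoted from memory (they can be checked against sys.trace_events and sys.trace_columns):

DECLARE @TraceID int, @maxfilesize bigint, @on bit, @spid int;
SET @maxfilesize = 50;
SET @on = 1;
SET @spid = 53;  -- the connection to watch (example value)

-- Create a trace writing to a file (the path is just an example; .trc is appended)
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\OneConnection', @maxfilesize;

-- Capture SQL:BatchCompleted (event 12): TextData (1), SPID (12), Duration (13)
EXEC sp_trace_setevent @TraceID, 12, 1, @on;
EXEC sp_trace_setevent @TraceID, 12, 12, @on;
EXEC sp_trace_setevent @TraceID, 12, 13, @on;

-- Keep only the one SPID (column 12, AND, equals)
EXEC sp_trace_setfilter @TraceID, 12, 0, 0, @spid;

-- Start the trace; set status 0 to stop it and 2 to close it when done
EXEC sp_trace_setstatus @TraceID, 1;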
The biggest drawback of any performance tool I've seen to date is that the very fact of using it often skews the results, but then that may just be my inexperience with the tools talking. I tend to find myself still doing most things for myself, and avoiding the interface as much as possible, because I'm convinced the best performance comes from the deepest understanding of the code. If I'm coding in a text editor (VIEWs, SPs, UDFs, etc.) I tend to be more focused to begin with. And ultimately, well-written code at any solution level is the biggest performance benefit.
I also suspect that most of the folks out there (including myself here) really haven't a deep enough understanding of what the results of the profiler and other built-in tools are telling us. In a complex stored procedure, for example, what are the key indicators of real-world performance? Is the number of transactions more important than the time to execute, and what context is there to judge which results are the ones to focus on improving? Obviously, your expectations of the results have to differ based upon what the procedure does.
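A tiny example of the kind of context I mean, using nothing but the session settings (the procedure name is made up):

SET STATISTICS IO ON;
SET STATISTICS TIME ON;
EXEC dbo.usp_GetCustomerOrders @CustomerID = 42;  -- placeholder procedure
SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
-- The Messages tab reports logical/physical reads per table and CPU vs. elapsed time,
-- but a high-read, short-duration run and a low-read, long-duration (blocked) run
-- point to completely different fixes, and nothing in the output says which applies.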
What I would love to see is an article that talked about how to optimize the use of the tools that are there, and helped build a baseline to understand the results in context of different operations. It would help educate those of us who use the product constantly, but just don't know how to make it sing like it can.
February 17, 2006 at 6:21 am
I don't think it is too much to ask for a performance tool that would measure SQL Server's efficiency and give specific details on just what is causing problems and how to fix it, starting with the quickest fixes to implement.
February 17, 2006 at 7:21 am
I would like a tool that would allow me to take a look at the whole database's performance without being a major drag on system resources.
Then I would like to be able to pick out the slow items, tweak indexes, sp's and the like, and run the tool again. If I don't like the results, it would revert to the state before the changes.
Finally, I would like a tool that had a great tutorial that even I could understand.
Is that asking too much???
February 17, 2006 at 8:14 am
Due to recent issues we have been having, I would like a tool that could give us detailed information about what is residing in our MemToLeave area.
Having this information would allow us to clean up the process or processes that are depleting our memory and forcing us to reboot.
Obviously the tool could not be a diagnostic that takes up substantial resources on the system while running.
More efficient processes mean a more efficient SQL Server.
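It isn't the detail we're after, but in the meantime DBCC MEMORYSTATUS and, on 2005, the memory clerk DMV at least hint at what is eating the non-buffer-pool area:

-- Overall memory snapshot, including allocations outside the buffer pool
DBCC MEMORYSTATUS;

-- SQL Server 2005: multi-page allocations (the MemToLeave region on 32-bit) by clerk
SELECT type, SUM(multi_pages_kb) AS multi_page_kb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY multi_page_kb DESC;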
February 17, 2006 at 11:46 am
Holy Cats! We just ran into MemToLeave issues too. I had no idea what a limit memory management placed on the servers until we hit that problem.
I second the need to monitor this apparently scarce resource.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 17, 2006 at 11:49 am
We use InDepth from Precise...I mean Veritas... I mean Symantec. The historical information that this provides is very useful in helping to resolve issues that surface over time. It has its downfalls, as they all do, but overall it is a really good product.
As for saving explain plans and reusing them, Oracle has this. I thought it was available in 2005, but maybe I was dreaming.
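Thinking about it some more, I believe 2005 does have plan guides, which let you attach hints (including USE PLAN) to a statement you can't edit. A sketch from memory, with made-up statement text, for a query submitted through sp_executesql:

EXEC sp_create_plan_guide
    @name = N'Guide_CustomerOrders',
    @stmt = N'SELECT OrderID FROM dbo.Orders WHERE CustomerID = @cust',
    @type = N'SQL',
    @module_or_batch = NULL,
    @params = N'@cust int',
    @hints = N'OPTION (OPTIMIZE FOR (@cust = 42))';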
David
@SQLTentmaker
“He is no fool who gives what he cannot keep to gain that which he cannot lose” - Jim Elliot
February 17, 2006 at 12:13 pm
I have played with this before at a client's site and thought it was a pretty good tool to help track down overall performance issues. Unfortunately, I don't think they support 64-bit other than Itanium. We have 64-bit Xeon processors in all our servers. To date, I haven't found a performance monitoring tool like InDepth that provides support for that platform. Anyone else seen one?
Brandon
February 17, 2006 at 3:09 pm
Historical execution statistics for stored procedures - number of times executed and execution time. Access statistics for tables and views.
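On 2005 the DMVs get part of the way there; something along these lines (column names from memory, so verify them before relying on the numbers):

-- Cumulative execution counts and times per cached statement
SELECT TOP 20
       qs.execution_count,
       qs.total_worker_time  AS total_cpu,
       qs.total_elapsed_time AS total_elapsed,
       st.text               AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;

-- How often each index (and so each table or indexed view) has been read or written
SELECT OBJECT_NAME(object_id) AS object_name,
       index_id, user_seeks, user_scans, user_lookups, user_updates
FROM sys.dm_db_index_usage_stats
WHERE database_id = DB_ID();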
February 20, 2006 at 6:35 am
I am still generally surprised at the lack of built-in tools available. To a large extent, the whole topic of performance monitoring seems fairly nebulous, and whenever I've tried to learn more about it in SQL Server 2000, it always seems that, when it comes down to it, a third-party tool is required to do it somewhat efficiently. That is partly why you're in business, right?
February 20, 2006 at 6:41 am
Take a look at the Stealth Collector from Quest. It basically uses Profiler data to create the exact kind of aggregates & drill down you're looking for.
It would be nice to get it directly from SQL Server or Perfmon or something rather than having to build or buy it.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 22, 2006 at 8:21 am
The Dynamic Management Views and Functions are great.
I'd like an interface listing these functions (multi-selectable) in one window, the active processes relevant to the selected function in another window, and a results pane.
The action would be to drag an active process into the results pane.
The gains: pre-selected functions would help with real-time problem solving, and it would be really cool to play around with at lunch time.
It would be fabulous to include the Activity Monitor as well as a graphical lock chain. All windows should be dockable, and the application should be independent of Management Studio, as sometimes you just need the RAM and have to kill Management Studio (EM is what I'm thinking about here).
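Until someone builds that interface, the raw material is at least queryable; roughly like this (DMV and column names as I remember them):

-- Active requests, their statements, and who is blocking whom (SQL Server 2005)
SELECT r.session_id,
       r.blocking_session_id,
       r.status,
       r.wait_type,
       r.wait_time,
       st.text AS statement_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS st
WHERE r.session_id > 50  -- skip system sessions
ORDER BY r.blocking_session_id, r.session_id;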
Max