August 28, 2015 at 10:41 am
Comments posted to this topic are about the item SQL Performance Elevenses
Best wishes,
Phil Factor
August 28, 2015 at 5:33 pm
Very nice list, Phil. I wish I could get people to understand #11. It would make my life so much easier. I think I'm going to use your list, so thank you for a teaching tool that will help people focus.
August 28, 2015 at 5:42 pm
Nice List, thanks.
I would include Collation in #11 - mixed collations can be very bad and hard to diagnose for performance.
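Agreed. A quick way to spot candidates for that sort of mismatch is to compare column collations against the database default; a rough sketch (just a report to eyeball, nothing clever):

-- List character columns whose collation differs from the database default;
-- joins across mixed collations can force implicit conversions and kill index seeks.
SELECT  OBJECT_SCHEMA_NAME(c.object_id) AS schema_name,
        OBJECT_NAME(c.object_id)        AS table_name,
        c.name                          AS column_name,
        c.collation_name
FROM    sys.columns AS c
JOIN    sys.tables  AS t ON t.object_id = c.object_id
WHERE   c.collation_name IS NOT NULL
  AND   c.collation_name <> CONVERT(nvarchar(128), DATABASEPROPERTYEX(DB_NAME(), 'Collation'));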
MM
August 28, 2015 at 7:03 pm
They're all excellent but I like #11 the best, as well. People who believe that "Code First" efforts work well really need to study up on Aesop's fable of "The Hare and the Tortoise".
--Jeff Moden
Change is inevitable... Change for the better is not.
August 29, 2015 at 5:47 am
A couple of other thoughts:
run a recent month's set of Glenn Berry's SQL Server Diagnostic Information Queries for the version of SQL Server you are on (SQL Server 2005 - 2016)
http://www.sqlskills.com/blogs/glenn/category/dmv-queries/
develop a baseline and capture baseline metrics for your system
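For the baseline, even something as crude as snapshotting the cumulative wait stats on a schedule gives you numbers to difference later. A minimal sketch, assuming you are happy with a permanent table (dbo.BaselineWaitStats is just an illustrative name):

-- Snapshot cumulative wait stats so two captures can be compared later.
IF OBJECT_ID('dbo.BaselineWaitStats') IS NULL
    CREATE TABLE dbo.BaselineWaitStats
    (
        capture_time        datetime2(0) NOT NULL DEFAULT SYSUTCDATETIME(),
        wait_type           nvarchar(60) NOT NULL,
        waiting_tasks_count bigint       NOT NULL,
        wait_time_ms        bigint       NOT NULL,
        signal_wait_time_ms bigint       NOT NULL
    );

INSERT dbo.BaselineWaitStats (wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms)
SELECT wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
FROM   sys.dm_os_wait_stats
WHERE  wait_time_ms > 0;  -- ignore waits that have never occurred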
August 31, 2015 at 12:56 am
Number one should be '...affecting..' not '..effecting..'
August 31, 2015 at 4:47 am
It works out the best plan, based on what you want, the distribution and size of the data, and the indexes it can use for your query.
My possibly flawed understanding is slightly different. I thought that the optimizer created a "reasonable plan in a reasonable amount of time". I was taught that it simply doesn't have the resources to examine all the possible permutations and come up with the absolute guaranteed best plan. If that were its aim, finding the right plan would often take longer than actually running the query. Would it be fair to say, too, that we can make life miserable for the optimizer by throwing too many elements into a single query?
...One of the symptoms of an approaching nervous breakdown is the belief that one's work is terribly important.... Bertrand Russell
September 6, 2015 at 8:56 am
The Query Optimizer is good at working out the best way of getting what you want, far better than you. It works out the best plan, based on what you want, the distribution and size of the data, and the indexes it can use for your query.
Nice article, Phil. I have to disagree with the above, however. The optimiser is not designed to produce the best plan. It is designed to produce a good enough plan quickly. To pick the best plan the optimiser would need to evaluate every possible plan, including bushy trees rather than just left-deep trees, which for complex queries is all but impossible. I will grant you that it is a damn sight better at it than me, but working together is the best way to go.
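If you want to see that "good enough, quickly" behaviour for yourself, the plan XML records when the optimiser stopped early. Something along these lines will list cached plans whose optimisation timed out (it scans the plan cache, so don't run it on a busy box):

-- Find cached plans where the optimiser hit its internal time budget.
SELECT TOP (20)
       qs.execution_count,
       qs.total_worker_time / qs.execution_count AS avg_cpu_microseconds,
       qp.query_plan
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE  CAST(qp.query_plan AS nvarchar(max)) LIKE N'%StatementOptmEarlyAbortReason="TimeOut"%'
ORDER BY qs.total_worker_time DESC;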
September 6, 2015 at 10:59 am
Keithyv (8/31/2015)
Number one should be '...affecting..' not '..effecting..'
The above should have '...effecting...' not '..effecting..'. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.
February 16, 2016 at 2:06 am
Always run regular automated performance tests
This is the one I find the hardest sell in development. It takes both time and resources, and even though it can highlight killer performance issues early, it too often remains an afterthought, run alongside penetration testing once development is complete.
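One low-friction way to start is a scheduled job that times the critical procedures against a known data set and logs the results, so regressions show up as a trend rather than as a production incident. A rough sketch (dbo.PerfTestLog and dbo.uspCriticalReport are made-up names):

-- Minimal automated timing test: run the procedure under test and log its duration.
IF OBJECT_ID('dbo.PerfTestLog') IS NULL
    CREATE TABLE dbo.PerfTestLog
    (
        run_time    datetime2(0) NOT NULL DEFAULT SYSUTCDATETIME(),
        test_name   sysname      NOT NULL,
        duration_ms int          NOT NULL
    );

DECLARE @start datetime2(3) = SYSUTCDATETIME();

EXEC dbo.uspCriticalReport;  -- the procedure under test

INSERT dbo.PerfTestLog (test_name, duration_ms)
VALUES (N'uspCriticalReport', DATEDIFF(MILLISECOND, @start, SYSUTCDATETIME()));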
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
February 16, 2016 at 8:47 pm
Gary Varga (2/16/2016)
Always run regular automated performance tests
This is the one I find the hardest sell in development. It takes both time and resources, and even though it can highlight killer performance issues early, it too often remains an afterthought, run alongside penetration testing once development is complete.
I've found that if you pay attention to performance, resource usage, and scalability during unit testing in development, you don't actually have to run any "automated performance tests". It's the ol' "Mind the pennies and the dollars will take care of themselves" saw. 🙂
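For what it's worth, the habit during unit testing can be as simple as wrapping the call under test in the I/O and timing counters and eyeballing the logical reads (the procedure name below is illustrative):

-- Report logical reads and CPU/elapsed time for the call under test;
-- the output appears on the Messages tab in SSMS.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

EXEC dbo.uspCriticalReport;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;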
--Jeff Moden
Change is inevitable... Change for the better is not.
February 17, 2016 at 1:56 am
Jeff Moden (2/16/2016)
Gary Varga (2/16/2016)
Always run regular automated performance tests
This is the one I find the hardest sell in development. It takes both time and resources, and even though it can highlight killer performance issues early, it too often remains an afterthought, run alongside penetration testing once development is complete.
I've found that if you pay attention to performance, resource usage, and scalability during unit testing in development, you don't actually have to run any "automated performance tests". It's the ol' "Mind the pennies and the dollars will take care of themselves" saw. 🙂
I agree, but only to a point. Automated regression tests cater for changes that occur later in the development cycle. For example, if you develop and unit test a stored procedure and it performs well enough, that is great, but only at that moment in time: the very next day I can remove an index that has no detrimental effect on what I am doing but kills the performance of your stored procedure. In this scenario you did the right thing, and I did not necessarily do the wrong thing, but I didn't test all the possible knock-on effects. What if the table I took the index from had 94 stored procedures using it? Do I need to performance test them all? Do I have to understand what the characteristics of poor performance are for each of them?
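One way to size the blast radius before dropping an index is to search the plan cache for plans that currently reference it; a sketch along these lines (IX_Orders_CustomerID is a made-up name, and this only shows plans that happen to be cached right now):

-- Which cached plans mention the index we are thinking of dropping?
SELECT DISTINCT
       OBJECT_NAME(qp.objectid, qp.dbid) AS referencing_object,
       cp.usecounts
FROM   sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
WHERE  CAST(qp.query_plan AS nvarchar(max)) LIKE N'%IX_Orders_CustomerID%';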
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
February 17, 2016 at 6:37 am
Gary Varga (2/17/2016)
Jeff Moden (2/16/2016)
Gary Varga (2/16/2016)
Always run regular automated performance tests
This is the one I find the hardest sell in development. It takes both time and resources, and even though it can highlight killer performance issues early, it too often remains an afterthought, run alongside penetration testing once development is complete.
I've found that if you pay attention to performance, resource usage, and scalability during unit testing in development, you don't actually have to run any "automated performance tests". It's the ol' "Mind the pennies and the dollars will take care of themselves" saw. 🙂
I agree, but only to a point. Automated regression tests cater for changes that occur later in the development cycle. For example, if you develop and unit test a stored procedure and it performs well enough, that is great, but only at that moment in time: the very next day I can remove an index that has no detrimental effect on what I am doing but kills the performance of your stored procedure. In this scenario you did the right thing, and I did not necessarily do the wrong thing, but I didn't test all the possible knock-on effects. What if the table I took the index from had 94 stored procedures using it? Do I need to performance test them all? Do I have to understand what the characteristics of poor performance are for each of them?
Ah. Understood. We do, in fact, do that sort of "load" testing for large releases or for releases where we think performance could be an issue. As a bit of a sidebar, we don't allow any Developer to remove indexes, for the very reason you pointed out. We do index reviews on production on a release basis.
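For anyone wanting to do similar reviews, a rough starting point (assuming usage since the last restart is representative) is to compare reads against writes for each index:

-- Indexes with few reads and many writes are candidates for review before removal.
SELECT OBJECT_NAME(ius.object_id)                         AS table_name,
       i.name                                             AS index_name,
       ius.user_seeks + ius.user_scans + ius.user_lookups AS reads,
       ius.user_updates                                   AS writes
FROM   sys.dm_db_index_usage_stats AS ius
JOIN   sys.indexes AS i
       ON  i.object_id = ius.object_id
       AND i.index_id  = ius.index_id
WHERE  ius.database_id = DB_ID()
ORDER BY reads ASC;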
--Jeff Moden
Change is inevitable... Change for the better is not.
February 17, 2016 at 7:16 am
If I were to add a #12, it would be this:
It is entirely possible to optimize the top 10 worst performing queries and still not address the specific performance issue affecting the end user.
When attempting to address a specific performance issue affecting end users, don't rely on the business (this includes the DBA, your manager, or even the CIO) to tell you which stored procedures or SQL select statements need to be optimized. Instead, start by asking them to describe what they are actually observing in the field, or better yet, meet them at their desk and watch them repeat the sequence of steps within the application that lead up to the "slowness" they are observing. Once you've identified the general problem area, then run DMV queries or traces to narrow down the culprit.
Also, keep in mind that queries don't intermittently "run slow". They are either busy doing something or waiting to do something. You optimize a query by reducing the amount of work performed (the number of pages read or written, or calculations performed) and/or by reducing the amount of time spent waiting in a blocked state.
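When the complaint is "it's intermittently slow", a snapshot like the one below, taken while the slowness is being reproduced, usually answers the busy-or-waiting question straight away:

-- What are the active requests doing right now: working, or waiting on what?
SELECT r.session_id,
       r.status,
       r.wait_type,
       r.wait_time AS wait_ms,
       r.blocking_session_id,
       t.text      AS sql_text
FROM   sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE  r.session_id <> @@SPID;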
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
February 17, 2016 at 7:53 am
Jeff Moden (2/17/2016)
...As a bit of a sidebar, we don't allow any Developer to remove indexes for the very reason you pointed out. We do index reviews on production on a release basis.
Very sensible. I expected no less from you. The task should not sit solely with developers. If at all.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!