October 3, 2013 at 8:46 am
We have recently implemented Dynamics CRM and have been working to performance tune all aspects of the product. Yesterday we started getting calls related to slow performance.
I installed a performance analyzer toolset named DynamicsPerf (http://archive.msdn.microsoft.com/DynamicsPerf). This tool captures data from the DMVs every 2 hours and stores it in a database.
While analyzing this issue I noticed that logical reads in this timeframe were through the roof. A query which usually averages 1k - 2k reads was averaging 1.4 million reads. This was happening with a number of different entities.
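For anyone wanting to confirm those numbers outside of DynamicsPerf, the same per-statement read counts can be pulled straight from the plan cache (a sketch using sys.dm_exec_query_stats; note the counters reset when a plan leaves the cache):

```sql
-- Top cached statements by average logical reads per execution
SELECT TOP (20)
    qs.execution_count,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_logical_reads DESC;
```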
Looking at the data DynamicsPerf captured (very small subset attached), it looks like the same statements and execution plans were used before, during, and after this timeframe.
Any insight into what could cause this type of behavior would be much appreciated.
Thanks,
Sam
October 3, 2013 at 9:27 am
Could be a bad parameter sniffing issue where the cached plan was really only the best plan for the initial parameters passed in.
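To illustrate the mechanism (a generic sketch with a hypothetical Orders table, not the actual CRM query): the plan is compiled once for the first parameter value seen, then reused for every later value, selective or not.

```sql
-- First call compiles a plan tuned for a rare CustomerId...
EXEC sys.sp_executesql
    N'SELECT * FROM dbo.Orders WHERE CustomerId = @c',
    N'@c int', @c = 42;   -- few rows: seek + key lookups is cheap

-- ...second call reuses that same cached plan for a common value,
-- where seek + lookup can mean millions of logical reads.
EXEC sys.sp_executesql
    N'SELECT * FROM dbo.Orders WHERE CustomerId = @c',
    N'@c int', @c = 7;    -- many rows, same plan, huge read count
```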
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
October 3, 2013 at 9:42 am
That was my initial thought, but then when I saw that the query plans were the same before, during, and after the issue, I figured that couldn't be the case.
October 3, 2013 at 9:49 am
sam bryant (10/3/2013)
That was my initial thought, but then when I saw that the query plans were the same before, during, and after the issue, I figured that couldn't be the case.
The plan would be the same, it's the run-time parameters that would change which would cause the plan to potentially be suboptimal and cause an increase in reads.
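You can actually see which parameter values a cached plan was compiled for, since they are embedded in the plan XML (a sketch; the ParameterCompiledValue attribute lives in the plan's ParameterList, and the LIKE filter is a placeholder for the statement in question):

```sql
-- Pull the first compile-time parameter value out of cached plans
SELECT
    cp.usecounts,
    qp.query_plan.value(
        'declare namespace p="http://schemas.microsoft.com/sqlserver/2004/07/showplan";
         (//p:ColumnReference/@ParameterCompiledValue)[1]',
        'nvarchar(200)') AS first_compiled_value,
    st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE N'%Orders%';  -- hypothetical filter
```

Comparing those compiled values against the run-time values from your trace would show whether the plan was built for an unrepresentative parameter.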
Jack Corbett
Consultant - Straight Path Solutions
October 3, 2013 at 10:02 am
I started up a trace yesterday to capture some of the statements. If I take one of the statements that was causing 1.4 million reads and run it today with the same parameter values, the reads are only around 300.
Would it be related to different estimates vs. actuals in the query plan? I am not even close to a performance expert, so I am just trying to wrap my head around this at the moment.
Thanks for the responses Jack.
October 4, 2013 at 2:22 am
October 4, 2013 at 6:51 am
sam bryant (10/3/2013)
I started up a trace yesterday to capture some of the statements. If I take one of those statements which was causing 1.4 million reads, with the same parameter values and run it the amount of reads are only around 300 today.
Would it be related to different estimates vs. actuals in the query plan? I am not even close to a performance expert, so I am just trying to wrap my head around this at the moment.
Thanks for the responses Jack.
You also need to make sure the SET options are the same for the application and SSMS (or whatever you are running the statements with), as different SET options result in separate cached plans, which could be why it behaves differently when you run it in SSMS. See this article by Erland Sommarskog: http://www.sommarskog.se/query-plan-mysteries.html.
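The SET options a plan was compiled under are recorded in the plan cache, so you can check directly whether the application and your SSMS session are sharing a plan (a sketch; the LIKE filter is a placeholder for the statement in question):

```sql
-- Each distinct SET-options bitmask gets its own cached plan, e.g.
-- SSMS defaults to ARITHABORT ON while ADO.NET clients default to OFF,
-- so identical text can compile two different plans.
SELECT
    st.text,
    cp.usecounts,
    pa.value AS set_options_bitmask
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY sys.dm_exec_plan_attributes(cp.plan_handle) AS pa
WHERE pa.attribute = 'set_options'
  AND st.text LIKE N'%<statement of interest>%';
```

Two rows with different bitmasks for the same text means SSMS was not using the application's plan, so a fast run in SSMS proves little about the cached plan the app is hitting.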
Jack Corbett
Consultant - Straight Path Solutions
October 4, 2013 at 9:51 am
Aren't the statistics outdated?
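That's easy to rule in or out: the age and sampling of each statistic can be checked directly (a sketch against a hypothetical dbo.Orders table; sys.dm_db_stats_properties requires SQL Server 2008 R2 SP2 / 2012 SP1 or later):

```sql
-- When were statistics last updated, and how many row
-- modifications have accumulated since?
SELECT
    s.name AS stats_name,
    sp.last_updated,
    sp.rows,
    sp.rows_sampled,
    sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.Orders');
```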
-------Bhuvnesh----------
I work only to learn Sql Server...though my company pays me for getting their stuff done;-)