January 28, 2013 at 5:26 am
Grant Fritchey (1/28/2013)
You can set a traceflag, 2371, to change this behavior. Here are the details.
For SQL 2008 R2 and above only, unfortunately.
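For anyone following along, a minimal sketch of turning it on (assumes sysadmin rights; with 2371 the auto-update threshold becomes dynamic, roughly the square root of 1000 * rows for large tables, instead of the fixed 20% + 500 rows):

-- Enable trace flag 2371 for the whole instance (-1 = global).
DBCC TRACEON (2371, -1);

-- Check that it is active.
DBCC TRACESTATUS (2371);

-- To have it survive restarts, add -T2371 as a SQL Server
-- startup parameter instead of running DBCC TRACEON.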
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 29, 2013 at 12:48 am
GilaMonster (1/28/2013)
Grant Fritchey (1/28/2013)
You can set a traceflag, 2371, to change this behavior. Here are the details.
For SQL 2008 R2 and above only, unfortunately.
Grant/Gail, it's a very good article, and a great feature provided by MS. However, as it mentions:
"The downside is that updating the statistics results in recompilation of the queries accessing the table. This again can increase the risk of getting a different query plan for the next executions of queries against those tables. On the other side, the dynamic threshold to trigger update statistics should address issues encountered like not finding a new month or new fiscal year for month end reporting in the statistics and hence choosing a sub-optimal plan."
Don't you think that once the statistics are updated, even if a new query plan is created, it will be optimal?
And secondly, what is meant by issues like "not finding a new month or new fiscal year for month end reporting in the statistics and hence choosing a sub-optimal plan"?
January 29, 2013 at 1:28 am
sqlnaive (1/29/2013)
Don't you think that once the statistics are updated, even if a new query plan is created, it will be optimal?
Maybe, maybe not. If all the stats are up to date and the optimiser has enough time then it should be good enough.
And secondly, what is meant by issues like "not finding a new month or new fiscal year for month end reporting in the statistics and hence choosing a sub-optimal plan"?
That's the classic ascending-key problem. If the statistics were last updated before the new month's data was loaded, the histogram ends at the old maximum date, so a query filtering on the new month estimates almost no rows and the optimiser picks a plan suited to a tiny result set.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 29, 2013 at 4:23 am
Hello again,
I tried updating the statistics using sp_updatestats. Well, now it is worse, because executing the stored procedure code is also slow now. It's good that it now behaves consistently, but bad that the problem is not solved yet...
___________________________________________________________________
It's better to shut up and look stupid than to open your mouth and eliminate any doubt
January 29, 2013 at 4:34 am
sp_updatestats does sampled updates. That may not be good enough.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 29, 2013 at 4:42 am
Here's how to manually update statistics. You'll need to use the WITH FULLSCAN option. This documentation is for 2005, but the 2000 syntax is the same (if I remember correctly).
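For example, something like this (the table and index names here are made up):

-- Full-scan update of all statistics on one table; sampled
-- updates can miss skew on large tables.
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;

-- Or target a single statistics object or index by name.
UPDATE STATISTICS dbo.BigTable IX_BigTable_LoadDate WITH FULLSCAN;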
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
January 29, 2013 at 5:08 am
Finally, I got to a solution. It's not a nice solution (old-fashioned style), but it works. I have clear criteria for selecting the data from the three major tables, so I decided to isolate the raw data from these tables before starting the processing (sums, groups, data conversion, etc.). Now it works perfectly.
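In rough outline it looks like this (table and column names invented for illustration; the real criteria are specific to my process):

-- Stage only the rows matching the known selection criteria.
-- The temp table gets fresh statistics of its own, so the
-- later aggregation steps see accurate row estimates.
DECLARE @StartDate datetime;
SET @StartDate = '20130101';

SELECT key_col, amount
INTO #RawData
FROM dbo.BigTable1
WHERE load_date >= @StartDate;

-- ... same for the other two major tables ...

-- Then run the sums, groups and conversions against #RawData.
SELECT key_col, SUM(amount) AS total
FROM #RawData
GROUP BY key_col;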
Thank you all for the interest and the hints given.
Best regards
___________________________________________________________________
It's better to shut up and look stupid than to open your mouth and eliminate any doubt
January 29, 2013 at 5:36 am
Grant, the link was really great; it shows the issues practically. The fix provided was:
"The fix isnโt hard, a scheduled statistics update, maybe daily depending on when data is loaded and what the queries filter for, fixes this completely. The trick is often realising that it is necessary."
Interestingly, this situation is similar to our process, but with a small difference. Our process starts with data processing on Sybase, and then that processed data is bcp'd into SQL Server tables. Once that is done, a whole new set of processing starts in SQL Server. However, our challenge is that the stale stats on these tables hit us in the middle of the process, and there is no way we can update the statistics in the middle of the whole process. I hope I have not confused you guys. Is there anything I can do in my scenario?
January 29, 2013 at 5:39 am
sqlnaive (1/29/2013)
there is no way we can update the statistics in the middle of the whole process.
Why not?
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 29, 2013 at 5:40 am
GilaMonster (1/29/2013)
sqlnaive (1/29/2013)
there is no way we can update the statistics in the middle of the whole process.
Why not?
+1
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
January 29, 2013 at 5:55 am
Grant is right. You can fire UPDATE STATISTICS commands from within a stored procedure. They'll execute, and subsequent statements will get the benefit of fresh, up-to-date statistics.
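Something along these lines (object names are just examples):

CREATE PROCEDURE dbo.ProcessMonthEnd
AS
BEGIN
    SET NOCOUNT ON;

    -- Step 1: the heavy data load happens here ...

    -- Step 2: refresh the stats the later queries depend on.
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;

    -- Step 3: the statements below compile after the update,
    -- so their row estimates include the newly loaded data.
    SELECT month_id, SUM(amount) AS total
    FROM dbo.BigTable
    GROUP BY month_id;
END;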
January 29, 2013 at 10:04 pm
Gail/Grant/Ed, I agree with you that it can be done.
However, at times these stupid policies made by self-acclaimed DBA legends work against us. UPDATE STATISTICS needs ALTER permission on the target tables, which is a DDL permission. Unfortunately, in our organization there is a strict policy not to grant DDL permissions to anyone other than DBAs (not even to the application account through which the processing is done).
January 29, 2013 at 11:49 pm
Put in a request for an exception to that policy (you have a damn good reason).
What I suggest is a procedure with EXECUTE AS a higher-privileged user (perhaps even a login-less user), where that proc just updates the necessary stats. That way the app doesn't get any ALTER permissions, and you can still call something in the middle of your execution that updates the necessary stats.
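A rough sketch of that approach (all names here are placeholders):

-- Login-less user that carries the ALTER permission.
CREATE USER StatsUpdater WITHOUT LOGIN;
GRANT ALTER ON OBJECT::dbo.BigTable TO StatsUpdater;
GO

-- The proc runs as that user, so callers never hold ALTER themselves.
CREATE PROCEDURE dbo.RefreshProcessStats
WITH EXECUTE AS 'StatsUpdater'
AS
BEGIN
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;
END;
GO

-- The application account only ever gets EXECUTE on the proc.
GRANT EXECUTE ON dbo.RefreshProcessStats TO AppAccount;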
If they say no, then you put it into a document that the proc is slow due to the stats, that those cannot be updated due to policy, and that therefore the procedure will have to remain slow (after you test that that is indeed the case).
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
January 30, 2013 at 12:25 am
Radu Costescu (1/29/2013)
Finally, I got to a solution. It's not a nice solution (old-fashioned style), but it works. I have clear criteria for selecting the data from the three major tables, so I decided to isolate the raw data from these tables before starting the processing (sums, groups, data conversion, etc.).
Does it mean that you added an intermediate step where you pulled the large-table data out into a temp table and then used it in further steps?
-------Bhuvnesh----------
I work only to learn SQL Server... though my company pays me for getting their stuff done ;-)
January 30, 2013 at 1:39 am
Bhuvnesh (1/30/2013)
Does it mean that you added an intermediate step where you pulled the large-table data out into a temp table and then used it in further steps?
Yes, that is exactly what I did.
___________________________________________________________________
It's better to shut up and look stupid than to open your mouth and eliminate any doubt