December 23, 2014 at 6:58 am
Hi All,
I'm working on databases where the statistics on some indexes (tables) change very frequently.
Once I update them manually, they accumulate a 10-20% change within a minute, and over 100% within five minutes. The tables are updated very frequently (multiple times per second).
When I query sys.stats, sys.dm_db_stats_properties and other dynamic management views, I see the statistics were last updated when I did it manually, even though the modification count has passed the 500 + 20%-of-rows threshold (the tables have tens of thousands of rows). Auto create and auto update statistics are set to true on all the databases, and I don't know why SQL Server doesn't update them automatically.
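For reference, this is roughly the check I run against the DMV (a sketch; dbo.YourTable is a placeholder name):

```sql
-- Compare the modification counter against the pre-2016 auto-update
-- threshold of 500 + 20% of the table's rows.
SELECT  OBJECT_NAME(s.object_id)          AS table_name,
        s.name                            AS stats_name,
        sp.last_updated,
        sp.rows,
        sp.modification_counter,
        500 + 0.20 * sp.rows              AS auto_update_threshold
FROM    sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE   s.object_id = OBJECT_ID('dbo.YourTable');
```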
How would you handle such a situation...?
Thanks in advance
Igor Micev,My blog: www.igormicev.com
December 23, 2014 at 7:02 am
Stats only get updated when they are needed. If no query is requesting those stats for optimisation purposes, they won't get updated no matter how out of date they are. When a query does need them, they'll get updated.
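To illustrate (a sketch with placeholder names): nothing happens at modification time; the update fires when a query that needs those statistics is compiled.

```sql
-- Compiling a query that uses the column triggers the auto-update,
-- assuming the modification threshold has already been crossed.
SELECT COUNT(*) FROM dbo.YourTable WHERE SomeColumn = 42;

-- Then last_updated should move and modification_counter should reset:
SELECT sp.last_updated, sp.modification_counter
FROM   sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE  s.object_id = OBJECT_ID('dbo.YourTable');
```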
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 24, 2014 at 3:24 am
Updating the stats across all the databases takes 20-30 seconds. Apart from schema stability locks, no other locks are taken on the tables, which is acceptable.
The application queries improved a bit once I implemented a job that updates the stats every 5 minutes. However, I'll have to investigate them further.
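The job step is roughly the following (a sketch; the table names are placeholders, and sampled updates are used because FULLSCAN every few minutes would be too expensive):

```sql
-- Update only the known hot tables, with sampling to keep the cost down.
UPDATE STATISTICS dbo.HotTable1 WITH SAMPLE 25 PERCENT;
UPDATE STATISTICS dbo.HotTable2 WITH SAMPLE 25 PERCENT;

-- Alternatively, across the whole database: sp_updatestats runs a
-- sampled update only on statistics whose rows have been modified.
EXEC sys.sp_updatestats;
```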
Can you mention some possible cons of running the job that frequently, so I can be aware of them?
Thanks!
Igor Micev,My blog: www.igormicev.com