January 24, 2017 at 5:53 am
So I have been doing quite a bit of research on statistics and trying to figure out the best way to keep them up to date.
I currently have AUTO_UPDATE_STATISTICS on, and every Sunday I do a full-scan update on any stats that have not been updated for over a day and have modified rows.
I want to change this so it is more granular.
I am looking for advice on the thresholds for triggering a full-scan update on a statistic, based on the table's row count and the percentage of rows changed since the last update.
I will only update stats that have not already been updated that day, and I am thinking something along the lines of:
< 10,000 rows and percent_change > 40
10,000 - 100,000 rows and percent_change between 30 and 40
> 100,000 rows and percent_change > 20
This is a data warehouse system; if I get this right, it may lead me to disable AUTO_UPDATE_STATISTICS entirely.
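To make that concrete, here is a minimal sketch of the selection query I have in mind (it assumes SQL Server 2012 SP1 or later for sys.dm_db_stats_properties, and the thresholds are the ones above, which may still change):

-- Sketch only: list statistics that cross the proposed thresholds.
SELECT OBJECT_NAME(s.[object_id]) AS table_name,
       s.name                     AS stats_name,
       sp.[rows],
       sp.modification_counter,
       100.0 * sp.modification_counter / NULLIF(sp.[rows], 0) AS percent_change
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.[object_id], s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.[object_id], 'IsUserTable') = 1
  AND sp.last_updated < CAST(GETDATE() AS date)   -- skip stats already updated today
  AND sp.modification_counter > 0
  AND (   (sp.[rows] < 10000   AND 100.0 * sp.modification_counter / NULLIF(sp.[rows], 0) > 40)
       OR (sp.[rows] BETWEEN 10000 AND 100000
           AND 100.0 * sp.modification_counter / NULLIF(sp.[rows], 0) BETWEEN 30 AND 40)
       OR (sp.[rows] > 100000 AND 100.0 * sp.modification_counter / NULLIF(sp.[rows], 0) > 20));

Each row returned would then feed a dynamic UPDATE STATISTICS [table] ([stat]) WITH FULLSCAN statement.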
January 24, 2017 at 6:11 am
I know that this is probably not what you were expecting, but I don't update statistics on every table that had modifications. I have a daily job and a weekly job that update specific statistics on key tables where I've noticed that, if I don't, we get bad query plans, but that is only a handful of tables. For most tables the automatic updates are good enough. One thing I have to admit: I work on OLTP databases and not on a DWH.
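To give an idea, those jobs are just targeted UPDATE STATISTICS calls, something like the following (the table and statistic names here are made up for illustration):

-- Hypothetical key tables; FULLSCAN is affordable when only a handful of tables are involved.
UPDATE STATISTICS dbo.Orders IX_Orders_OrderDate WITH FULLSCAN;
UPDATE STATISTICS dbo.OrderLines IX_OrderLines_OrderID WITH FULLSCAN;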
Adi
--------------------------------------------------------------
To know how to ask questions and increase the chances of getting answers:
http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions, click on the following...
http://www.sqlservercentral.com/articles/SQLServerCentral/66909/