August 24, 2018 at 1:03 am
Databases can grow really huge, and outdated statistics can cause serious performance issues. I'd like to know: is it good practice to set AUTO_UPDATE_STATISTICS ON?
If so, is it better to update asynchronously (_ASYNC) or synchronously before the execution plan is generated? I suspect that updating statistics before every
query execution could cost too much time.
And when should sp_updatestats be used instead?
Thanks for your time!
ALTER DATABASE [dbname]
SET AUTO_UPDATE_STATISTICS ON
August 24, 2018 at 1:19 am
Hi,
Hi,
As I understand it, running sp_updatestats on a schedule (maybe weekly or daily) is a good option for an OLAP environment where the data is mostly static and not updated frequently. But in an OLTP environment where data is updated frequently, we should go for AUTO_UPDATE_STATISTICS_ASYNC if we must not make query execution wait. Async allows the query to execute with the existing statistics and updates them afterwards, whereas AUTO_UPDATE_STATISTICS (synchronous) updates the statistics during plan generation, which delays the query.
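As a sketch of the above, the async option is enabled per database and only takes effect when the base auto-update option is also on. `[dbname]` is a placeholder; substitute your own database name.

```sql
-- AUTO_UPDATE_STATISTICS_ASYNC has no effect unless AUTO_UPDATE_STATISTICS is also ON.
ALTER DATABASE [dbname] SET AUTO_UPDATE_STATISTICS ON;
ALTER DATABASE [dbname] SET AUTO_UPDATE_STATISTICS_ASYNC ON;

-- Verify the current settings:
SELECT name, is_auto_update_stats_on, is_auto_update_stats_async_on
FROM sys.databases
WHERE name = N'dbname';
```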
August 24, 2018 at 1:32 am
debasis.yours - Friday, August 24, 2018 1:19 AM
thanks
August 27, 2018 at 1:56 am
It's usually recommended to leave this on, yes.
While it is generally not a good thing to have your SQL Server immerse itself in a statistics update right in the middle of the day, running with stale statistics over any significant period of time is not a good option either, so "the right thing to do" is to find a balance. Preferably, you'll run periodic scheduled jobs at times you pick, and keep auto update enabled as a backup for those special occasions when the periodic schedule is not enough.
That said, there really is no straight and simple "one size fits all" answer to how you should handle your statistics maintenance operations.
Basically, this is a delicate balancing act where you need to be at least marginally familiar with your actual data and usage patterns.
On the one side, you do need to keep your stats in tune with your data, or performance will suffer as your query optimiser stumbles around in the dark taking wrong turns and making bad choices in execution plans because the maps it uses to navigate your DB simply do not match the terrain.
On the other hand, updating the stats takes time, consumes resources - and forces all affected execution plans to be recompiled - so you generally don't want to run this unless you need to. Also, keep in mind that there are two different types of statistics - index statistics and column statistics - and that if you rebuild indexes, that job will also take care of the index stats for you, so your "statistics" job will only need to update the column stats.
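To illustrate the point about the two statistics types: T-SQL's UPDATE STATISTICS command accepts a COLUMNS option to restrict the update to column statistics, which is what a post-index-rebuild job would want. The table name below is a placeholder, not from the thread.

```sql
-- An index rebuild refreshes index statistics with a full scan as a side effect,
-- so a separate statistics job only needs to refresh the column statistics.
UPDATE STATISTICS dbo.YourTable WITH COLUMNS;
```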
Vegard Hagen
Norwegian DBA, occasional blogger and generally a nice guy who believes the world is big enough for all of us.
@vegard_hagen on Twitter
Blog: Vegard's corner (No actual SQL stuff here - haven't found my niche yet. Maybe some day...)
August 27, 2018 at 7:07 pm
Shifting gears a bit... I've seen lots of people recommend updating stats based on how long it's been since the stats were last updated. I don't understand that recommendation, because if you have a reference table (for example) that hasn't had any changes made to its data in a decade, do you think the existing stats from 10 years ago will be any different from the stats produced if you rebuild them today?
IMHO, updating stats based on age is a total waste of time.
I also join the others who have recommended that you leave auto update of stats turned on. It covers the "edge cases" that your stats maintenance might have missed, especially for ad hoc queries.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 29, 2018 at 1:34 am
Just for the record, I agree 100% with Jeff on the basic point that the age of your existing statistics has precisely nothing to do with whether you need to update them or not.
Any fixed schedule should be set based on your actual knowledge of your actual data and usage patterns so as to sort things out a little before they start causing problems.
Also, it is possible to make these jobs "intelligent" in that they'll actually check how many modifications have been made before deciding whether to actually go ahead with an update.
I can recommend Ola Hallengren's maintenance solution, specifically the index and statistics maintenance section, but of course other options do exist and you're free to tinker around and make your own scripts or versions of scripts as needed.
The following SQL code lists the existing stats in a database where more than 1000 data modifications have occurred since the last stats update, sorted in descending order on modification_counter. It also shows the time of the last update for each of the stats.
SELECT obj.name AS object_name, obj.object_id, stat.name AS stats_name, stat.stats_id,
       sp.last_updated, sp.modification_counter
FROM sys.objects AS obj
INNER JOIN sys.stats AS stat ON stat.object_id = obj.object_id
CROSS APPLY sys.dm_db_stats_properties(stat.object_id, stat.stats_id) AS sp
WHERE sp.modification_counter > 1000
ORDER BY sp.modification_counter DESC;
Vegard Hagen
Norwegian DBA, occasional blogger and generally a nice guy who believes the world is big enough for all of us.
@vegard_hagen on Twitter
Blog: Vegard's corner (No actual SQL stuff here - haven't found my niche yet. Maybe some day...)