I have a third-party app and database that require all statistics to be updated with a full scan (100% sample rate). I have a maintenance plan that does this every weekend, but I've found that something is erasing my work by updating stats with a small sample, less than 1%, on tables with 100+ million records. This is causing performance issues.
Using sp_BlitzFirst (thanks, Brent) and sys.dm_db_stats_properties, I have found that the stats for many tables are being updated at all hours of the day, even during off-peak times. I've been using SQL Profiler to track any events that might be triggered by one of these updates, but nothing yet. How can I find what is causing these bad stats?
Auto update stats is enabled, but I doubt the tables are changing that much. I can disable it to rule that out, though.
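A query along these lines against sys.dm_db_stats_properties (the table name is a placeholder) is what shows me the low sample rates and how much each table has actually changed since the last update:

```sql
-- Sample rate and churn for every statistic on one table
-- (replace dbo.BigTable with the table in question).
SELECT s.name                 AS stats_name,
       sp.last_updated,
       sp.rows,
       sp.rows_sampled,
       CAST(100.0 * sp.rows_sampled / NULLIF(sp.rows, 0)
            AS DECIMAL(5, 2)) AS sample_pct,
       sp.modification_counter
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.BigTable')
ORDER BY sp.last_updated DESC;
```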
Server is 2016 SP2 CU13
May 20, 2021 at 2:46 am
SQL itself will automatically update stats when it determines that it needs to. By default, that will use sampling.
You can force SQL to stick with a FULLSCAN unless someone explicitly specifies a sampling percentage by issuing this command once:
UPDATE STATISTICS table_or_indexed_view_name
WITH FULLSCAN,
     PERSIST_SAMPLE_PERCENT = ON; -- <<-- add this to the command!
Note: if the table is truncated, SQL will fall back to sampling, so you'll need to run the command above again.
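You can also confirm the setting stuck: on 2016 SP1 CU4 and later, sys.dm_db_stats_properties exposes a persisted_sample_percent column. Something like this (table name is a placeholder) should show 100 once the command above has run:

```sql
-- Check the persisted sample rate: 100 means automatic updates
-- will reuse FULLSCAN (dbo.BigTable is a placeholder).
SELECT s.name AS stats_name,
       sp.last_updated,
       sp.persisted_sample_percent
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID(N'dbo.BigTable');
```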
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
If you use the auto_stats event in Extended Events, you get the specific reason for the statistics update. That will let you differentiate between what's occurring automatically and what's being caused by some other process. You can also see the sampling rate and a whole bunch of other information: object_id, database_name, and more. If you want to capture the specific query that is firing a non-automatic statistics update, you'd need to look at sql_batch_completed and filter on the UPDATE STATISTICS command. For procedures, look through your procedures for UPDATE STATISTICS commands and capture those procs, filtered on object_id. Use causality tracking to be able to correlate the queries with the statistics updates.
The only issue is, you won't know which application is issuing these commands unless the application name is included in the connection string. You will, however, be able to tell the login and user context under which the commands run. That's always available.
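A minimal event session along these lines would capture that (the session and file names are placeholders; add or remove actions to taste):

```sql
-- Sketch of an Extended Events session for tracking stats updates.
CREATE EVENT SESSION [TrackStatsUpdates] ON SERVER
ADD EVENT sqlserver.auto_stats
(
    ACTION (sqlserver.database_name,
            sqlserver.client_app_name,       -- only useful if the app sets it
            sqlserver.server_principal_name, -- login issuing the command
            sqlserver.sql_text)
)
ADD TARGET package0.event_file (SET filename = N'TrackStatsUpdates')
WITH (TRACK_CAUSALITY = ON); -- lets you tie queries to the stats updates

ALTER EVENT SESSION [TrackStatsUpdates] ON SERVER STATE = START;
```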
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
May 20, 2021 at 9:00 pm
One easy way to find out if autostats is the culprit is to... temporarily turn it off. 😉
--Jeff Moden
Change is inevitable... Change for the better is not.