June 18, 2012 at 12:41 pm
That makes sense. The update is triggered only when a query is executed against them. Thanks.
June 19, 2012 at 7:48 am
Indianrock (1/19/2010)
We have some tables with millions of records in them. What we seem to be finding is that, in general, sampled updates of statistics create problems. Yesterday the application was trying to run a select statement that was timing out. A look at the execution plan showed that the actual number of records returned and the number the statistics/execution plan estimated were very, very different.
I had run sp_updatestats on the whole database last Thursday night (19 minutes), and then our weekly job ran Saturday night -- the latter does a fullscan update of statistics, starting with the indexes with the most row modifications, and is limited to 3.5 hours. It updated about 150 statistics out of about 1100.
We have auto update on, auto create on, and also auto update async on. The last one surprised me when I found it on, but our database architect apparently turned that on quite a while back ( default is off ).
If you have 10 million records in a table, then auto update would fire after about 2 million row mods -- that's a long time to wait.
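The ~2 million figure matches the classic auto-update threshold for large tables: statistics are invalidated after 500 modifications plus 20% of the table's rows (newer versions, or trace flag 2371 on older ones, use a smaller dynamic threshold for big tables). A quick sketch of that arithmetic:

```python
def auto_update_threshold(row_count: int) -> int:
    """Classic SQL Server auto-update-statistics threshold for a table
    with more than 500 rows: 500 modifications plus 20% of the rows."""
    return 500 + int(0.20 * row_count)

# For a 10-million-row table, auto update doesn't fire until
# roughly 2 million rows have been modified.
print(auto_update_threshold(10_000_000))  # -> 2000500
```

This is why large tables can run on badly stale statistics for a long time unless you update them proactively.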
I feel you have a REALLY crappy IO subsystem (and possibly a grossly underpowered server, period) if you can only update 150 statistics in 3.5 hours!!! That, or you have an incredibly busy machine (well, in that case it needs to be beefier so you can handle things like stats updates more effectively).
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service