September 28, 2005 at 1:58 pm
I am running SQL Server 2000 on Win/2K, and have a few questions on statistics:
I just ran UPDATE STATISTICS in the database and noticed that a few tables have STATSDATE as NULL. Does that mean the server did not update statistics for those tables even though I ran UPDATE STATISTICS?
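For reference, this is the sort of check that shows those NULL dates (the table name is a placeholder; STATS_DATE returns when an index or statistic was last updated):

SELECT name, STATS_DATE(id, indid) AS last_updated
FROM sysindexes
WHERE id = OBJECT_ID('MyTable')   -- placeholder table name
  AND indid BETWEEN 1 AND 254     -- skip the heap (0) and TEXT/IMAGE (255) rows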
I also noticed that the used, dpages and reserved columns in sysindexes show very high values for column statistics. For example,
Name: _WA_Sys_AACTG_IE_CREDIT_0A9D95DB
Used: 4231989
dpages: 805019132
Reserved: 4231989
Is this normal behavior for SQL Server 2000?
Thanks,
Jay
September 29, 2005 at 6:32 am
I believe statistics are gathered when a table is queried, so that subsequent queries against it can be optimized using that information;
could it be that the tables with no statistics either have no rows or have never been queried?
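One quick way to verify the settings involved (the database name is a placeholder):

SELECT DATABASEPROPERTYEX('MyDatabase', 'IsAutoCreateStatistics') AS auto_create,
       DATABASEPROPERTYEX('MyDatabase', 'IsAutoUpdateStatistics') AS auto_update

If auto create is on, column statistics (the _WA_Sys_... entries) appear the first time a column is used in a WHERE clause or join predicate.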
Lowell
September 29, 2005 at 2:08 pm
If you execute UPDATE STATISTICS for a table, you also need to execute sp_recompile for that table. If you do not, the cached execution plan(s) of the stored procedures that reference that table will never pick up the fresh statistics, and the optimizer cannot do its job well!
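As a rough sketch (the table name is a placeholder):

UPDATE STATISTICS dbo.MyTable     -- refresh the statistics
EXEC sp_recompile 'dbo.MyTable'   -- flag dependent plans for recompilation

Note that sp_recompile does not recompile anything on the spot; it marks the plans so they are rebuilt the next time they execute.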
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."
September 29, 2005 at 3:08 pm
Thanks for the reply.
I am debating whether to collect statistics with "UPDATE STATISTICS <name> WITH FULLSCAN" or with a relatively small sample size. We have a few large tables, and the UPDATE STATISTICS command could take some time. Are there any recommendations on the most appropriate sample size for the UPDATE STATISTICS command? Of course, the right SAMPLE value depends on one's environment, but have you had any experience where UPDATE STATISTICS WITH FULLSCAN caused the Query Optimizer to choose a better execution plan, and hence worked better?
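For concreteness, these are the two variants I am weighing (the table name is a placeholder):

UPDATE STATISTICS dbo.BigTable WITH FULLSCAN            -- reads every row
UPDATE STATISTICS dbo.BigTable WITH SAMPLE 20 PERCENT   -- reads roughly a fifth of the data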
Second question: the database has been configured with auto update of statistics. If statistics were collected with the FULLSCAN option, would the SQL Server engine use FULLSCAN whenever it needs to update the statistics automatically?
Thanks
September 29, 2005 at 3:29 pm
First, sample size - the more complete the information, the better for the optimizer, meaning yes, more efficient plans. However, there is a trade-off between size and time. My largest database has 100 GB of data in it and a fairly complete SQL maintenance plan for optimizations that executes in about 3 1/2 hours (it includes sampling 100% of the data for UPDATE STATISTICS). So the trade-off point is size vs. time in your environment. If you opt for a homegrown script (which we are moving to soon), then things can be staggered, giving more flexibility in scheduling and execution than sqlmaint.exe provides.
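A bare-bones sketch of such a homegrown script (an assumption about its shape, not our actual script; it walks every user table and does a full scan on each, assuming dbo-owned tables):

DECLARE @tbl sysname
DECLARE tbls CURSOR FOR
  SELECT name FROM sysobjects WHERE type = 'U' ORDER BY name
OPEN tbls
FETCH NEXT FROM tbls INTO @tbl
WHILE @@FETCH_STATUS = 0
BEGIN
  EXEC ('UPDATE STATISTICS [' + @tbl + '] WITH FULLSCAN')
  FETCH NEXT FROM tbls INTO @tbl
END
CLOSE tbls
DEALLOCATE tbls

Staggering the work is then just a matter of filtering or ordering that SELECT.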
As for the second question, I really do not know definitively. All of my databases are set in this fashion, but I think it does some sampling, not complete sampling. You can track this in Perfmon (the counter name eludes me at present); there is a cyclical jump when the server is checking this.
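One thing worth checking is sp_autostats, which reports (and can toggle) the automatic-update flag for each statistic on a table (the table name is a placeholder):

EXEC sp_autostats 'dbo.BigTable'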
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."