September 29, 2016 at 10:18 am
Hello everyone,
Probably a simple question, but it has me thinking: if I have a database where most or all indexes are above 60% fragmented, BUT the page count is below 50, would it matter or make a difference in performance to do maintenance and rebuild them, or would it not matter because they're so small?
thanks in advance
September 29, 2016 at 10:26 am
The general guideline is to ignore tables under 1000 pages. Fragmentation only affects large range scans from disk, not general operations against pages in the buffer pool.
Make sure you're doing stats maintenance though.
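For reference, a rough sketch of that kind of check (sys.dm_db_index_physical_stats in LIMITED mode; the 1000-page and 30% numbers are just the usual guideline values, not hard rules, so adjust to taste):

-- Sketch only: list indexes big and fragmented enough to be worth rebuilding
SELECT
    OBJECT_SCHEMA_NAME(ips.object_id) AS schema_name,
    OBJECT_NAME(ips.object_id)        AS table_name,
    i.name                            AS index_name,
    ips.page_count,
    ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.object_id = ips.object_id
   AND i.index_id  = ips.index_id
WHERE ips.page_count >= 1000                     -- ignore small indexes
  AND ips.avg_fragmentation_in_percent >= 30     -- common rebuild threshold
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Stats maintenance is a separate task from defragmentation, e.g.:
-- UPDATE STATISTICS dbo.YourTable WITH FULLSCAN;  -- per table (YourTable is a placeholder)
-- EXEC sys.sp_updatestats;                        -- or database-wide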
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 29, 2016 at 11:44 am
Thanks, Gila, for the reply. Just wondering, why would we ignore anything with a page count under 1000? Again, thanks for the info.
September 29, 2016 at 12:16 pm
You can ignore anything under 1000 pages because the performance impact at that size is negligible. Understanding what fragmentation is will help you see why this is so, but basically it's because there are fewer pages to read through.
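For a sense of scale (assuming the standard 8 KB SQL Server page size): 1000 pages x 8 KB = 8000 KB, roughly 8 MB. Scanning that much data is trivial whether the pages are in order or not, so defragmenting it buys you essentially nothing.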
Looking through this blog post:
will explain the numbers, and reading through the rest of the blog is a great way to gather more info on fragmentation.
September 29, 2016 at 3:47 pm
Just be careful with how and where the 1000 page count is checked. I've reviewed a few automated optimisation routines, and a lot of them check the page count at the index level rather than at the table (data) level. When that happens, tables well over 1000 pages end up with their smaller indexes never being optimised, and I've seen poor performance as a result.
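As a sketch of what I mean (purely illustrative, not taken from any particular routine): comparing each index's own page count against the table-level total shows which small indexes would slip under a per-index 1000-page filter.

-- Per-index page counts next to the table-level total
SELECT
    OBJECT_SCHEMA_NAME(ps.object_id) AS schema_name,
    OBJECT_NAME(ps.object_id)        AS table_name,
    i.name                           AS index_name,
    ps.used_page_count               AS index_pages,
    SUM(ps.used_page_count) OVER (PARTITION BY ps.object_id) AS table_pages
FROM sys.dm_db_partition_stats AS ps
JOIN sys.indexes AS i
    ON i.object_id = ps.object_id
   AND i.index_id  = ps.index_id
WHERE OBJECTPROPERTY(ps.object_id, 'IsUserTable') = 1
ORDER BY table_pages DESC, index_pages DESC;

Rows where table_pages is well over 1000 but index_pages is under it are exactly the ones a per-index filter skips.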
Leo
Nothing in life is ever so complicated that with a little work it can't be made more complicated.