July 1, 2004 at 11:53 am
With the following table info:
DBCC SHOWCONTIG scanning 'Invoice_Details' table...
Table: 'Invoice_Details' (2133582639); index ID: 0, database ID: 7
TABLE level scan performed.
- Pages Scanned................................: 507855
- Extents Scanned..............................: 63493
- Extent Switches..............................: 63492
- Avg. Pages per Extent........................: 8.0
- Scan Density [Best Count:Actual Count].......: 99.98% [63482:63493]
- Extent Scan Fragmentation ...................: 1.63%
- Avg. Bytes Free per Page.....................: 1426.8
- Avg. Page Density (full).....................: 82.37%
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
7117725 rows
If I add a tinyint column to this table (as the last column), would you expect any fragmentation or other damage? Here's how I'm looking at it:
7117725 rows / 507855 pages = 14 rows per page
Since there are on average 1427 free bytes per page, no additional pages should be needed. Therefore, no additional fragmentation.
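Put another way (just a sketch to sanity-check the arithmetic; the one extra byte per row for the tinyint is my assumption, and it ignores any null-bitmap growth):
-- Sketch: plug in the SHOWCONTIG figures and see how much room the new column would need
DECLARE @rows bigint, @pages int, @freeBytesPerPage decimal(9,1)
SELECT @rows = 7117725, @pages = 507855, @freeBytesPerPage = 1426.8
SELECT @rows / @pages        AS avg_rows_per_page,       -- ~14
       (@rows / @pages) * 1  AS extra_bytes_per_page,    -- tinyint = 1 byte per row
       @freeBytesPerPage     AS avg_free_bytes_per_page  -- ~1427, so plenty of headroom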
This is the first time I've tried to analyze this type of situation, so I could really be off base!
Thanks
Don
http://www.biadvantage.com
July 1, 2004 at 5:13 pm
July 1, 2004 at 5:22 pm
I understand that there is very little fragmentation. My question is: since there are 1427 free bytes per page and only 14 rows per page, can I add a tinyint column without affecting fragmentation? If there were only one free byte per page, I would expect fragmentation to increase. But since so many bytes are available per page, I would expect the new column's values to fit into the existing pages.
As I stated, this is my first foray into this type of freespace analysis with SQL Server 2000. I'm not even sure that I'm approaching the issue properly.
Don
http://www.biadvantage.com
July 1, 2004 at 10:43 pm
I don't believe adding a column to a table with existing data can cause any level of fragmentation. I just had a thorough look in BOL and some other sources and found no mention of fragmentation being caused by adding columns.
From there, I decided to test it out. I ran DBCC SHOWCONTIG, added a column with a data type of tinyint, then ran DBCC SHOWCONTIG again and found that the Scan Density did not change!
Perhaps just give it a go on a test database and let me know if you get different results!
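The test I ran looked roughly like this (a sketch only; the FragTest table, NewFlag column, and row count are made up for the example, and the column is added as NULLable):
CREATE TABLE dbo.FragTest (id int IDENTITY(1,1) NOT NULL, filler char(500) NOT NULL DEFAULT 'x')
-- load enough rows to spread the table across many pages
DECLARE @i int
SET @i = 0
WHILE @i < 100000
BEGIN
    INSERT dbo.FragTest DEFAULT VALUES
    SET @i = @i + 1
END
DBCC SHOWCONTIG ('FragTest')                        -- note the Scan Density
ALTER TABLE dbo.FragTest ADD NewFlag tinyint NULL   -- metadata-only change, no rows touched
DBCC SHOWCONTIG ('FragTest')                        -- Scan Density should be unchanged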
July 2, 2004 at 6:13 am
I think that, unless you take the following into consideration:
- system objects
- fixed-length data types
no fragmentation should occur in the database from adding columns to the table.
Remember as well that fragmentation only occurs when adding or deleting data.
Max
July 2, 2004 at 9:22 am
Thanks for the input. My concern here is Disk I/O. I'll read the BOL and do some more testing.
Don
http://www.biadvantage.com
July 2, 2004 at 3:07 pm
Hi everyone,
I read the BOL entry on DBCC SHOWCONTIG; it mentions how to defragment a fragmented index, but it does not mention how to defrag a fragmented table. What are some of the ways to defrag a table? Thanks in advance.
July 4, 2004 at 8:37 pm
July 5, 2004 at 3:53 pm
My opinion, though, is that adding a column to a table might cause fragmentation. If your pages are full enough, the new data has to fit on the page that the row belongs to, and guess what happens when the row no longer fits on the page? ... a SPLIT. If, on top of that, your table is large enough, you will get many splits, and with a lot of splits ... you guessed right >> FRAGMENTATION is happening.
It is true that in this case it might not cause a problem, but again, with usage everything changes.
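To picture it (a rough sketch reusing the made-up FragTest/NewFlag names from the test post above): the ALTER on its own is only a metadata change, but as soon as you actually populate the column, every row grows, and on pages that are already nearly full that growth is what forces the splits.
-- Sketch: fill in the new column on an already-full table, then compare SHOWCONTIG
UPDATE dbo.FragTest
SET    NewFlag = 1            -- each row grows by roughly a byte once the value is physically stored
DBCC SHOWCONTIG ('FragTest')  -- compare Scan Density and Avg. Page Density with the earlier run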
HTH
* Noel
July 6, 2004 at 10:37 am
To defrag a table (data), you simply defrag (or rebuild) the clustered index. If your table doesn't have a clustered index, you can still defrag it by creating a clustered index on the table. You can delete the index afterwards if you wish.
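On SQL Server 2000 that works out to something like this (a sketch; the CIX_Invoice_Details index name and the InvoiceID column are placeholders, not taken from the thread):
-- Option 1: rebuild an existing clustered index in place
DBCC DBREINDEX ('dbo.Invoice_Details', 'CIX_Invoice_Details', 90)
-- Option 2: defragment the leaf level of the existing index online (0 = current database)
DBCC INDEXDEFRAG (0, Invoice_Details, CIX_Invoice_Details)
-- Option 3: the table in this thread is a heap (index ID 0), so create a clustered
-- index to rebuild the data in order, then drop it again if you don't want to keep it
CREATE CLUSTERED INDEX CIX_Invoice_Details ON dbo.Invoice_Details (InvoiceID)
DROP INDEX Invoice_Details.CIX_Invoice_Details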
Steve