December 1, 2010 at 8:22 pm
There is a database with a table of more than 200 GB. I was told one column on that table takes more than 100 GB. I need to create space for a test environment. I dropped the column and recreated it, but no space was freed. Then I shrank the mdf and ldf files; still the same. Even shrinking the database didn't help. I am not sure what I am missing.
i) Is there a way to find out the size of a column inside a table?
ii) How can I free up space on that table?
Thanks
December 1, 2010 at 9:34 pm
Try
DBCC CLEANTABLE
Hope this helps you.
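A minimal sketch of the call, using the database and table names mentioned later in this thread as placeholders (substitute your own):

```sql
-- Reclaim space left behind by dropped variable-length columns.
-- Database and table names here are placeholders.
DBCC CLEANTABLE (N'Test_db', N'dbo.Mytable');
```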
December 2, 2010 at 2:20 am
Try rebuilding the clustered index.
By dropping the column you've created some free space on each page. Shrink only deallocates completely free pages.
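A sketch of the rebuild, assuming a hypothetical table name:

```sql
-- Rebuilding the clustered index rewrites the data pages,
-- compacting out the per-page free space left by the dropped column.
ALTER INDEX ALL ON dbo.Mytable REBUILD;
```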
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
December 2, 2010 at 8:15 am
GilaMonster (12/2/2010)
Try rebuilding the clustered index. By dropping the column you've created some free space on each page. Shrink only deallocates completely free pages.
Forgot to mention, I tried that one too.. :(
December 2, 2010 at 8:34 am
How do you know that the individual column used 100GB?
It could be that deleting the column freed a certain amount of space on each data page, but that the table definition is such that no other data could occupy the space that was freed. Please will you post a CREATE TABLE statement for your table? How many rows does it contain?
John
December 2, 2010 at 3:49 pm
John Mitchell-245523 (12/2/2010)
How do you know that the individual column used 100GB? It could be that deleting the column freed a certain amount of space on each data page, but that the table definition is such that no other data could occupy the space that was freed. Please will you post a CREATE TABLE statement for your table? How many rows does it contain?
John
That was my other question: how can I find out how much space is used by a specific column in a table?
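One way to measure a single column's footprint is to sum DATALENGTH over all rows. This is a full scan (expensive on a 200 GB table, so run it off-hours); the column and table names below are placeholders:

```sql
-- Approximate space consumed by one column, in MB.
-- DATALENGTH also works on text/ntext/image columns.
SELECT SUM(CAST(DATALENGTH(MyColumn) AS bigint)) / 1048576.0 AS ColumnMB
FROM dbo.Mytable;
```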
December 2, 2010 at 7:41 pm
Suresh B. (12/1/2010)
Try
DBCC CLEANTABLE
Hope this helps you.
Thanks Suresh. Theoretically it seems like it will work. I ran DBCC CLEANTABLE (Test_db, "dbo.Mytable") and it has been running for 7 hours and is still going. Mytable is 200 GB. I do see a change in I/O from sp_who2. I think this is very slow. Does this work? Thanks
December 2, 2010 at 11:00 pm
sqldba_icon (12/2/2010)
Suresh B. (12/1/2010)
Try
DBCC CLEANTABLE
Hope this helps you.
Thanks Suresh. Theoretically it seems like it will work. I ran DBCC CLEANTABLE (Test_db, "dbo.Mytable") and it has been running for 7 hours and is still going. Mytable is 200 GB. I do see a change in I/O from sp_who2. I think this is very slow. Does this work? Thanks
As a sidebar and especially if you have the Enterprise Edition, you should probably consider partitioning that table.
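A rough sketch of what that might look like, assuming a hypothetical datetime column to partition on (names and boundary values are illustrative only):

```sql
-- Hypothetical sketch: range-partition dbo.Mytable by year.
CREATE PARTITION FUNCTION pf_MytableByYear (datetime)
AS RANGE RIGHT FOR VALUES ('20090101', '20100101');

CREATE PARTITION SCHEME ps_MytableByYear
AS PARTITION pf_MytableByYear ALL TO ([PRIMARY]);

-- Rebuilding the clustered index on the scheme moves the data, e.g.:
-- CREATE CLUSTERED INDEX cix_Mytable ON dbo.Mytable (SomeDateColumn)
--     WITH (DROP_EXISTING = ON) ON ps_MytableByYear (SomeDateColumn);
```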
--Jeff Moden
Change is inevitable... Change for the better is not.
December 3, 2010 at 4:42 am
sqldba_icon (12/2/2010)
John Mitchell-245523 (12/2/2010)
How do you know that the individual column used 100GB? It could be that deleting the column freed a certain amount of space on each data page, but that the table definition is such that no other data could occupy the space that was freed. Please will you post a CREATE TABLE statement for your table? How many rows does it contain?
That was my other question: how can I find out how much space is used by a specific column in a table?
Two methods of finding table size. Column size can only be estimated by the second method (the math version).
First: Go to SSMS. In the Summary page, go to Databases, then open a database. Click Report and choose "Disk Usage". It takes several minutes for this to generate. Once it's done, there will be an option under the graphs called "Disk Space Used by Tables." It will tell you the size of each table, the number of records, etc. I don't know if this report includes Large Objects or Large Values in its estimates, BTW.
Second: By doing math. Searching "data types [SQL Server]" (without the quotes) in Books Online takes you to a page listing every single data type. By clicking through to each data type, you can find the number of bytes it takes. Count the number of columns of each data type you have; for the exact numerics, you can just multiply the number of columns by the number of bytes. For the strings and approximate numerics, you have to actually add up the sizes to figure out the space each column takes. Char values (non-variable strings) are a solid 1-to-1 match. Variable values (anything with "var" in the name) add 2 bytes of overhead per column. That overhead is what SQL Server needs for adjusting the size of the field.
Add those totals together, multiply by the number of rows in the table, and you have your approximate table size. Understand that if you have a lot of variable columns, your number might be over what the table size actually is, but in SQL it's better to overestimate than underestimate.
And remember, Large Object and Large Value data types (Text, NText, VarBinary, and the MAX types) are all stored separately from the main data pages. Books Online can give you details on those. This means the table pages have, for lack of a better term, pointers to this data, and the data isn't really counted in the row size for the table.
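The hand math can be bootstrapped from the catalog views, which report each column's declared maximum byte width (variable-length and LOB columns still need an actual-usage measurement such as DATALENGTH; the table name below is a placeholder):

```sql
-- Declared byte width of each column.
-- max_length is -1 for the MAX/LOB types, which are stored off-row.
SELECT c.name       AS column_name,
       t.name       AS type_name,
       c.max_length
FROM sys.columns AS c
JOIN sys.types   AS t ON c.user_type_id = t.user_type_id
WHERE c.object_id = OBJECT_ID(N'dbo.Mytable')
ORDER BY c.column_id;
```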
December 3, 2010 at 8:19 am
Brandie Tarvin (12/3/2010)
sqldba_icon (12/2/2010)
John Mitchell-245523 (12/2/2010)
How do you know that the individual column used 100GB? It could be that deleting the column freed a certain amount of space on each data page, but that the table definition is such that no other data could occupy the space that was freed. Please will you post a CREATE TABLE statement for your table? How many rows does it contain?
That was my other question: how can I find out how much space is used by a specific column in a table?
Two methods of finding table size. Column size can only be estimated by the second method (the math version).
First: Go to SSMS. In the Summary page, go to Databases, then open a database. Click Report and choose "Disk Usage". It takes several minutes for this to generate. Once it's done, there will be an option under the graphs called "Disk Space Used by Tables." It will tell you the size of each table, the number of records, etc. I don't know if this report includes Large Objects or Large Values in its estimates, BTW.
Second: By doing math. Searching "data types [SQL Server]" (without the quotes) in Books Online takes you to a page listing every single data type. By clicking through to each data type, you can find the number of bytes it takes. Count the number of columns of each data type you have; for the exact numerics, you can just multiply the number of columns by the number of bytes. For the strings and approximate numerics, you have to actually add up the sizes to figure out the space each column takes. Char values (non-variable strings) are a solid 1-to-1 match. Variable values (anything with "var" in the name) add 2 bytes of overhead per column. That overhead is what SQL Server needs for adjusting the size of the field.
Add those totals together, multiply by the number of rows in the table, and you have your approximate table size. Understand that if you have a lot of variable columns, your number might be over what the table size actually is, but in SQL it's better to overestimate than underestimate.
And remember, Large Object and Large Value data types (Text, NText, VarBinary, and the MAX types) are all stored separately from the main data pages. Books Online can give you details on those. This means the table pages have, for lack of a better term, pointers to this data, and the data isn't really counted in the row size for the table.
Brandie, the column I have deleted is an ntext data type. Would DBCC CLEANTABLE still help? If not, how do I clear space? Please advise.
December 3, 2010 at 8:29 am
sqldba_icon (12/3/2010)
Brandie, the column I have deleted is an ntext data type. Would DBCC CLEANTABLE still help? If not, how do I clear space? Please advise.
Do you have a copy of SQL Server Books Online on your PC? If not, I advise downloading a copy from Microsoft. It's a wonderful resource.
Books Online (DBCC CLEANTABLE)
DBCC CLEANTABLE reclaims space after a variable-length column is dropped. A variable-length column can be one of the following data types: varchar, nvarchar, varchar(max), nvarchar(max), varbinary, varbinary(max), text, ntext, image, sql_variant, and xml. The command does not reclaim space after a fixed-length column is dropped.
If the dropped columns were stored in-row, DBCC CLEANTABLE reclaims space from the IN_ROW_DATA allocation unit of the table. If the columns were stored off-row, space is reclaimed from either the ROW_OVERFLOW_DATA or the LOB_DATA allocation unit depending on the data type of the dropped column. If reclaiming space from a ROW_OVERFLOW_DATA or LOB_DATA page results in an empty page, DBCC CLEANTABLE removes the page. For more information about allocation units and data types, see Table and Index Architecture.
How many LOB columns does this table have? If more than just that one, chances are your data pages still have information on them and you can't clean them up.
December 3, 2010 at 8:39 am
I think this is a job for "Captain Obvious"! 😀
How about creating a new table, populating the new table and dropping the old one?
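A rough sketch of that approach, with hypothetical names; indexes, constraints, triggers, and permissions would all need to be recreated by hand, and this should be tested in a non-production environment first:

```sql
-- Copy the surviving columns into a fresh, compactly stored table,
-- then swap it in place of the old one.
SELECT Col1, Col2, Col3          -- list every surviving column explicitly
INTO dbo.Mytable_new
FROM dbo.Mytable;

DROP TABLE dbo.Mytable;
EXEC sp_rename N'dbo.Mytable_new', N'Mytable';
```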
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a test non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
December 3, 2010 at 8:48 am
Paul,
Repopulating a table isn't necessarily that easy. Especially if it has foreign keys, indexes, constraints, etc. If it's a heap, then not so bad. But you're also forgetting the question of whether or not this table is in Production and the company in question is subject to SOX laws (or other laws that make it illegal to mess with the database using such a quick fix and no documentation).
Yes, it's obvious. No, it's not a good idea.
December 3, 2010 at 8:57 am
sqldba_icon (12/3/2010)
Brandie, the column I have deleted is an ntext data type. Would DBCC CLEANTABLE still help? If not, how do I clear space? Please advise.
Yes. DBCC CLEANTABLE works for the ntext data type.
By the way, did the DBCC CLEANTABLE complete?
If the table has a large number of rows, there is an option to specify the number of rows processed per transaction.
For example: DBCC CLEANTABLE (Test_db, "dbo.Mytable", 1000)
Please read the BOL for more information.
December 3, 2010 at 2:13 pm
Brandie Tarvin (12/3/2010)
Repopulating a table isn't necessarily that easy. Especially if it has foreign keys, indexes, constraints, etc. If it's a heap, then not so bad. But you're also forgetting the question of whether or not this table is in Production and the company in question is subject to SOX laws (or other laws that make it illegal to mess with the database using such a quick fix and no documentation).
Brandie,
We do reorg very large tables with FKs, Indexes, Constraints, Triggers, etc., all the time - that's what we call "maintenance".
Last but not least, please point me to the exact SOX paragraph that says you cannot reorg a table - that's rubbish.
_____________________________________
Pablo (Paul) Berzukov
Author of Understanding Database Administration available at Amazon and other bookstores.
Disclaimer: Advice is provided to the best of my knowledge but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a test non-production environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.