November 2, 2007 at 2:50 pm
I have a table with a varbinary(8000) column; the table was about 13 GB, and I altered the column to varbinary(max). After the ALTER statement completed, the table was about 21 GB. I thought varbinary(max) only used the space for the actual length of the data plus 2 bytes, so I'm a little confused as to why the table grew so much. Can someone explain that or point me in a good direction to read up on it?
Thanks!
Wendy Schuman
November 2, 2007 at 8:19 pm
What are you using to determine the size of the table?
--Jeff Moden
Change is inevitable... Change for the better is not.
November 4, 2007 at 10:01 am
It's possible that the alter added extents and the space-usage metadata hasn't been properly adjusted. You might try updating the usage with DBCC UPDATEUSAGE.
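A minimal sketch of that, assuming a database named MyDB and a table named dbo.MyTable (substitute your own names):

```sql
-- Correct the page/row counts recorded in the catalog views,
-- which sp_spaceused and the properties dialog report from.
DBCC UPDATEUSAGE ('MyDB', 'dbo.MyTable') WITH COUNT_ROWS;

-- Then re-check the reported size.
EXEC sp_spaceused 'dbo.MyTable';
```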
November 5, 2007 at 8:25 am
I determined the size of the table by checking the data space used in the Table Properties dialog. Also, while the ALTER COLUMN was executing, I watched the free space on the disk get lower and lower.
I tried DBCC UPDATEUSAGE, but it didn't change anything. Any other ideas?
Thanks for your help.
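For a more granular view than the properties dialog, this sketch breaks the table's space down by allocation-unit type (table name dbo.MyTable is an assumption; substitute your own). The LOB_DATA rows show space used by varbinary(max) values stored off-row:

```sql
-- Space per allocation unit type for one table (SQL Server 2005+).
SELECT  au.type_desc,
        SUM(au.total_pages) * 8 / 1024 AS size_mb
FROM    sys.partitions AS p
JOIN    sys.allocation_units AS au
        ON (au.type IN (1, 3) AND au.container_id = p.hobt_id)      -- in-row / row-overflow
        OR (au.type = 2       AND au.container_id = p.partition_id) -- LOB data
WHERE   p.object_id = OBJECT_ID('dbo.MyTable')
GROUP BY au.type_desc;
```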
Wendy Schuman
November 6, 2007 at 5:06 am
Fragmentation may also be the cause.
Try rebuilding the table's clustered index.
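A sketch of that, assuming a clustered index named PK_MyTable on dbo.MyTable (substitute your own names; syntax is for SQL Server 2005 and later):

```sql
-- Rebuild just the clustered index...
ALTER INDEX PK_MyTable ON dbo.MyTable REBUILD;

-- ...or rebuild every index on the table.
ALTER INDEX ALL ON dbo.MyTable REBUILD;
```

A rebuild may also reclaim the space the original varbinary(8000) column left behind, since an ALTER COLUMN type change does not necessarily rewrite the rows in place; the extra 8 GB you saw could well be that leftover space.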
Johan
Learn to play, play to learn !
Dont drive faster than your guardian angel can fly ...
but keeping both feet on the ground wont get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀