March 10, 2004 at 2:45 pm
Hi-
We have a database that has grown too large for comfort (SQL Server 2000): 66 GB on a 68 GB drive. One of the columns in the main table is a blob holding all the data that makes up the other columns. We have decided that we can do without the blob, in order to buy ourselves more time to deal with the expanding data.
That one table is almost 48 GB, and I figure the blob column is at least half of that.
My question is: can I drop that column without logging it? Or perhaps the correct question is: does dropping a column get logged? I fear we would run out of room for the log file if the operation has to be logged.
Thank you in advance!
March 10, 2004 at 6:41 pm
Not much help, but off the top of my head...
You probably don't need to get rid of the column, just null the values. The image datatype is stored as a pointer to separate pages, so the update itself should be fairly cheap. As far as logging goes, it should be minimal, but I'm not sure, as it's been a few versions since I got into that level of detail: operations on image data used not to be logged at all.
If they are logged and you have a problem, you can always run a process sometime when no users are on: a WHILE loop that does SET ROWCOUNT 1000, updates that column to NULL, and truncates the log between batches. Then do a complete backup afterwards.
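A rough sketch of that loop (table, column, and database names here are placeholders; BACKUP LOG ... WITH TRUNCATE_ONLY was still available in SQL Server 2000):

```sql
-- Sketch only: BigTable, BlobCol, and MyDatabase are placeholder names.
SET ROWCOUNT 1000

WHILE 1 = 1
BEGIN
    UPDATE BigTable
    SET BlobCol = NULL
    WHERE BlobCol IS NOT NULL

    IF @@ROWCOUNT = 0 BREAK

    -- Throw away the inactive portion of the log between batches
    BACKUP LOG MyDatabase WITH TRUNCATE_ONLY
END

SET ROWCOUNT 0   -- always reset, or later statements stay capped at 1000 rows
```

Follow up with a full database backup, since truncating the log breaks the log backup chain.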
March 11, 2004 at 3:04 am
SELECT col1, col2, ...   -- every column except the blob
INTO xxx
FROM Original_table;
DROP TABLE Original_table;
EXEC sp_rename 'xxx', 'Original_table';   -- RENAME is not T-SQL; use sp_rename
Cheers......
March 11, 2004 at 7:22 am
March 11, 2004 at 8:16 am
phanikk,
This won't work: there isn't enough space on the drive to hold a second copy of the table.
Kari, you might implement a combination of the two solutions. First, NULL the values of the blob then use phanikk's solution to get rid of the column.
After nulling the blob you could also use ALTER TABLE, or even EM (Enterprise Manager), to get rid of the column graphically.
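The ALTER TABLE route is a one-liner (placeholder names again); dropping an image column is essentially a metadata change once the values are NULL:

```sql
-- BigTable / BlobCol are placeholders for the real table and blob column.
ALTER TABLE BigTable DROP COLUMN BlobCol
```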
March 11, 2004 at 7:29 pm
I would imagine that nulling the column will result in heaps of logging for rollback purposes.
I could be wrong.... wouldn't be the first time.
If other solutions fail, and provided you have another drive with sufficient space, you can try the following:
1) Backup the database
2) bcp out from a view of the table that excludes the text column - in clustered index order
3) drop/create the table without the text column, create the clustered index
4) bcp the data back in - it should fit ok, but bulk-logged recovery mode is an option
5) create non-clustered indexes
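As a sketch of steps 2-4 (all names, paths, and the column list are placeholders; bcp's -n is native format and -T a trusted connection):

```sql
-- 2) A view that excludes the text column, so bcp never touches it.
--    KeyCol/Col1/Col2 stand in for the real column list.
CREATE VIEW dbo.BigTable_NoBlob
AS
SELECT KeyCol, Col1, Col2   -- every column except the text column
FROM dbo.BigTable
GO
-- From the command line, export in clustered index order:
--   bcp MyDatabase.dbo.BigTable_NoBlob out D:\staging\BigTable.dat -n -T
-- 3) DROP/CREATE the table without the text column, create the clustered index.
-- 4) Import, ideally with the database in bulk-logged recovery:
--   bcp MyDatabase.dbo.BigTable in D:\staging\BigTable.dat -n -T
-- 5) Re-create the non-clustered indexes last.
```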
Cheers,
- Mark
March 12, 2004 at 6:27 am
If disk space is the only concern, I would do the following:
Switch DB recovery mode to SIMPLE (just while the process is running)
Update the column to NULL in a loop (or even a cursor), working on a subset of the table at a time (maybe 1000 rows or so, if possible)
Drop column
Switch back to the original recovery mode.
I know it is going to be a performance-hurting exercise, but I would treat it as an exceptional case.
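Those steps might look like this in T-SQL (placeholder names throughout; run it during a maintenance window):

```sql
-- MyDatabase / BigTable / BlobCol are placeholders.
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE

DECLARE @rows int
SET @rows = 1
WHILE @rows > 0
BEGIN
    SET ROWCOUNT 1000
    UPDATE BigTable SET BlobCol = NULL WHERE BlobCol IS NOT NULL
    SET @rows = @@ROWCOUNT
    SET ROWCOUNT 0
END

ALTER TABLE BigTable DROP COLUMN BlobCol

ALTER DATABASE MyDatabase SET RECOVERY FULL   -- or whatever the original mode was
```

Take a full backup afterwards: switching out of SIMPLE restarts the log backup chain.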
March 13, 2004 at 7:23 am
Kari,
Since the datatype is a blob, you can use the WRITETEXT statement to clear the column values (WRITETEXT is non-logged by default) via a loop or cursor. Then use DBCC UPDATEUSAGE('databasename') to correct the space-usage figures, and you could use DBCC SHRINKDATABASE as well to reclaim the space.
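The cleanup commands might look like this ('MyDatabase' is a placeholder; the 10 in SHRINKDATABASE is the target percentage of free space to leave):

```sql
-- After the blob values have been cleared:
DBCC UPDATEUSAGE ('MyDatabase')          -- correct space-usage figures
DBCC SHRINKDATABASE ('MyDatabase', 10)   -- reclaim space, leaving ~10% free
```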
Digesh
March 15, 2004 at 9:16 am