April 11, 2006 at 5:04 pm
Hi guys
We have a table A with an Id column of type INT. There are currently about 260,705,246 rows in the table, growing at roughly 100,000,000 rows a month, so (assuming Id roughly tracks the row count) we will hit the signed INT ceiling of 2,147,483,647 in well under two years. I was asked to study the options for converting the column from INT to BIGINT. Conditions: (1) downtime must be minimal; (2) the method must be 100% reliable, with no unknown factors going into the rollout. I did my research and came up with a few methods:
(1) ALTER TABLE (we tried this and it failed).
(2) Create a new BIGINT column in the same table, repopulate it from the existing column using a job, drop the old column, rename the new column back to the old name (or leave it as is), and recreate the indexes/FK constraints on the new column.
(3) Create a new table B with a BIGINT Id, BCP the data from A into B, create all the indexes, drop the old table, and rename the new table to the old name (see the sketch below).
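Roughly, method 3 would look something like this. This is a sketch only: the database, server, and file names are made up, and B would of course get all of A's remaining columns.

    CREATE TABLE dbo.B (
        Id BIGINT NOT NULL
        -- ...all the other columns from A, unchanged...
    );

    -- From a command prompt: -n = native format, -T = trusted connection,
    -- -b = rows per committed batch; add -E on the way in if Id is an IDENTITY
    bcp MyDb.dbo.A out A.dat -n -T -S MyServer
    bcp MyDb.dbo.B in A.dat -n -T -S MyServer -b 50000

    -- Recreate indexes/FK constraints on B, then swap names in the outage window
    EXEC sp_rename 'dbo.A', 'A_old';
    EXEC sp_rename 'dbo.B', 'A';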
I am suggesting method 3 as it requires the fewest operations: fewer changes means fewer places to screw up, and it is less complicated overall.
If anyone who has done this before can provide some feedback, do's and don'ts, etc., I would appreciate it.
******************
Dinakar Nethi
Life is short. Enjoy it.
******************
April 12, 2006 at 2:02 am
Option 3 is your best bet, especially as you can batch the data transfers to avoid overflow of your transaction log.
When I move data this way I usually put the database into simple recovery mode for the duration of the transfer; the sketch below shows the sort of loop I mean.
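Something along these lines (illustrative only; MyDb, the backup path, and the 100,000-row batch size are placeholders, and it assumes Id is an ever-increasing key):

    -- Keep each committed batch small so the log stays manageable
    ALTER DATABASE MyDb SET RECOVERY SIMPLE;

    DECLARE @LastId BIGINT
    SET @LastId = 0

    WHILE 1 = 1
    BEGIN
        INSERT INTO dbo.B
        SELECT TOP 100000 *
        FROM dbo.A
        WHERE Id > @LastId
        ORDER BY Id

        IF @@ROWCOUNT = 0 BREAK   -- nothing left to copy

        SELECT @LastId = MAX(Id) FROM dbo.B
    END

    -- Switch back and take a full backup once the copy is done
    ALTER DATABASE MyDb SET RECOVERY FULL;
    BACKUP DATABASE MyDb TO DISK = 'C:\Backups\MyDb.bak';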
The GrumpyOldDBA
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/