Updating a large table with minimal impact on the live application

  • Hi,

    I have a large table with over 5 million records that is being accessed all the time. What are the different ways to update it with minimal user impact?

  • Your question is really vague. Would you please explain in greater detail what you mean by updating the table?

  • Do you mean altering the table structure?

    Sorry for not being clear. I want to update all the rows, which may take quite some time, but many users are accessing the table. Is there a good way to update the table so that users can still access it without any issues? The table is not accessed directly but through a web application.

  • MysteryJimbo (9/7/2012)

    Do you mean altering the table structure?

    No. I meant updating the rows (values) of some columns.

  • ek-822869 (9/7/2012)

    Sorry for not being clear. I want to update all the rows, which may take quite some time, but many users are accessing the table. Is there a good way to update the table so that users can still access it without any issues? The table is not accessed directly but through a web application.

    The best way to do it would be to update the rows iteratively in batches. I would create a working table with all the primary keys and use it to loop through and update the actual table, as sketched below.
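
    Roughly like this; it is only a sketch, and dbo.BigTable, Id, SomeCol and the batch size are placeholder names to adapt to the real schema:

    -- Working table of primary keys still to be processed.
    SELECT Id
    INTO   #WorkQueue
    FROM   dbo.BigTable;

    CREATE CLUSTERED INDEX IX_WorkQueue ON #WorkQueue (Id);

    DECLARE @BatchSize int = 5000;
    DECLARE @Batch TABLE (Id int PRIMARY KEY);

    WHILE 1 = 1
    BEGIN
        DELETE @Batch;

        -- Take the next batch of keys off the queue.
        DELETE TOP (@BatchSize) FROM #WorkQueue
        OUTPUT deleted.Id INTO @Batch;

        IF @@ROWCOUNT = 0 BREAK;         -- nothing left to do

        -- Keep each transaction short so other sessions are blocked only briefly.
        BEGIN TRANSACTION;

        UPDATE t
        SET    t.SomeCol = t.SomeCol     -- put the real new value here
        FROM   dbo.BigTable AS t
               JOIN @Batch AS b ON b.Id = t.Id;

        COMMIT TRANSACTION;

        -- Optional: WAITFOR DELAY '00:00:01' here to give other sessions some room.
    END;

    Each batch commits on its own, so no single transaction holds locks for long and the application keeps working between batches.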

  • Of course my suggestion assumes the rows can be in different states whilst the update is ongoing.

    If the rows need to be updated simultaneously, I would do this (a rough sketch follows the list):

    1. Create a duplicate table

    2. Create an insert/update/delete trigger on the source table to maintain data in the duplicate table

    3. Backfill the data in the duplicate

    4. Update the duplicate table

    5. Use sp_rename on the old table and the duplicate to make the switch
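
    A skeleton of what that looks like, again with placeholder names (dbo.BigTable with Id and SomeCol, a duplicate dbo.BigTable_New, trigger trg_BigTable_Sync); adapt it to the real schema:

    -- 1. Duplicate table: an empty structural copy, then re-add the key.
    SELECT *
    INTO   dbo.BigTable_New
    FROM   dbo.BigTable
    WHERE  1 = 0;

    ALTER TABLE dbo.BigTable_New ADD CONSTRAINT PK_BigTable_New PRIMARY KEY (Id);
    GO

    -- 2. Trigger keeps the duplicate in step with changes the application makes.
    CREATE TRIGGER trg_BigTable_Sync
    ON dbo.BigTable
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        DELETE n
        FROM   dbo.BigTable_New AS n
               JOIN deleted AS d ON d.Id = n.Id;

        INSERT dbo.BigTable_New (Id, SomeCol)   -- list the real columns here
        SELECT Id, SomeCol
        FROM   inserted;
    END;
    GO

    -- 3. Backfill the duplicate with the rows it does not have yet.
    -- 4. Run the big update against dbo.BigTable_New at leisure.

    -- 5. Swap the tables over.
    BEGIN TRANSACTION;
        EXEC sp_rename 'dbo.BigTable', 'BigTable_Old';
        EXEC sp_rename 'dbo.BigTable_New', 'BigTable';
    COMMIT TRANSACTION;

    -- Afterwards drop trg_BigTable_Sync and dbo.BigTable_Old once everything checks out.

    The rename at the end is only a metadata change, so the application sees a very short pause instead of one long blocking update.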

  • MysteryJimbo (9/7/2012)

    The best way to do it would be to update the rows iteratively in batches. I would create a working table with all the primary keys and use it to loop through and update the actual table.

    MysteryJimbo (9/7/2012)

    Of course my suggestion assumes the rows can be in different states whilst the update is ongoing.

    If the rows need to be updated simultaneously, I would do this:

    1. Create a duplicate table

    2. Create an insert/update/delete trigger on the source table to maintain data in the duplicate table

    3. Backfill the data in the duplicate

    4. Update the duplicate table

    5. Use sp_rename on the old table and the duplicate to make the switch

    Thank you
