Best way to purge data from large tables

  • I am looking for the best way to delete a large amount of data from very large tables. What is the best way to do it without hurting performance, avoiding locking, and keeping the operation safe?

  • For example, you have table A with approximately 1,000,000,000 rows and you have to purge around 10,000,000 rows based on the column LastupdatedDate. The table has 58 columns in total.

  • If your table is partitioned (only in Enterprise Edition if I'm not mistaken), you can easily switch out an older partition. This goes really fast - see the sketch just below.
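
    A rough sketch of the partition-switch approach; the table name, staging table and partition number are placeholders for illustration, not your actual objects:

        -- Switch the oldest partition out to an empty staging table that has an
        -- identical structure and sits on the same filegroup as that partition.
        ALTER TABLE dbo.TableA
            SWITCH PARTITION 1 TO dbo.TableA_Staging;

        -- The switched-out rows now live in the staging table; truncating it
        -- is a fast, minimally logged operation.
        TRUNCATE TABLE dbo.TableA_Staging;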

    Otherwise, you can use the DELETE statement. Use a loop to execute the DELETE statement in batches - for example 1000 rows at a time - to keep your transactions small; a sketch follows below.
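
    Something along these lines; the batch size, cutoff date and table name are only examples, so adjust them to your purge criteria:

        DECLARE @BatchSize int = 1000;
        DECLARE @Cutoff datetime = '20200101';  -- example cutoff on LastupdatedDate
        DECLARE @Rows int = 1;

        WHILE @Rows > 0
        BEGIN
            -- Delete at most @BatchSize matching rows per iteration to keep
            -- each transaction (and the locks it holds) small.
            DELETE TOP (@BatchSize)
            FROM dbo.TableA
            WHERE LastupdatedDate < @Cutoff;

            SET @Rows = @@ROWCOUNT;  -- 0 once nothing is left to delete

            -- Optionally pause between batches to give other workloads room:
            -- WAITFOR DELAY '00:00:01';
        END;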

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Thanks Koen, that's what I was thinking. I wanted to confirm whether that's the best way to do it.
