Deleting a large number of records (20 million rows) FAST

  • Unfortunately, the software designer needs to take into account how many rows their queries can or will affect. SQL Server, in order to preserve data integrity, puts every record deleted in one batch into a single transaction, so that it can put everything back the way it was if there is a failure on the 2nd record or the 5 millionth record.

    The software designer has to take this sort of thing into account and make the delete (or other action) do its work in batches, balancing the need for transactional integrity against the consumption of transaction log space.

    Many software designers don't realize that they can batch the work to limit transaction log growth while still preserving transactional integrity, by wrapping each batch in an explicit transaction (see the sketch at the end of this post).

    I'd be looking for alternatives to using that software for the deletes, then yelling at the software company.

    Something else that may help: changing the recovery model. Note that a DELETE itself is fully logged in every recovery model, so bulk-logged won't shrink the log records for the delete; the real benefit of batching is that the log space can be reused between batches, via frequent log backups (FULL or BULK_LOGGED recovery) or checkpoint truncation (SIMPLE recovery).
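    As a minimal sketch of the batched-delete idea (assuming SQL Server 2005 or later; the table dbo.BigTable, the batch size, and the WHERE predicate are hypothetical placeholders, not from the original post):

    DECLARE @BatchSize int;
    DECLARE @Rows int;
    SET @BatchSize = 10000;
    SET @Rows = 1;

    WHILE @Rows > 0
    BEGIN
        -- Each batch is its own explicit transaction: a failure only
        -- rolls back the current batch, not the whole 20 million rows.
        BEGIN TRANSACTION;

        DELETE TOP (@BatchSize)
        FROM dbo.BigTable                      -- hypothetical table
        WHERE CreatedDate < '20100101';        -- hypothetical predicate

        -- Capture @@ROWCOUNT immediately after the DELETE, before any
        -- other statement resets it; the loop exits when a batch
        -- deletes zero rows.
        SET @Rows = @@ROWCOUNT;

        COMMIT TRANSACTION;

        -- In FULL/BULK_LOGGED recovery, take log backups between
        -- batches so the committed log space can be reused; in SIMPLE
        -- recovery it truncates automatically at checkpoint.
    END

    Keeping the batch size modest (tens of thousands of rows) also shortens lock duration per batch, which reduces blocking for other sessions while the purge runs.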

    Chris
