data update strategy

  • Hello - I'm trying to think of some good strategies for making data updates in Prod. Transactions provide the ability to commit and roll back as needed. Is there a way to use normal SQL constructs to do something like the following:

    1. Execute a SQL update in prod. The data should be updated, but ideally the update should be easy to roll back.

    2. Have a QA resource run some queries to validate that the data was updated as expected.

    3. If the data updates pass QA validation, then the release manager executes some type of commit to permanently apply all of the data updates that have been made.

    Is there a way to implement this type of scenario somewhat easily? I think that normally if you begin a transaction, update some data, and then run a select before committing, you lose the updates that were made after BEGIN TRANSACTION. It's been a while since I've worked with database transactions, so my memory of the process is vague, and there may also have been advances in newer versions of SQL Server?
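
    A minimal sketch of steps 1 through 3 as a single transaction - dbo.Customers and the specific update are made up for illustration:

      BEGIN TRANSACTION;

      -- Step 1: the data fix (hypothetical table and change)
      UPDATE dbo.Customers
      SET    Region = 'EMEA'
      WHERE  Country = 'DE';

      -- Step 2: validation queries; within the same session a SELECT
      -- does see the uncommitted change, so it can be checked here
      SELECT CustomerID, Country, Region
      FROM   dbo.Customers
      WHERE  Country = 'DE';

      -- Step 3: the release manager's decision
      COMMIT TRANSACTION;      -- if QA signs off
      -- ROLLBACK TRANSACTION; -- if QA rejects the change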

  • You could use the OUTPUT clause and store the old and new values in a review table using the inserted and deleted pseudo-tables.

    Ideally, this would be part of the update script, assuming the update is done using a stored procedure only. Otherwise you could also use an AFTER UPDATE trigger.

    This would allow you to perform the update and still be able to roll it back later on if needed.
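
    A minimal sketch of that approach, assuming the same hypothetical dbo.Customers table as above and an illustrative review table (all names are made up):

      -- Review table holding the before/after values for QA to check
      CREATE TABLE dbo.CustomerUpdateReview
      (
          ReviewID   int IDENTITY(1,1) PRIMARY KEY,
          CustomerID int          NOT NULL,
          OldRegion  varchar(50)  NULL,
          NewRegion  varchar(50)  NULL,
          UpdatedAt  datetime2    NOT NULL DEFAULT SYSUTCDATETIME()
      );

      -- The data fix, capturing old and new values via the OUTPUT clause
      UPDATE dbo.Customers
      SET    Region = 'EMEA'
      OUTPUT deleted.CustomerID,
             deleted.Region,      -- old value
             inserted.Region      -- new value
      INTO   dbo.CustomerUpdateReview (CustomerID, OldRegion, NewRegion)
      WHERE  Country = 'DE';

      -- A later rollback can then be driven from the review table, e.g.
      -- UPDATE c
      -- SET    c.Region = r.OldRegion
      -- FROM   dbo.Customers AS c
      -- JOIN   dbo.CustomerUpdateReview AS r ON r.CustomerID = c.CustomerID;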

    But you'll need to think about a larger variety of scenarios. A few examples:

    a) A rollback driven by a business rule is itself just another update, so it will end up back in the QA process. Alternatively, you could add a column to your table indicating whether a change is a "user update" or a "QA rollback" and react appropriately (a sketch of that follows after these examples).

    b) How would you deal with three updates to a monitored row where the one in the middle is rejected by QA? What should the expected value in the column be - the one before the second update, the one after the third update, or "it depends"?
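
    For a), the "change type" idea could hang off the same kind of review table. A hypothetical continuation of the sketch above - names are illustrative:

      -- Flag whether a captured change was the data fix itself or a
      -- QA-driven rollback, so the QA queries can filter accordingly.
      -- Existing and future fix rows pick up the default 'user update'.
      ALTER TABLE dbo.CustomerUpdateReview
          ADD ChangeType varchar(20) NOT NULL
              CONSTRAINT DF_CustomerUpdateReview_ChangeType DEFAULT ('user update');

      -- A QA-driven rollback is then recorded with its own type.
      -- The old values are staged first so the review table is not both
      -- read and written by the same statement.
      DECLARE @rollback TABLE (CustomerID int PRIMARY KEY, OldRegion varchar(50));

      INSERT INTO @rollback (CustomerID, OldRegion)
      SELECT CustomerID, OldRegion
      FROM   dbo.CustomerUpdateReview
      WHERE  ChangeType = 'user update';

      UPDATE c
      SET    c.Region = rb.OldRegion
      OUTPUT deleted.CustomerID, deleted.Region, inserted.Region, 'QA rollback'
      INTO   dbo.CustomerUpdateReview (CustomerID, OldRegion, NewRegion, ChangeType)
      FROM   dbo.Customers AS c
      JOIN   @rollback AS rb
             ON rb.CustomerID = c.CustomerID;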



    Lutz
    A pessimist is an optimist with experience.


  • Not to mention that adding all those checks into a single transaction increases the time the transaction will run, which increases the likelihood of blocking and deadlocks.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning
