Sending batches of SPs at one time from Application

  • Not all developers are evil, which is why I felt compelled to post; that sentiment had started to shine through a bit too brightly, and it was unwarranted 🙂

    <my_take>

    Best options in order from best to worst:

    1) Upgrade to SQL 2008 (probably not realistic) and create a proc that accepts a table-valued parameter (TVP) and returns one resultset. (very strong on performance, concurrency, and data integrity)

    2) Create a proc that works with SQL 2005 and accepts a (fake) array of TransactionIDs as an XML document or delimited string, returning a single resultset. (strong on performance, very strong on concurrency and data integrity)

    3) Leave the batching as-is, but ask developers to execute each batch within the SERIALIZABLE or SNAPSHOT isolation level. (OK on performance, not as strong on concurrency, very strong on data integrity)

    4) Leave the batching as-is and accept the possibility that proc execution 10 in a batch may work off data that is not consistent with proc execution 9 or earlier within the same batch, potentially allowing the application to deliver a skewed view of reality. (OK on performance, OK on concurrency, potentially poor data integrity depending on other database activity)

    5) Send each proc call individually (one call per round trip) and run within the SERIALIZABLE or SNAPSHOT isolation level as set up on the .NET connection. (poor performance and concurrency, strong data integrity)

    6) Send each proc call individually and run under the default READ COMMITTED isolation level. (poor on performance, OK on concurrency, potentially poor data integrity depending on other database activity)

    </my_take>
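
    Options 1 and 2 might look roughly like the T-SQL sketch below. This is a minimal illustration, not the actual schema from this thread: the table dbo.Transactions and the columns TransactionID, Amount, and Status are assumed names, and the XML shape is just one possible convention.

    ```sql
    -- Hypothetical schema for illustration only: dbo.Transactions,
    -- TransactionID, Amount, and Status are assumed names.

    -- Option 1 (SQL 2008+): a table-valued parameter.
    CREATE TYPE dbo.TransactionIdList AS TABLE (TransactionID INT PRIMARY KEY);
    GO
    CREATE PROCEDURE dbo.GetTransactionBatch
        @Ids dbo.TransactionIdList READONLY
    AS
    BEGIN
        SET NOCOUNT ON;
        -- One statement, one resultset: the whole batch is read by a single
        -- query rather than N separate proc executions.
        SELECT t.TransactionID, t.Amount, t.Status
        FROM dbo.Transactions AS t
        JOIN @Ids AS i ON i.TransactionID = t.TransactionID;
    END
    GO

    -- Option 2 (SQL 2005): pass the "array" as an XML parameter and shred it
    -- with the nodes()/value() methods.
    CREATE PROCEDURE dbo.GetTransactionBatchXml
        @IdList XML  -- e.g. '<ids><id>1</id><id>2</id></ids>'
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT t.TransactionID, t.Amount, t.Status
        FROM dbo.Transactions AS t
        JOIN (
            SELECT x.n.value('.', 'INT') AS TransactionID
            FROM @IdList.nodes('/ids/id') AS x(n)
        ) AS i ON i.TransactionID = t.TransactionID;
    END
    ```

    Either way the application makes one round trip and the server returns one consistent resultset, which is where the performance and data-integrity advantages over per-ID proc calls come from.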

    If you have a moment down the line please post the resulting direction you choose to move.

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato
