Read data row-wise with multiple transactions

  • Hi All,

    I have two tables, customerorder and customerorderdest. I wrote an SP which loads data from customerorder into customerorderdest. Currently I am using SELECT TOP 1, but I want it to be dynamic: the first transaction should fetch the first row, the second transaction the second row, and so on.

    Also, let me know whether there is any locking effect, since all 6 transactions use the same tables. Please find the attachment to create the tables with data and the SP.
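
    To show roughly what I mean, here is a simplified sketch of the "each transaction claims a different row" idea. The column names (OrderID, ProcessStatus) and status values are made up, since my real schema differs; the pattern is the common UPDLOCK/READPAST queue-claim approach:

        BEGIN TRANSACTION;

        DECLARE @OrderID INT;

        -- READPAST skips rows already locked by other sessions,
        -- so each concurrent call claims a different row
        SELECT TOP (1) @OrderID = OrderID
        FROM dbo.customerorder WITH (UPDLOCK, ROWLOCK, READPAST)
        WHERE ProcessStatus = 'NEW'
        ORDER BY OrderID;

        IF @OrderID IS NOT NULL
        BEGIN
            -- copy the claimed row into the destination table
            INSERT INTO dbo.customerorderdest (OrderID /*, other columns */)
            SELECT OrderID /*, other columns */
            FROM dbo.customerorder
            WHERE OrderID = @OrderID;

            -- mark the source row so it is not picked up again
            UPDATE dbo.customerorder
            SET ProcessStatus = 'LOADED'
            WHERE OrderID = @OrderID;
        END;

        COMMIT TRANSACTION;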

    Thanks,

    Lat

  • Lat

    Why are you doing this row-by-row? Why not update everything in one go?

    John

    I have more than 1 million records in total, and my SP will be used by a .NET application with 15-20 threads. The application reads one row and updates its status based on some parameters.

  • A million rows is not large, and processing row by row is inefficient.

    Don't do it from .NET (even slower); do whatever you need as a single transaction in T-SQL.

    If the workload needs parallelism, SQL Server can apply it automatically.
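
    For illustration only, the entire load can be a single set-based statement along these lines (the column names are placeholders; substitute the real schema):

        -- one set-based insert replaces a million single-row round trips
        INSERT INTO dbo.customerorderdest (OrderID /*, other columns */)
        SELECT co.OrderID /*, other columns */
        FROM dbo.customerorder AS co
        WHERE NOT EXISTS (SELECT 1
                          FROM dbo.customerorderdest AS d
                          WHERE d.OrderID = co.OrderID);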

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
    Understood, but the .NET application needs to update the status back to the table, so I am worried about locking issues.

  • I strongly recommend you reconsider the entire process.

    Leave .net out, other than to call the stored procedure. Write the load process in T-SQL in a single batch. Do the same with whatever status the app would have sent.

    That way you don't need to manage multi-threading in the app, you don't need to worry about locking, and the process will likely run faster than one-row-at-a-time roundtrip processing could ever have done.
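
    As a rough sketch, the status pass can also be a single statement rather than a call per row (again, the column names and the status rule are invented placeholders):

        -- derive and apply the status for every loaded row in one go
        UPDATE d
        SET    d.ProcessStatus = CASE WHEN d.SomeCheckValue > 0 THEN 'OK' ELSE 'FAILED' END
        FROM   dbo.customerorderdest AS d
        WHERE  d.ProcessStatus = 'LOADED';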

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • GilaMonster (9/21/2016)


    I strongly recommend you reconsider the entire process.

    Leave .net out, other than to call the stored procedure. Write the load process in T-SQL in a single batch. Do the same with whatever status the app would have sent.

    That way you don't need to manage multi-threading in the app, you don't need to worry about locking, and the process will likely run faster than one-row-at-a-time roundtrip processing could ever have done.

    I agree. And if locking proves to be a problem after doing this, please open another thread to discuss the changes needed to your proc to resolve it.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Here's another vote for leaving .NET entirely out of it. You will get enormous throughput increases by just having SQL do this in a procedure. Running 20 instances of any application all of which are inserting/updating/deleting one row at a time is like a death by a thousand cuts. Slow and painful.

    __________________________________________________

    Against stupidity the gods themselves contend in vain. -- Friedrich Schiller
    Stop, children, what's that sound? Everybody look what's going down. -- Stephen Stills

  • Piling on with Gail, Phil and Dixie, let me also cast my vote for having .NET call the stored procedure and doing the work there as a set. Row-based processing doesn't scale well at all, while set-based processing does. Don't be intimidated by 1M rows and think you have to engineer a complicated, multi-threaded .NET application. Let SQL Server do what it does best.
