Stored Procedure Performance

  • cmartel 20772 wrote:

    In an earlier post, Scott suggested adding BEGIN TRANSACTION and COMMIT TRANSACTION. As expected, this only makes the procedure run much slower.

    I suggested adding them around BLOCKS of UPDATEs/DELETEs (such as 500 or 1000 at a time) rather than having each statement auto-commit as its own transaction.  Did you do the blocks part, or just throw a BEGIN/COMMIT around every single statement, which would indeed slow it down?

     

    SQL DBA, SQL Server MVP (07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".

  • I want to thank you all for your suggestions. This case had me revisit some concepts of SQL Server I usually ignore or avoid.

    My procedure declares a lot of table variables, fills them as various records are processed, and updates tables that have one row per day.

    I was using table variables because early performance tests showed they were faster than temporary tables in most of the cases I encountered. In this particular case, replacing some (but not all) of the table variables with temporary tables made the procedure run faster (but slower in other cases); a sketch of the two alternatives is below.
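
    For anyone comparing the two, here is a minimal sketch of both alternatives (the table name and columns are made up for illustration). A table variable carries no statistics, so the optimizer assumes it holds very few rows; a temporary table gets statistics and can take extra indexes after creation.

    -- Hypothetical structure: one row per day.
    -- Table variable: no statistics; the optimizer assumes very few rows.
    DECLARE @DailyTotals TABLE (
        day_date date NOT NULL PRIMARY KEY,
        total    int  NOT NULL
    );

    -- Temporary table: has statistics and supports additional indexes,
    -- which can help when the row counts get large.
    CREATE TABLE #DailyTotals (
        day_date date NOT NULL PRIMARY KEY,
        total    int  NOT NULL
    );
    CREATE INDEX IX_DailyTotals_total ON #DailyTotals (total);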

    I finally started using memory-optimized table variables everywhere and voilà, I get the best performance every time. A minimal setup sketch is below.
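
    In case it helps someone else: a memory-optimized table variable needs a memory-optimized table type, and the database needs a MEMORY_OPTIMIZED_DATA filegroup. A minimal sketch, with a made-up type name and columns (indexes must be declared inline in the type, since they cannot be added afterward):

    -- One-time setup: requires a database with a MEMORY_OPTIMIZED_DATA filegroup.
    CREATE TYPE dbo.DailyTotalsType AS TABLE (
        day_date date NOT NULL PRIMARY KEY NONCLUSTERED,
        total    int  NOT NULL,
        INDEX ix_total NONCLUSTERED (total)  -- extra index; must be declared inline
    ) WITH (MEMORY_OPTIMIZED = ON);
    GO

    -- The procedure then declares its variables from the type.
    DECLARE @DailyTotals dbo.DailyTotalsType;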

    This does not remove the need to follow best practices, like having the right indexes. Interestingly, as Scott suggested, putting blocks of statements between BEGIN TRANSACTION and COMMIT TRANSACTION has a positive effect on performance: running the procedure against all my test databases went from 23:56 to 23:20.

    Thanks again.

  • Sorry, I wasn't being clear before about how to use transactions here.

    If there is no BEGIN TRAN / COMMIT TRAN, then every modification (DELETE / INSERT / UPDATE) will be its own transaction.  For a large number of modifications, that's very inefficient.  My suggestion was to use explicit transactions so that you could reduce the total number of transactions significantly by batching several modifications into one transaction.

    For example:

    DECLARE @how_many_rows_to_commit int;
    DECLARE @modify_count int;

    SET @how_many_rows_to_commit = 500;
    SET @modify_count = 0;

    BEGIN TRANSACTION;

    -- Cursor loop; the OPEN and FETCH statements are elided here.
    WHILE @@FETCH_STATUS = 0
    BEGIN

    ...

    UPDATE table ...;
    SET @modify_count = @modify_count + 1;

    -- For conditional modifications, increment the counter inside the IF
    -- so it only advances when the statement actually runs.
    IF (condition)
    BEGIN
        UPDATE table ...;
        SET @modify_count = @modify_count + 1;
    END;

    IF (condition)
    BEGIN
        UPDATE table ...;
        SET @modify_count = @modify_count + 1;
    END;

    UPDATE table ...;
    SET @modify_count = @modify_count + 1;

    DELETE FROM table ...;
    SET @modify_count = @modify_count + 1;

    IF (condition)
    BEGIN
        UPDATE table ...;
        SET @modify_count = @modify_count + 1;
    END;

    IF (condition)
    BEGIN
        DELETE FROM table ...;
        SET @modify_count = @modify_count + 1;
    END;

    UPDATE table ...;
    SET @modify_count = @modify_count + 1;

    ...

    -- Once enough modifications have accumulated, commit them as one
    -- batch, start a new transaction, and reset the counter.
    IF @modify_count >= @how_many_rows_to_commit
    BEGIN
        COMMIT TRANSACTION;
        BEGIN TRANSACTION;
        SET @modify_count = 0;
    END;

    END;

    -- Commit whatever remains in the final, partial batch.
    IF XACT_STATE() <> 0
        COMMIT TRANSACTION;

     


