VB App vs Stored Procedure

  • I have been given the assignment of re-designing our billing process. I was hoping for some advice on the best solution.

    Currently a rather large stored procedure (>1K lines) runs each night from the SQL Agent. The SP takes nearly 2 hours to complete.

    Essentially it performs the following functions:

    1.) Creates a temp table

    2.) Creates a cursor over a several-million-row table, filtered on the previous day's date (probably 10-20K rows)

    3.) Converts this data to a billable format

    4.) Adds it to the temp table

    Repeats steps 2-4 two more times. The cursors differ in only a single field; the remainder of each cursor is identical.

    Once this is done, all info in the temp table is added to a billing DB.

    Is this the most efficient way of doing this? I was thinking about writing a VB app with some ODBC to let another machine crunch the numbers instead of my production SQL Server, and also about returning only one larger cursor instead of three.

    THX

  • I think understanding in detail what the app does, and why, should be step one.

    Then decide between optimizing the current SP and rewriting it. For example, do not mix DML and DDL, and, as you mentioned, get rid of some of the cursors.

    You may win some time with faster number crunching on another box, but you trade it for relatively slow network transfer versus doing everything locally. On the other hand, if the SQL Server box is already maxed out, staying local may not be the fastest alternative either.
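    A minimal sketch of the DDL/DML point (table and column names here are invented, not from the real SP): on SQL 7/2000, interleaving CREATE TABLE with later statements forces recompiles mid-procedure, so create all temporary objects first.

    ```sql
    -- Hypothetical illustration; names are placeholders, not from the real SP.
    -- Step 1: all DDL up front.
    CREATE TABLE #billing_stage (
        InterchangeKEY int NOT NULL,
        NbrBytes       int NULL
    )

    -- Step 2: from here on, pure DML, so the procedure compiles once.
    INSERT INTO #billing_stage (InterchangeKEY, NbrBytes)
    SELECT InterchangeKEY, NbrBytes
    FROM   Interchange_tb
    ```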

     

  • I feel your SP consists of a lot of unwanted code. As 5409045121009 said, don't mix DDL into the SP, since each DDL statement forces a recompile. Split the SP into two or three smaller SPs, and based on how the cursor is used, decide whether you can use a table variable instead (if you are using the cursor only to stage data and later insert it into the temp table, go for a table variable and delete each row after it is inserted, so memory is freed with each delete). Moving the process to another machine is not a bad idea, but again the transfer may take some time.

    Also, have you tried using the Profiler and the Index Tuning Wizard to improve the performance of this SP? If you have not, try this before doing anything else (strongly recommended).
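    A sketch of the table-variable idea above (hedged: table variables require SQL 2000, so this only applies if the servers move off SQL 7, and the column names are invented):

    ```sql
    -- Hypothetical sketch; requires SQL 2000+ (table variables do not exist in SQL 7).
    DECLARE @work TABLE (
        InterchangeKEY int NOT NULL PRIMARY KEY,
        NbrBytes       int NULL
    )

    INSERT INTO @work (InterchangeKEY, NbrBytes)
    SELECT InterchangeKEY, NbrBytes
    FROM   Interchange_tb

    -- Process one row at a time, deleting each row after use so the
    -- variable shrinks as the work completes.
    DECLARE @key int
    WHILE EXISTS (SELECT * FROM @work)
    BEGIN
        SELECT TOP 1 @key = InterchangeKEY FROM @work
        -- ...billing logic for @key goes here...
        DELETE FROM @work WHERE InterchangeKEY = @key
    END
    ```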

    /-Jeswanth

    --------------------------------

  • UNLESS you are doing something in the cursor which requires the results of one record to affect the calculations of the next (or a subsequent) record... I'd seriously advise you to dump the cursor!

    Moving to a set-based solution would be far better. The performance difference between set-based processing and cursor-based processing is, 99.99% of the time, phenomenal.

     

    Post your SP, DDL for the tables involved, and some sample input plus (matching) output data, and you may do far better than by tweaking the cursor or the location of processing: in the short run by solving this problem, and in the long run by learning a far better solution technique.

     

    A cursor is like using tweezers to fill a cup with sugar grain by grain... set-based is like using a spoon to do the same thing!
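    As a hedged illustration of the set-based point (the staging table name is invented; this is not the poster's actual rewrite): a cursor that accumulates byte totals row by row can usually collapse into a single INSERT...SELECT with a GROUP BY.

    ```sql
    -- Hypothetical example using table names from the posted SP.
    -- Cursor style would fetch each row and add to a running total;
    -- set-based style does the whole batch in one statement.
    INSERT INTO #billing_stage (InterchangeKEY, TotalBytes)
    SELECT x.GentranKEY, SUM(x.ByteCount)
    FROM   ExtDataXref_tb x
    WHERE  x.GentranType = 1  /* 1 = Document, 0 = Interchange */
    GROUP BY x.GentranKEY
    ```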

  • I would retain the function as a stored procedure. Moving processing outside SQL Server will introduce more windows for error, especially if you're moving large result sets to and from the SQL Server.

    It sounds like your procedure should be completely rewritten, though, and possibly split into several procs. Ditch inefficient cursors in favour of more intuitive set-based SQL.

     

  • Chris,

    As the others have said the first step is to understand why they did what they did and then decide if you can make changes.

    Next, look at the code and see if it can be broken down into smaller stored procedures that you can call from a controlling or master stored procedure. Check the app to make sure that the routines aren't doing something that is already coded. I have seen this before, where inserting a record has its own stored procedure and the import routine has recreated that same logic; the problem is that if you change the logic, you now have to change it in two spots.

    Also, sometimes you might be able to use DTS to bring in the data instead of BCP. This allows you to do some transformation via VBScript and other DTS tasks before the data hits any production tables. I like DTS as it allows me to control the workflow (both failure and success) and gives a lot better error checking.

    Since the process creates a temp table each time, why not just make it permanent but truncate it when you are done processing the nightly upload, or before you begin it? This removes the DDL steps from the stored procedure and gives some consistency. I typically name these types of temporary tables with a 'ztmp' prefix.
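    A sketch of the permanent-staging-table idea (the column list here is invented; the real table would mirror whatever ##billingInformation holds):

    ```sql
    -- One-time setup, with hypothetical columns:
    CREATE TABLE ztmpBillingInformation (
        InterchangeKEY  int          NOT NULL,
        TransactionType varchar(255) NULL
    )
    GO

    -- The nightly procedure then contains no DDL at all: just clear and reload.
    TRUNCATE TABLE ztmpBillingInformation
    INSERT INTO ztmpBillingInformation (InterchangeKEY, TransactionType)
    SELECT InterchangeKey, TransactionSetID
    FROM   Track_tb
    ```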

    Good luck!

    Shawn

  • Chris,

    I think the stored procedure is very inefficient in whatever it is doing. This usually happens when the functionality grows over time and a lot of patches are put on it. I think it is the right time to consider refactoring it. I would not suggest moving it to VB or any other front end, unless you are considering print formatting.

    My favourite way of writing business logic in the back end is through views and UDFs: a nice view with appropriate joins and predicates will take care of a lot of the logic, and a smart UDF will take care of the rest. Consider dumping the cursors.

    If you think a part of the code requires some tuning or rewriting, please post it and I will see if I can find a more efficient way of doing it.
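    For instance (illustrative only; the names are invented, and scalar UDFs require SQL 2000), the agency-code mapping from the posted SP could live in a UDF while the joins live in a view:

    ```sql
    -- Hypothetical sketch; requires SQL 2000 for user-defined functions.
    CREATE FUNCTION dbo.fn_AgencyText (@agency int)
    RETURNS varchar(10)
    AS
    BEGIN
        RETURN CASE @agency WHEN 88 THEN 'X12'
                            WHEN 69 THEN 'EDIFACT'
                            WHEN 85 THEN 'UCS'
                            ELSE 'OTHER'
               END
    END
    GO

    -- A view holding the joins and predicates:
    CREATE VIEW dbo.vw_BillableInterchange
    AS
    SELECT i.InterchangeKEY, i.NbrBytes, dbo.fn_AgencyText(i.Agency) AS AgencyText
    FROM   Interchange_tb i
    LEFT JOIN Track_tb t ON t.InterchangeKey = i.InterchangeKEY
    WHERE  ISNULL(t.Direction, 0) = 0
    ```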

    Good Luck!

    Anand.

  • Thank you for all the responses thus far. We have 3 production servers that have proprietary data-translation software installed. Each server also has SQL 7 installed.

    Nightly, a billing process on each machine runs to calculate charges for our clients.

    I want to wrap this lengthy SP in a VB app on a fourth (non-production) machine to:

    A.) Free up resources on the production machines (they need to be available 24/7)

    B.) Consolidate the process, with better error handling and alert capabilities

    Although I cannot post the entire SP, here is the primary query, which is repeated three times (three times on each server). The only difference is the "Direction" value.

    DECLARE interchange_cursor_i2a CURSOR FOR

      SELECT Interchange_tb.InterchangeKEY,

       Interchange_tb.NbrBytes,

       Interchange_tb.ControlNumber,

       Interchange_tb.Agency,

       Track_tb.TransactionSetID

      FROM Interchange_tb

      LEFT JOIN Track_tb ON Track_tb.InterchangeKey = Interchange_tb.InterchangeKEY

      WHERE (Track_tb.Direction = 0 OR Track_tb.Direction is NULL)

      AND (Interchange_tb.TimeCreated BETWEEN @begdati AND @enddati)

      AND  (Interchange_tb.Agency = 69 /* EDIFACT */ OR Interchange_tb.Agency = 88 /* X12 */ OR Interchange_tb.Agency = 85 /* UCS */)

    OPEN interchange_cursor_i2a

    /*** For each entry in the cursor, load the DB values into the Variables ***/

     FETCH NEXT FROM interchange_cursor_i2a

     INTO @InterchangeKEY, @InterchangeBytes, @ControlNumber, @InterchangeAgency, @TransactionType

     

     SELECT @InterchangeKEY, @InterchangeBytes, @ControlNumber, @InterchangeAgency, @TransactionType

     

    WHILE @@FETCH_STATUS = 0

     BEGIN

      

      SET @WhichBytes = ''

      SET @ApplicationBytes = 0

      SET @MostCount = 0

      SET @PartnerEDI = ''

      SET @Partner1 = ''

      SET @Partner2 = ''

      SET @DateStamp = NULL

      

    /*** Set the variable for applicationbytes equal to the ByteCount where the Direction = 0 and the interchange key

      is equal to the variable @interchangeKey and Gentran  ***/

     SELECT @ApplicationBytes = SUM(ExtdataXref_tb.ByteCount)

       FROM Document_tb

       RIGHT JOIN Track_tb ON

         Document_tb.DocumentKEY = Track_tb.DocumentKEY

       RIGHT JOIN Interchange_tb ON

         Track_tb.InterchangeKEY = Interchange_tb.InterchangeKEY

       RIGHT JOIN ExtDataXref_tb ON ExtDataXref_tb.GentranKEY = Document_tb.DocumentKEY

       WHERE Track_tb.Direction = 0

         AND Interchange_tb.InterchangeKEY = @InterchangeKEY

         AND ExtDataXref_tb.GentranType = 1 /* 1 = Document, 0 = Interchange */

     

     SELECT @ApplicationBytes

    /*** Get the value of the PartnerEDI ***/

      SELECT @PartnerEdi = Partner_tb.EDICode,

         @DateStamp = DATEADD(ss, Interchange_tb.TimeCreated, @constantdate),

         @PartnerKEY = Partner_tb.PartnerKEY

       FROM Partner_tb

       RIGHT JOIN Interchange_tb ON Interchange_tb.PartnerKEY = Partner_tb.PartnerKEY

       WHERE Interchange_tb.InterchangeKEY = @InterchangeKEY

    /*** Get the Partner1 and Partner2 Variables ***/

       SELECT @Partner1 = Lookup_tb.Text1, @Partner2 = Lookup_tb.Text2 FROM Lookup_tb

       RIGHT JOIN Interchange_tb ON Interchange_tb.PartnerKEY = Lookup_tb.PartnerKEY

       WHERE Lookup_tb.Item = 'BILLING' AND Interchange_tb.InterchangeKEY = @InterchangeKEY

      

     SELECT @Partner1, @Partner2

     

    /*** Determine which byte count is larger, the Application or the Interchange bytes ***/

     IF (SELECT @InterchangeBytes) > (SELECT @ApplicationBytes)

      BEGIN

       SET @WhichBytes = 'Interchange'

       SELECT @MostCOunt = @InterChangeBytes

      END

     ELSE

      BEGIN

       SET @WhichBytes = 'Application'

       SELECT @Mostcount = @ApplicationBytes

      END

     IF (@ApplicationBytes is NULL)

      BEGIN

       SET @WhichBytes = 'Interchange'

       SELECT @Mostcount = @InterChangeBytes

       SET @ApplicationBytes = 0

      END

     SET @InterchangeAgencyText = 'OTHER'

     

    /*** Make a note of the InterChangeAgency type ***/

     IF(@InterchangeAgency = 88)

      BEGIN

       SET @InterchangeAgencyText = 'X12'

      END

     IF(@InterchangeAgency = 69)

      BEGIN

       SET @InterchangeAgencyText = 'EDIFACT'

      END

     IF(@interchangeAgency = 85)

      BEGIN

       SET @InterchangeAgencyText = 'UCS'

      END

     SELECT @WhichBytes, @MostCount, @InterchangeAgencyText

    /*** Begin the routine for beginning and ending Space Trimming ***/

     WHILE SUBSTRING(@TransactionType, 1,1) = ' '

      BEGIN

       SET @TransactionType = SUBSTRING(@TransactionType, 2, LEN(@TransactionType) +1)

      END

     SET @TransactionType = SUBSTRING(@TransactionType, 1, LEN(@TransactionType))

    /*** Check to make sure there is not already an entry for this InterchangeKey in the TEMP DB ***/

     IF EXISTS (SELECT * FROM ##billingInformation WHERE InterchangeKEY = @InterchangeKEY)

       BEGIN

         IF NOT EXISTS (SELECT * FROM ##billingInformation

           WHERE InterchangeKEY = @InterchangeKEY AND

           TransactionType LIKE '%:' + @TransactionType + ':%')

         BEGIN      

           SELECT @TransactionTypeTEMP = TransactionType

           FROM ##billingInformation WHERE InterchangeKEY = @InterchangeKEY

          

           SET @TransactionTypeTEMP = @TransactionTypeTEMP + @TransactionType + ':'

          

           UPDATE ##billingInformation SET TransactionType = @TransactionTypeTEMP

           WHERE InterchangeKEY = @InterchangeKEY

         END

       END

      ELSE

       BEGIN

         INSERT INTO ##billingInformation VALUES (

          @InterchangeKEY,

          @InterchangeBytes,

          @InterchangeAgencyText,

          @ApplicationBytes,

          @WhichBytes,

          @MostCount,

          @PartnerEdi,

          @ControlNumber,

          @PartnerKEY,

          @Partner1,

          @Partner2,

          @DateStamp,

          ':' + @TransactionType + ':',

          'I2A'

      )

       END

     FETCH NEXT FROM interchange_cursor_i2a

     INTO @InterchangeKEY, @InterchangeBytes, @ControlNumber, @InterchangeAgency, @TransactionType

    END /* While @@Fetch_Status */

     CLOSE interchange_cursor_i2a

     DEALLOCATE interchange_cursor_i2a
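    Since the three cursor definitions differ only in the Direction value, one hedged option for the "one larger cursor" idea is a single query that carries Direction as a column. This is a sketch only: the variable types and the downstream handling of each Direction value are assumptions to verify against the real SP.

    ```sql
    -- Sketch: one pass instead of three per-Direction cursors.
    DECLARE @begdati int, @enddati int   -- types assumed from the TimeCreated usage

    SELECT i.InterchangeKEY,
           i.NbrBytes,
           i.ControlNumber,
           i.Agency,
           t.TransactionSetID,
           ISNULL(t.Direction, 0) AS Direction   -- the only field that varied
    FROM   Interchange_tb i
    LEFT JOIN Track_tb t ON t.InterchangeKey = i.InterchangeKEY
    WHERE  i.TimeCreated BETWEEN @begdati AND @enddati
      AND  i.Agency IN (69 /* EDIFACT */, 88 /* X12 */, 85 /* UCS */)
    ```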

  • Chris,

    A few thoughts.

    --Paul

    -Can the cursor be optimized, e.g. declared as FAST_FORWARD?

    -Can a difference in indexes or statistics/hints help?

    -If the cursor is NOT dynamic or updateable, then there is a chance you may be able to eliminate it, perhaps in favour of an UPDATE for example.

    -In a non-prod environment, run your billing process with Profiler and PerfMon turned on for a full capture. Quite often you will find that one or two steps are taking 70-90% of the time.

    -Run the Index Tuning Wizard on the Profiler output and see what it suggests.

    -Sometimes changing the temp table to a static one can also help, depending on how it is used and the indexes/hints created. I once converted a temp table to a static one, front-loaded 4M rows into it, and created a few indexes and hints, which resulted in a dramatic performance improvement.

    -Even compiled VB is not generally known for processing performance, whereas C or C++ is; C# is also generally considered faster than compiled VB. Of course, if VB skills are what is prevalent, then this is a moot point.

  • (Track_tb.Direction = 0 OR Track_tb.Direction is NULL)

    -- can be changed to

    IsNull(Track_tb.Direction,0)=0

    ---------------------------------------------------------------------------------------------------

    WHILE SUBSTRING(@TransactionType, 1,1) = ' '

      BEGIN

       SET @TransactionType = SUBSTRING(@TransactionType, 2, LEN(@TransactionType) +1)

      END

    -- Rather use

    SET @TransactionType = LTRIM(@TransactionType)

    ---------------------------------------------------------------------------------------------------

    -- Can this be changed from

           SELECT @TransactionTypeTEMP = TransactionType

           FROM ##billingInformation WHERE InterchangeKEY = @InterchangeKEY

          

           SET @TransactionTypeTEMP = @TransactionTypeTEMP + @TransactionType + ':'

          

           UPDATE ##billingInformation SET TransactionType = @TransactionTypeTEMP

           WHERE InterchangeKEY = @InterchangeKEY

    -- to

           UPDATE ##billingInformation SET TransactionType = TransactionType + @TransactionType + ':'

           WHERE InterchangeKEY = @InterchangeKEY

    ---------------------------------------------------------------------------------------------------

    Bit strained for time, but I am pretty sure some of the eagle-eyed people on this forum will be able to rewrite this into two or three update/insert statements. Senior moment from me (sic), but spend time on understanding what the code is trying to achieve and then convert it into a set-based solution. From my own experience, I cut the runtime down to a third of the VB app's runtime, and at last count it was running 510 days without any further maintenance done on it.

    Just promise them a virtual six pack!

     

  • Chris,

    I went through your SP, joined all the tables, and created a view instead.

    Wherever the temp table is used, read from this view instead, with the appropriate date parameter.

    This view should speed up your process. The only thing missing in this code is the appropriate GROUP BY clause; I don't know your table keys, so add the GROUP BY clause appropriate for your result. I also found a very crude way of doing RTRIM(LTRIM()) in your stored procedure; if you just replace that piece alone, you should see about a 20% performance increase. If you have questions about the view, please post again and I will try to help you.

    create view vw_Billing

    as

    SELECT

     Interchange_tb.InterchangeKEY,

        Interchange_tb.NbrBytes,

        Interchange_tb.ControlNumber,

        Interchange_tb.Agency,

        RTRIM(LTRIM(Track_tb.TransactionSetID)) TransactionSetID,

     isnull(SUM(ExtdataXref_tb.ByteCount),0) ApplicationBytes,

     Partner_tb.EDICode PartnerEdi,

          DATEADD(ss, Interchange_tb.TimeCreated, @constantdate) DateStamp, /* NB: a view cannot reference a variable; substitute the SP's constant date literal here */

          Partner_tb.PartnerKEY PartnerKEY,

     Lookup_tb.Text1 Partner1,

     Lookup_tb.Text2 Partner2,

            case when SUM(ExtdataXref_tb.ByteCount) is null

                      then 'Interchange'

          WHEN Interchange_tb.NbrBytes > SUM(ExtdataXref_tb.ByteCount)

                     then 'Interchange'

                 ELSE 'Application'

                 end  WhichBytes,

            case when SUM(ExtdataXref_tb.ByteCount) is null

                      then Interchange_tb.NbrBytes

          when Interchange_tb.NbrBytes > SUM(ExtdataXref_tb.ByteCount)

                      then Interchange_tb.NbrBytes

                 ELSE SUM(ExtdataXref_tb.ByteCount)

                 end  MostCOunt,

     case when Interchange_tb.Agency = 88

                    then 'X12'

          when Interchange_tb.Agency = 69

                    then 'EDIFACT'

          when Interchange_tb.Agency = 85

                    then 'UCS'

          else 'OTHER' /* merged here; two columns cannot share the InterchangeAgencyText alias */

                 end InterchangeAgencyText

    FROM      Interchange_tb

    LEFT JOIN   Track_tb ON

         Track_tb.InterchangeKey = Interchange_tb.InterchangeKEY

    LEFT JOIN   Document_tb ON

         Document_tb.DocumentKEY = Track_tb.DocumentKEY

    LEFT JOIN  ExtDataXref_tb ON

         ExtDataXref_tb.GentranKEY = Document_tb.DocumentKEY

    LEFT JOIN  Partner_tb ON

         Interchange_tb.PartnerKEY = Partner_tb.PartnerKEY

    LEFT JOIN  Lookup_tb ON

         Interchange_tb.PartnerKEY = Lookup_tb.PartnerKEY

    WHERE (Track_tb.Direction = 0 OR Track_tb.Direction is NULL)

    -- Wherever this temp table is used, read from this view instead with the

    -- appropriate date parameter

    --AND   (Interchange_tb.TimeCreated BETWEEN @begdati AND @enddati)

    AND   (Interchange_tb.Agency = 69 /* EDIFACT */ OR Interchange_tb.Agency = 88 /* X12 */ OR Interchange_tb.Agency = 85 /* UCS */)

    AND   ExtDataXref_tb.GentranType = 1 /* 1 = Document, 0 = Interchange */

    AND   Lookup_tb.Item = 'BILLING'

    GROUP BY

    -- put appropriate GROUP BY CLAUSE HERE. Need to see the key structure to find out the

    -- appropriate group by clause

     

  • Hi,

    Your program looks like the following:

    You are getting a lot of records into a cursor, and for each row you process it and insert it into another temp table.

    I suggest the following:

    1. Instead of cursors, use tables.

     Always take the TOP 1 row of the table, and as soon as you have inserted the values into the destination, delete the current row. The size of the table will shrink each time, which will save memory.

    2. I don't know whether you are using SET NOCOUNT ON; if you are not, use it to suppress the output from each SELECT statement.

    3. Why are you using IF (SELECT @InterchangeBytes) > (SELECT @ApplicationBytes)?

     Why don't you simply use @Var1 > @Var2?

    4. Avoid SELECT * statements; use SELECT fld1, fld2, etc., even if you are returning all fields.

    5. Most important: create a flow chart for the program and compare it against the code, so that you can decide what is needed and what is not.

    6. Use the Index Tuning Wizard once you are done with this.

    Hope it will help you.

    -/Jeswanth

     

    --------------------------------
