asynchronous execution

  • SQL Server 2000

    We receive data via a stored procedure with an XML stream as a parameter from an external system (BizTalk) into our central interface database. The external system receives status information when the data have been successfully transferred. There are two time-consuming processes that should be executed immediately after the transfer from the external system is complete. The external system should not wait until this further processing is done.

    My first approach was a trigger that fires when the status is set to "Available for further processing". This trigger calls a procedure that does the further processing. Unfortunately, because triggers execute synchronously within the calling transaction, the procedure that is called from the external system waits until all processing is done before it terminates and sends its status information back to BizTalk.

    How can I achieve asynchronous processing immediately after the initial transfer of data is done? Our infrastructure does not allow web services. Scheduled jobs are not an option because we expect a very high frequency.

  • My approach would be to create a semaphore table and a scheduled Job that processes all rows, using the semaphore table to communicate asynchronously. Your app will return as soon as it has inserted or updated a row in the semaphore table, and the Job will process as many rows as exist in the semaphore table. The latency would be the interval between scheduled Job runs.
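    A minimal sketch of that pattern might look like the following. All table, column, and procedure names here are illustrative (they do not come from the thread), and the actual processing call is only a placeholder comment:

    ```sql
    -- Hypothetical semaphore table; names are illustrative.
    CREATE TABLE dbo.ProcessingQueue (
        queue_id    INT IDENTITY(1, 1) PRIMARY KEY,
        idi_id      INT      NOT NULL,                 -- key of the row to process
        status      CHAR(1)  NOT NULL DEFAULT 'N',     -- N = new, D = done
        inserted_at DATETIME NOT NULL DEFAULT GETDATE()
    )
    GO

    -- Called by the receiving procedure: returns immediately after the insert.
    CREATE PROCEDURE dbo.QueueForProcessing
        @idi_id INT
    AS
        INSERT INTO dbo.ProcessingQueue (idi_id) VALUES (@idi_id)
    GO

    -- Run by the scheduled Job: drains everything currently queued.
    CREATE PROCEDURE dbo.ProcessQueuedRows
    AS
        DECLARE @idi_id INT

        SELECT @idi_id = MIN(idi_id)
        FROM dbo.ProcessingQueue
        WHERE status = 'N'

        WHILE @idi_id IS NOT NULL
        BEGIN
            -- EXEC dbo.DoTimeConsumingWork @idi_id   -- the actual processing
            UPDATE dbo.ProcessingQueue SET status = 'D' WHERE idi_id = @idi_id

            SELECT @idi_id = MIN(idi_id)
            FROM dbo.ProcessingQueue
            WHERE status = 'N'
        END
    GO
    ```

    The caller's only cost is a single INSERT, so it returns almost immediately; the scheduled Job picks up whatever has accumulated on its next run.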

    I have used tables with no RI (referential integrity) to allow TRUNCATE TABLE to reset the identity value, and used the INSERT / DELETE approach with additional stage tables, so there can be coordinated multi-threaded (staggered Jobs) support.

    If designed properly, this is also a great way to implement set-based processing from a procedural requirement. Example: take a dynamic "setup" of a task / location requirement and procedurally insert rows into the stage tables (this could include related tables for sub-tasks, milestones, etc.) from the "scheduler".

    Create multiple Jobs with staggered schedules that process this data using set operations (SELECT / process / DELETE from the stage tables), reporting back to the "scheduler" via changes in the semaphore table.
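    One of the staggered Job steps could be sketched roughly like this. The tables, columns, and the batch size are all assumptions for illustration; the claim-by-SPID idea is just one way to keep concurrent Jobs from processing the same rows:

    ```sql
    -- One Job's step: claim a batch, process it set-based, then delete it.
    BEGIN TRANSACTION

    -- Claim up to 100 unclaimed stage rows for this worker
    -- (SET ROWCOUNT limits the UPDATE on SQL Server 2000).
    SET ROWCOUNT 100
    UPDATE dbo.StageTable
    SET worker_id = @@SPID
    WHERE worker_id IS NULL
    SET ROWCOUNT 0

    -- Process the claimed rows as a set, e.g. into the target table.
    INSERT INTO dbo.TargetTable (idi_id, payload)
    SELECT idi_id, payload
    FROM dbo.StageTable
    WHERE worker_id = @@SPID

    -- Report completion back to the "scheduler" via the semaphore table.
    UPDATE s
    SET s.status = 'D'
    FROM dbo.SemaphoreTable s
    JOIN dbo.StageTable st ON st.idi_id = s.idi_id
    WHERE st.worker_id = @@SPID

    -- Clean up the claimed batch.
    DELETE FROM dbo.StageTable WHERE worker_id = @@SPID

    COMMIT TRANSACTION
    ```

    Each Job processes only the rows it claimed, so several staggered Jobs can work the same stage table without coordination beyond the claim column.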

    There are many ways to skin the cat so to speak.

    Andy

  • Hi Andy, thanks for your reply. In the meantime I found a solution that works well, but it involves more objects and processes, so I think I will try your approach as well.

    My current approach: the procedure that is called from the external system creates and executes a Job of type 'CmdExec'. A unique name is generated from the identity value (idi_id) of the monitor table. The command is 'start e:\APPS\FPS\Common\Batch\transfer2local.cmd ' + CONVERT(VARCHAR, @rn_idi_id). The Job is started immediately after creation. The transfer2local.cmd command file contains an osql call to the stored procedure that does the actual processing. After a successful transfer the Job is deleted.
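    For reference, creating and starting such a one-off CmdExec Job with the msdb procedures available in SQL Server 2000 might look roughly like this. The command line comes from the post; the Job name, the example idi_id value, and the use of @delete_level to make the Job self-deleting are assumptions:

    ```sql
    -- Sketch: create, start, and auto-delete a one-off CmdExec Job.
    DECLARE @rn_idi_id INT, @job_name SYSNAME, @cmd VARCHAR(500)

    SET @rn_idi_id = 4711   -- identity value from the monitor table (example)
    SET @job_name = 'transfer2local_' + CONVERT(VARCHAR, @rn_idi_id)
    SET @cmd = 'e:\APPS\FPS\Common\Batch\transfer2local.cmd '
               + CONVERT(VARCHAR, @rn_idi_id)

    EXEC msdb.dbo.sp_add_job
        @job_name = @job_name,
        @delete_level = 3                 -- delete the Job when it completes

    EXEC msdb.dbo.sp_add_jobstep
        @job_name = @job_name,
        @step_name = 'run transfer2local',
        @subsystem = 'CmdExec',
        @command = @cmd

    EXEC msdb.dbo.sp_add_jobserver
        @job_name = @job_name             -- target the local server

    EXEC msdb.dbo.sp_start_job @job_name = @job_name
    ```

    sp_start_job returns as soon as the Job is queued, so the calling procedure can send its status back to BizTalk without waiting for the processing to finish.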

    It works well, but the infrastructure team has not seen it yet. I am quite sure there will be concerns about the cmd/osql call. To be honest, I have them as well: if the osql call fails, for whatever reason, there will be dead operating system tasks. Your approach works with SQL Server only, which seems the better way.

    Thanks again

    Michael
