November 20, 2007 at 11:00 am
Hi Guys,
I am currently setting up a new automated process to pick up and process new orders from my orders table.
Currently I have a DTS job that runs every 5 minutes and processes them in batches of 200, putting them into the sales and customer tables, etc.
I now want to change this process and implement some sort of concurrent approach to picking up and processing the records.
What I want to do is split up the process so that multiple orders can be processed at the same time, rather than serially. My question is: what is the best way to set this process up?
My current thinking is a DTS job that runs every minute and constantly picks up new orders from the orders table. If there are no new orders it sleeps for a few seconds, wakes up, looks again for unprocessed orders, and repeats. This doesn't seem right to me, though, as it puts a heavy load on the server.
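The polling idea described above could be sketched as a single long-running Agent job step rather than a job that fires every minute. This is only an illustrative sketch; the table, column, and procedure names (`dbo.Orders`, `Processed`, `dbo.usp_ProcessNewOrders`) are hypothetical stand-ins for whatever the real schema uses.

```sql
-- Hedged sketch: one long-running polling loop instead of a once-a-minute job.
WHILE 1 = 1
BEGIN
    IF EXISTS (SELECT 1 FROM dbo.Orders WHERE Processed = 0)
        EXEC dbo.usp_ProcessNewOrders;   -- hypothetical processing procedure
    ELSE
        WAITFOR DELAY '00:00:05';        -- sleep 5 seconds when there is no work
END;
```

`WAITFOR DELAY` keeps the idle loop cheap, since the connection simply sleeps instead of spinning against the table.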
Are there any other roads that I could go down to resolve this issue?
Thanks in advance for your help.
M
November 20, 2007 at 1:29 pm
We have a similar need, but chose a completely different approach. The batch processor is a C#.NET app running as a service on separate machines (actually a set of 5 VMs each running concurrently). That way we have a whole raft of programmatic ways we can control the timing and load balancing. The configuration info (other than the connection string) is stored in a separate database. There is also a little web app that we can use to configure the service, such as the polling interval, pause and restart processing, etc.
November 20, 2007 at 1:37 pm
loach (11/20/2007)
We have a similar need, but chose a completely different approach. The batch processor is a C#.NET app running as a service on separate machines (actually a set of 5 VMs each running concurrently). That way we have a whole raft of programmatic ways we can control the timing and load balancing. The configuration info (other than the connection string) is stored in a separate database. There is also a little web app that we can use to configure the service, such as the polling interval, pause and restart processing, etc.
Totally agree. Client-side programming gives you much more flexibility and scalability for these sorts of operations.
* Noel
November 20, 2007 at 1:47 pm
Have you heard of Service Broker? Basically, it lets you send messages to a queue, and an activated stored procedure then processes each message as it is received. Your application can format and send an XML message that sits in the queue until it is processed. I believe this will give you the behavior you are looking for.
http://msdn2.microsoft.com/en-us/library/ms166043.aspx
This should help you get some info.
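For concreteness, a minimal Service Broker setup might look something like the sketch below. All of the object names (`OrderMessage`, `OrderQueue`, `dbo.usp_ProcessOrderMessage`, etc.) are illustrative assumptions, not part of anything described in this thread.

```sql
-- Hedged sketch: minimal Service Broker plumbing for queued order processing.
CREATE MESSAGE TYPE OrderMessage VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT OrderContract (OrderMessage SENT BY INITIATOR);

CREATE QUEUE OrderQueue WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.usp_ProcessOrderMessage,  -- hypothetical activated proc
    MAX_QUEUE_READERS = 5,                         -- up to 5 concurrent readers
    EXECUTE AS SELF);

CREATE SERVICE OrderService ON QUEUE OrderQueue (OrderContract);

-- Sending an order message from the application side:
DECLARE @h UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @h
    FROM SERVICE OrderService
    TO SERVICE 'OrderService'
    ON CONTRACT OrderContract
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @h MESSAGE TYPE OrderMessage (N'<order id="1" />');
```

The `MAX_QUEUE_READERS` setting is what gives the concurrency: Service Broker will activate up to that many instances of the procedure in parallel as messages arrive.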
November 21, 2007 at 2:54 am
Yes, the initial approach was to use Service Broker, but once we dug a little deeper we found there were still a few issues in Service Broker that needed to be resolved. I think SB will be better in Katmai. So instead we are first trying to redesign the app so that it has a concurrent architecture; that way, if SB is better in Katmai, it won't be as hard to redesign our app.
The area that looks after importing the orders from XML is OK. The only process I need to redesign is the processing and validation of the orders; this is what I want to get working concurrently. Do you think it's a bit much to write a whole new .NET package?
November 21, 2007 at 1:45 pm
What kind of problems did you run into with Service Broker? I would much prefer to hear about them from you than run into them myself. The one time I used it, it seemed to work fairly well, but I'll admit I wasn't stretching anything.
Kenneth Fisher
I was once offered a wizard's hat but it got in the way of my dunce cap.
For better, quicker answers on T-SQL questions: http://www.sqlservercentral.com/articles/Best+Practices/61537/
For better answers on performance questions: http://www.sqlservercentral.com/articles/SQLServerCentral/66909/
Blog: www.SQLStudies.com
November 22, 2007 at 12:17 am
You can execute the same stored procedure from either a client application, as described above, or from multiple jobs (e.g., job A starts every minute at :01, job B at :10, job C at :20, and so on).
However, in the circumstance you've described, blocking and locking could very well be the limiting factor in how many concurrent executions of the stored procedure you can run at any given time. Depending on what you're doing, the practical limit may well be one at a time.
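One common way to keep those concurrent executions from blocking each other is to have each caller atomically claim its own batch using `READPAST` and `UPDLOCK` hints. The sketch below assumes a `Status` column on the orders table (0 = unprocessed, 1 = in process); the names are illustrative, not from the original posts.

```sql
-- Hedged sketch: each concurrent caller claims a distinct batch of rows.
-- READPAST skips rows already locked by other callers; UPDLOCK holds the claim.
CREATE PROCEDURE dbo.usp_ClaimOrders
    @BatchSize INT = 200
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @Claimed TABLE (OrderID INT);

    UPDATE TOP (@BatchSize) o
        SET o.Status = 1                          -- mark as "in process"
        OUTPUT inserted.OrderID INTO @Claimed
    FROM dbo.Orders o WITH (ROWLOCK, READPAST, UPDLOCK)
    WHERE o.Status = 0;                           -- unprocessed only

    SELECT OrderID FROM @Claimed;                 -- caller processes these rows
END;
```

With this pattern, two jobs (or two service instances) calling `usp_ClaimOrders` at the same moment each get a disjoint set of orders instead of queuing up behind one another's locks.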
November 23, 2007 at 10:23 am
MySpace uses SQL Service Broker to process tens of millions of messages per day so I can assure you that it is a viable solution for any asynchronous need you may throw at it. 🙂
However, my question is: why do you want to do post-processing of these orders at all? You mentioned customer, sales, etc. tables. Can't you modify the order processing logic in the client/middle tier to handle all the requisite activities at order entry time? If your system truly is too busy for that, then an asynchronous system would indeed seem best.
You could also put a very simple trigger on the orders table to place the order PK into a driver table and have your watcher process check there for new records. A simple (probably binary encoded) status field would allow you to know the state of the record at any time.
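The trigger-plus-driver-table idea above might be sketched as follows. Everything here (table names, the status encoding) is an illustrative assumption layered on the suggestion, not a known part of the poster's schema.

```sql
-- Hedged sketch: a driver table fed by a trigger on the orders table.
CREATE TABLE dbo.OrderQueueDriver (
    OrderID  INT      NOT NULL PRIMARY KEY,
    Status   TINYINT  NOT NULL DEFAULT 0,   -- e.g. 0 = new, 1 = in process,
                                            --      2 = done, 4 = error
    QueuedAt DATETIME NOT NULL DEFAULT GETDATE());
GO
CREATE TRIGGER trg_Orders_Queue ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT dbo.OrderQueueDriver (OrderID)
    SELECT i.OrderID
    FROM inserted i;     -- set-based, so multi-row inserts are handled correctly
END;
```

The watcher process then polls only the small driver table rather than scanning the full orders table, and the status field records where each order is in its lifecycle.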
Also, if you go with a SQL Agent job, and since you have a busy system, consider having two jobs: one that runs on a 'CPU idle' condition, and another on a schedule that picks up any orders not processed by the idle job.
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service