August 22, 2014 at 5:49 am
Hello
I am after some advice from anyone. Basically, I have inherited a system that I had no hand in designing, and I am not happy with the current setup, so I am looking for ways to improve it.
I have a database with a table called RAW. This table receives messages via XML, SMS, and various other formats. Data is written into this table at a rate of approximately 50-100 rows per minute.
I have a stored procedure (SP) which takes the data written into the RAW table and performs various actions: it looks up account information, writes to a log table, writes to incident tables, gets GPS information, and so on.
The records written into the RAW table need to be processed almost instantly, within a minute at most of arriving in the table.
At present there is a SQL Agent job which executes the SP. It consists of a step to execute the SP, which on completion moves to the next step, a WAITFOR DELAY loop, and then back to step 1.
The trouble is it never actually finishes and runs 24/7. There is no break point for error handling, and occasionally the records that arrive in the RAW table do not get processed, so the job has to be restarted for it to pick them up again.
I am looking for some advice on the best way to handle this process. I thought about a trigger, but the performance impact of using a trigger was too heavy to consider. Any recommendations are welcome. Thanks in advance.
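For reference, the job-step loop described above amounts to something like the following sketch, hardened with TRY/CATCH so a single failure cannot silently stop processing (procedure and table names here are illustrative, not the actual objects in the system):

```sql
-- Sketch of the existing poll-and-process loop, with basic error handling.
-- dbo.ProcessRawRows and dbo.ProcessingErrorLog are hypothetical names.
WHILE 1 = 1
BEGIN
    BEGIN TRY
        EXEC dbo.ProcessRawRows;          -- the existing processing SP
    END TRY
    BEGIN CATCH
        -- log the error and carry on, rather than letting the job die
        INSERT INTO dbo.ProcessingErrorLog (ErrorTime, ErrorMessage)
        VALUES (SYSDATETIME(), ERROR_MESSAGE());
    END CATCH;

    WAITFOR DELAY '00:00:05';             -- pause between polls
END;
```

Even without a redesign, wrapping the SP call this way keeps the loop alive through transient errors, which is the immediate gap in the current two-step job.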
August 22, 2014 at 1:16 pm
Wow, sounds like poor design to start with. When I first read the question, a redesign around Service Broker popped into my head, because this is essentially a queuing operation and SB with an activation procedure seems like a good way to handle it, but you probably don't have the opportunity to re-architect it this way.
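To make the Service Broker idea concrete, here is a minimal sketch of a queue with an activation procedure that drains messages and hands them to the existing processing logic. All object names (RawMessageQueue, RawMessageService, dbo.ProcessRawMessages) are hypothetical:

```sql
-- Sketch only: a Service Broker queue whose activation procedure fires
-- automatically whenever messages arrive, instead of polling on a timer.
CREATE QUEUE dbo.RawMessageQueue;

CREATE SERVICE RawMessageService
    ON QUEUE dbo.RawMessageQueue ([DEFAULT]);
GO

CREATE PROCEDURE dbo.ProcessRawMessages
AS
BEGIN
    DECLARE @handle UNIQUEIDENTIFIER, @body VARBINARY(MAX);

    WHILE 1 = 1
    BEGIN
        WAITFOR (
            RECEIVE TOP (1)
                @handle = conversation_handle,
                @body   = message_body
            FROM dbo.RawMessageQueue
        ), TIMEOUT 5000;

        IF @@ROWCOUNT = 0 BREAK;  -- queue drained; activation fires again on new messages

        BEGIN TRY
            -- EXEC dbo.YourExistingProcessingProc @body;  -- plug in the real SP
            END CONVERSATION @handle;
        END TRY
        BEGIN CATCH
            -- log and end the conversation so one bad message can't poison the queue
            END CONVERSATION @handle;
        END CATCH;
    END;
END;
GO

ALTER QUEUE dbo.RawMessageQueue
    WITH ACTIVATION (
        STATUS = ON,
        PROCEDURE_NAME = dbo.ProcessRawMessages,
        MAX_QUEUE_READERS = 1,
        EXECUTE AS OWNER
    );
```

The appeal is that processing is event-driven rather than timer-driven, so the sub-minute latency requirement is met without a job that never finishes.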
I can see why you wouldn't want to use a trigger to call the SP because it would fire for each insert and it sounds like you are trying to process the rows in a set-based fashion.
The next thing I would think of is a Windows service that polls the table and, when it finds unprocessed rows, runs the SP. I think this will give you "constant" processing with error handling available through .NET, so that you can log errors and keep the service running, and the next timer tick that sets off the processing will still run.
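The polling query that such a service runs on each timer tick matters: it should claim a batch of unprocessed rows without blocking the inserts arriving at 50-100 rows per minute. A hedged sketch, assuming RAW has an identity key RawID and a Processed flag (both illustrative, since the real schema isn't shown):

```sql
-- Claim up to 500 unprocessed rows atomically; READPAST skips rows
-- locked by concurrent writers instead of blocking behind them.
DECLARE @batch TABLE (RawID INT PRIMARY KEY);

UPDATE TOP (500) r
SET    r.Processed = 1
OUTPUT inserted.RawID INTO @batch (RawID)
FROM   dbo.RAW AS r WITH (READPAST, ROWLOCK)
WHERE  r.Processed = 0;

-- Hand the claimed keys to the existing SP, e.g. by joining on @batch.
```

The UPDATE ... OUTPUT pattern means a row is either claimed by exactly one tick or left for the next one, so a restart of the service never loses or double-processes rows.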
Another option is having the SQL Agent job run an SSIS package that, like .NET, allows you to handle errors within the package and keep the job running, because the package will always return success to SQL Agent.
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
September 1, 2014 at 1:25 am
I have the same sentiments as Jack. SQL Agent jobs are not designed for your one-minute response time.
If you have any future plans for scaling, this solution won't last you long.
Does the RAW table also need to be updated during each transaction?
If it's purely read-only, while you process the new data into another table, that would allow different strategies.
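For example, if RAW is append-only, the processor can track a high-water mark in a small control table instead of touching RAW at all. A sketch under that assumption (table and procedure names are hypothetical):

```sql
-- Read-only strategy: remember the last key processed, then process
-- everything that arrived after it. RAW itself is never updated.
DECLARE @last BIGINT, @max BIGINT;

SELECT @last = LastRawID FROM dbo.ProcessingWatermark;
SELECT @max  = MAX(RawID) FROM dbo.RAW;

IF @max > @last
BEGIN
    EXEC dbo.ProcessRawRange @FromID = @last, @ToID = @max;  -- parameterized version of the SP

    UPDATE dbo.ProcessingWatermark
    SET    LastRawID = @max;
END;
```

This avoids update contention on a table taking 50-100 inserts per minute, at the cost of requiring a monotonically increasing key.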