July 26, 2011 at 1:54 pm
Hello all; I apologize if this is a repost but I could not find exactly what I was looking for in search --
I have a few somewhat unusual requirements regarding audit and "data push" operations that I have elected to meet with a combination of triggers and CLR procedures. Without boring you with details: basically the trigger will take some data and put it into some combination of a log table, an HTTP request, and/or a message/queue table. Any of those steps could fail (data join issue, improper message, etc. -- all planned-for logic).
In this unusual case, what I want is a literal "fire and forget" approach. I want trigger X to fire on INSERT, UPDATE and do something like EXEC usp_LogCallMessage, where the USP is a CLR proc that is complicated, perhaps several seconds long, and runs in its own "thread." In other words, I don't want a trigger on a heavily used table to be bogged down by this USP call.
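To make it concrete, the shape I'm trying to avoid is below (table name made up):

-- as written, the trigger (and the user's transaction) waits on the CLR proc
CREATE TRIGGER trgX ON dbo.BusyTable
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- usp_LogCallMessage is the complicated CLR proc; it can run for
    -- several seconds, and the trigger blocks until it returns
    EXEC dbo.usp_LogCallMessage;
END;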
Any thoughts on this? My initial thought was that my trigger would have to have something in it like an sp_execute call to my proc so it ran in its own area, outside the SPID of the executing trigger call. But it has been a while since I've done serious work with triggers, so I'm a bit rusty on how all the pieces relate in terms of serial vs. threaded behavior.
Hopefully this makes sense, but if I am not clear please let me know.
July 26, 2011 at 2:15 pm
You may want to look into using Service Broker for the "fire and forget" part of the trigger. It gives you a queuing mechanism: once the message is accepted into the queue, the rest of the processing can run outside the main thread.
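A minimal sketch of the plumbing (all object names are hypothetical, and this assumes the broker is enabled on the database):

-- one-time setup: message type, contract, queue, service
CREATE MESSAGE TYPE LogCallMessage VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT LogCallContract (LogCallMessage SENT BY INITIATOR);
CREATE QUEUE LogCallQueue;
CREATE SERVICE LogCallService ON QUEUE LogCallQueue (LogCallContract);

-- inside the trigger: package the rows and hand them off; SEND returns
-- right away, so the trigger isn't held up by the slow processing
DECLARE @h UNIQUEIDENTIFIER, @body XML;
SET @body = (SELECT * FROM inserted FOR XML PATH('row'), ROOT('rows'), TYPE);

BEGIN DIALOG CONVERSATION @h
    FROM SERVICE LogCallService
    TO SERVICE 'LogCallService'
    ON CONTRACT LogCallContract
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @h MESSAGE TYPE LogCallMessage (@body);

(The message isn't actually delivered until the trigger's transaction commits, which is what you want here anyway.)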
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
July 26, 2011 at 2:18 pm
Yeah - Service Broker is what I'd go with. There are a fair number of tutorials out there on the internet; just google "Service Broker tutorial" and you should be able to figure most of it out. It lets you do asynchronous operations on your server.
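The asynchronous part comes from binding an activation proc to the queue, roughly like this (carrying on the hypothetical names from the post above; the call into your CLR proc is just a guess at its signature):

CREATE PROCEDURE dbo.usp_ProcessLogCallQueue
AS
BEGIN
    DECLARE @h UNIQUEIDENTIFIER, @body XML, @type SYSNAME;

    WHILE 1 = 1
    BEGIN
        -- pull the next message; give up after 5 seconds of quiet
        WAITFOR (
            RECEIVE TOP (1)
                @h    = conversation_handle,
                @body = CAST(message_body AS XML),
                @type = message_type_name
            FROM LogCallQueue
        ), TIMEOUT 5000;

        IF @@ROWCOUNT = 0 BREAK;

        IF @type = 'LogCallMessage'
            EXEC dbo.usp_LogCallMessage @body;  -- the slow CLR proc, now off the trigger's back

        END CONVERSATION @h;
    END
END;
GO

-- SQL Server starts the proc itself as messages arrive
ALTER QUEUE LogCallQueue
    WITH ACTIVATION (
        STATUS = ON,
        PROCEDURE_NAME = dbo.usp_ProcessLogCallQueue,
        MAX_QUEUE_READERS = 4,
        EXECUTE AS OWNER
    );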
July 26, 2011 at 2:24 pm
And if I'm forced to roll this out onto a SQL Server 2005 environment (which I'm fighting at the moment)? I don't see it installed on these instances; is it worth going thru the 2005 install process for SB to accomplish this type of processing?
July 26, 2011 at 2:45 pm
If you want to have a truly asynchronous process, then yeah, it is. There isn't really any other way of being able to accomplish what you're trying to do.
July 26, 2011 at 2:51 pm
Service Broker is built into SQL 2005 and above. There's no separate installation process.
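If you want to check whether it's enabled for a database (database name below is a placeholder):

SELECT name, is_broker_enabled FROM sys.databases;

-- and if it's off:
ALTER DATABASE YourDatabase SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;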
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 26, 2011 at 3:00 pm
Ah yes, there it is; I was mistakenly looking through the server tree and not the database tree. That makes sense, because I was wondering where the results of my CREATE MESSAGE TYPE SoapTestMessage went since it didn't error.
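For anyone else hunting for it, the type also shows up in the database's catalog views:

SELECT name, validation_desc
FROM sys.service_message_types
WHERE name = 'SoapTestMessage';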
SB/messages/contracts are some of those things I learned for exam purposes for 2008, then they fell out of my brain when I had no call to use them. Perhaps that will correct itself if I go this route.
Overall I haven't heard any "that concept sounds insane" back yet 😛 so I take it the basic approach isn't as loopy as it sounds?
July 27, 2011 at 8:35 am
if you're suggesting having the trigger execute a stored proc, well, the trigger will then just wait until the stored proc is finished before continuing, so that won't help you at all.
what you might be able to do, though, is have your trigger start a job, which processes the information
the problem with that idea, AFAIK, is that you can't have the same job running concurrently with itself. meaning if your trigger starts the job to do the work, then yes, that would be asynchronous, but if another trigger fired and tried to start that same job before it had finished, the call would fail.
now i guess what you could technically do is have your trigger insert a record into a table containing the information the job needs to know, and then start the job. when the job executes, it looks at the current information in the table, executes the tasks necessary for that information, then removes it from the table. (there's a sketch after the timeline below.)
so, suppose we had the following course of events:
trigger inserts record 1 into table, fires job
job begins, looks at table, sees record 1 and begins processing it
trigger inserts record 2 into table, fires job. job firing fails because it is currently executing
trigger inserts record 3 into table, fires job. job firing fails because it is currently executing
job finishes, removes record 1 from table
trigger inserts record 4 into table, fires job
job begins, looks at table, sees records 2-4, begins processing them
trigger inserts record 5 into table, fires job. job firing fails because it is currently executing
etc...
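something like the trigger side of that, roughly (all names made up):

CREATE TRIGGER trg_QueueWork ON dbo.BusyTable
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- queue the rows for the job to pick up later
    INSERT INTO dbo.WorkQueue (SourceId, QueuedAt)
    SELECT i.Id, GETDATE()
    FROM inserted AS i;

    -- sp_start_job returns immediately; if the job is already running it
    -- raises an error, which we swallow since the running instance will
    -- pick the new rows up anyway
    BEGIN TRY
        EXEC msdb.dbo.sp_start_job @job_name = 'ProcessWorkQueue';
    END TRY
    BEGIN CATCH
    END CATCH;
END;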
July 27, 2011 at 8:55 am
K --
Had considered the job route also, as well as potentially using a job as a staging platform. In other words, the trigger just adds to a queue by writing the record to another table, then a job that runs every few minutes picks up anything in that queue table, batches it up into the HTTP request, and sends it out.
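Roughly what I'm picturing for the job step (reusing the made-up WorkQueue table from the sketch above; the call into my CLR proc is just one assumption about how I'd pass the batch):

DECLARE @batch TABLE (QueueId INT, SourceId INT);

-- grab and remove a batch in one atomic step
DELETE TOP (100)
FROM dbo.WorkQueue
OUTPUT deleted.QueueId, deleted.SourceId INTO @batch;

-- hand the whole batch to the CLR proc as one HTTP call
DECLARE @payload XML;
SET @payload = (SELECT SourceId FROM @batch FOR XML PATH('item'), ROOT('batch'));
EXEC dbo.usp_LogCallMessage @payload;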
The "waiting" was why I made the comment about sp_execute -- SQL Server compiles the Transact-SQL statement or statements in the string into an execution plan that is separate from the execution plan of the batch that contained the sp_executesql or the EXECUTE statement (ref: msdn), and while it's really meant for t-sql I know it can work with procedures but it's a pain in the backside.
Still giving real thought to Service Broker; one limitation/concern is that the rest of the dev team has had no exposure to it at all.
July 27, 2011 at 9:08 am
you could do it even simpler probably. just have your trigger execute an HTTP request, and don't bother waiting for the response, since you don't really care about it. just have it fire off the request and then go on its way.
on the web server side, it would receive the request, and then fire off the stored procedure or whatever other process you've written, to handle the information.
only downside is that if you're dealing with a lot of info, you might overload the web server with too many requests.
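if you want to stay pure t-sql, the old-school way is OLE automation, something like this (very rough sketch; needs the 'Ole Automation Procedures' option enabled, the URL is made up, and tearing the object down right away may abandon the in-flight request, so test it before trusting it):

DECLARE @obj INT;
EXEC sp_OACreate 'MSXML2.ServerXMLHTTP', @obj OUT;
-- the third argument to open() ('true') makes the request asynchronous,
-- so send() returns without waiting for the response
EXEC sp_OAMethod @obj, 'open', NULL, 'POST', 'http://webserver/notify', 'true';
EXEC sp_OAMethod @obj, 'send';
EXEC sp_OADestroy @obj;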
July 27, 2011 at 9:16 am
That's true -- one of the use cases driving this is a SOAP API call through a web service to push items up to the client's Salesforce.com cloud. In that predefined API, they have web service SOAP "push" calls that accept an array of the object type. So you could call the service 20 times for 20 "Task"-type objects (a Salesforce object type), OR you can call the binding once and pass it an array of 20 Task objects (all arguments are arrays in their API, even if the array holds only one item).
The benefits are:
1. they process it WAY faster via the second route (80+% faster)
2. one call to the cloud = one "billable API" transaction
So in that one case, that's where the job/queue/etc thought came into the mix..