January 30, 2009 at 7:11 am
Hi,
I would like to monitor a table that is basically an error log. I would like to be notified once a row is added, BUT even if I receive 10000 new entries at once, I just want one single email.
So I could use triggers, but I need something like a suppression window of, say, one hour before a new email is sent (if the table still gets new inserts during that time)...
Does anyone have an idea? The only idea I have is a job that checks for changes on a regular basis, but the tradeoff would be that I don't get a new alert immediately; in the worst case only after one hour...
thx and regards
Andreas
January 30, 2009 at 7:34 am
Ok, I got it:
- two parts:
1: a trigger that fires on update/insert. At the end it disables itself
2: a scheduled job, running every 1 hour and enabling the trigger again
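A minimal sketch of this two-part approach, assuming a monitored table `dbo.ErrorLog`, a Database Mail profile `AlertProfile`, and a recipient address (all hypothetical names, not from the thread). Note that disabling a trigger requires ALTER permission on the table:

```sql
-- Part 1: trigger that sends one mail and then disables itself
CREATE TRIGGER trg_ErrorLog_Notify ON dbo.ErrorLog
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- sp_send_dbmail queues the message asynchronously
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'AlertProfile',
        @recipients   = 'dba@example.com',
        @subject      = 'New rows in ErrorLog',
        @body         = 'At least one new error was logged.';

    -- Suppress further notifications until the job re-arms the trigger
    DISABLE TRIGGER trg_ErrorLog_Notify ON dbo.ErrorLog;
END;
GO

-- Part 2: hourly SQL Agent job step that re-arms the trigger
ENABLE TRIGGER trg_ErrorLog_Notify ON dbo.ErrorLog;
```

One caveat with this design: any inserts that arrive while the trigger is disabled produce no record that a notification is still pending, which is part of what the tracking-table suggestions below address.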
regards
Andreas
January 30, 2009 at 7:57 am
I don't think you need a trigger for this... you need a tracking table to determine whether an email should be, or has already been, sent.
If the table you have to monitor has a datetime field recording when the row was entered, or an identity field, that is what you need.
A new tracking table holds the highest identity value or highest datetime seen in your monitored table.
The scheduled job, once an hour or so, compares the highest value in the monitored table to your tracking table. If they are different, it updates the tracking table and sends an email; if they are the same, nothing should happen.
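A sketch of that hourly job step, assuming an identity column `ErrorLogID` on the monitored table and a one-row tracking table `dbo.NotifyTracking(LastNotifiedID int)` (hypothetical names):

```sql
-- Hourly job step: compare highest ID seen against the tracking table
DECLARE @maxId int = (SELECT MAX(ErrorLogID) FROM dbo.ErrorLog);

IF EXISTS (SELECT 1 FROM dbo.NotifyTracking WHERE LastNotifiedID < @maxId)
BEGIN
    UPDATE dbo.NotifyTracking
    SET LastNotifiedID = @maxId;

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'AlertProfile',
        @recipients   = 'dba@example.com',
        @subject      = 'ErrorLog has new entries';
END;
-- If the values match, nothing happens and no mail is sent.
```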
hope that helps
Lowell
January 30, 2009 at 8:02 am
Hi,
thanks for your idea, but that doesn't work for me.
I don't want to poll for new entries regularly; I want to get them immediately as they arise. I just want to avoid getting flooded if I get 10000 entries at once. So I want a trigger, one email, and after that no more notifications for one hour.
The solution I described above works perfectly for me 🙂
regards
Andreas
January 30, 2009 at 8:20 am
Personally I think you should have a person counting the records in the table. An hour later the person can start counting again. Of course there's always the possibility they will forget the count and they'll have to start again.
Max
January 30, 2009 at 8:32 am
I guess you missed my point; you can have a trigger that does whatever the trigger needs to do. No problem.
But that is separate from an email notification that rolls up those results and sends them once an hour if necessary.
A trigger should never send email. Never. A trigger should only deal with the data it needs to... if I need an email to be sent as the result of an insert or update event that I might catch in a trigger, I log that information from the trigger to a separate table, and a job scans that table separately on its own schedule to send the emails.
Here's an example why. The actual email event can take seconds... sometimes lots of seconds to send. While that is sending, the table is locked and no other inserts can get in. That's bad.
If an email fails within the trigger (i.e. network down, email server offline, DNS can't resolve), your trigger could roll back and you lose the data that was being inserted. This is really bad and hard to track down.
Sending info to a tracking table has minimal impact; it's just another kind of audit you might be doing anyway. Based on info in that tracking table, you can use a job or Service Broker to send the emails, track whether each email was successful or not, and resend if needed.
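A sketch of the trigger-logs-only pattern described above, assuming a queue table `dbo.EmailQueue(QueuedAt datetime, RowCnt int, Sent bit)` (hypothetical names). The trigger does nothing but a fast local insert; all mail handling lives in a separate job:

```sql
-- The trigger only records that something happened; it never sends mail
CREATE TRIGGER trg_ErrorLog_Queue ON dbo.ErrorLog
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- One queue row per firing, with the number of rows in this batch
    INSERT INTO dbo.EmailQueue (QueuedAt, RowCnt, Sent)
    SELECT GETDATE(), COUNT(*), 0 FROM inserted;
END;
```

If the mail server is unreachable, the queue rows simply wait; the original insert into `dbo.ErrorLog` is never at risk of rolling back.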
ab.sqlservercentral (1/30/2009)
Hi, thanks for your idea, but that doesn't work for me.
I don't want to poll for new entries regularly; I want to get them immediately as they arise. I just want to avoid getting flooded if I get 10000 entries at once. So I want a trigger, one email, and after that no more notifications for one hour.
The solution I described above works perfectly for me 🙂
regards
Andreas
Lowell
January 30, 2009 at 8:42 am
Going back to your specific issue... the Service Broker job can be scheduled to run every minute.
Say you have a trigger that puts an event in the new tracking table instantly... that's fine.
The job scans the tracking table every minute, and if a new item has been found, it MIGHT send an email... but based on the LastEmailSent time in the tracking table: if an email has been sent within the last hour, it doesn't send; it sends only when a new event is found and the LastEmailSent datetime is more than one hour before GETDATE().
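The throttle check in that every-minute job step could look like this sketch, reusing the hypothetical `dbo.EmailQueue` table plus a send-history table `dbo.EmailLog(SentAt datetime)` (again, assumed names):

```sql
-- Every-minute job step: send at most one mail per hour
IF EXISTS (SELECT 1 FROM dbo.EmailQueue WHERE Sent = 0)
   AND NOT EXISTS (SELECT 1 FROM dbo.EmailLog
                   WHERE SentAt > DATEADD(HOUR, -1, GETDATE()))
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'AlertProfile',
        @recipients   = 'dba@example.com',
        @subject      = 'New ErrorLog activity (throttled, max 1/hour)';

    -- Record the send and clear the pending events it covers
    INSERT INTO dbo.EmailLog (SentAt) VALUES (GETDATE());
    UPDATE dbo.EmailQueue SET Sent = 1 WHERE Sent = 0;
END;
```

Unlike the self-disabling trigger, pending events are never lost: a batch arriving during the quiet hour still sits in the queue and produces a mail as soon as the hour is up.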
does that help?
Lowell
January 30, 2009 at 9:11 am
DBMail is an asynchronous process. Having it in a trigger is unlikely to interfere with transactional completion. Failure to contact the e-mail server definitely does NOT interfere with transactional completion, for example. Something other than DBMail might have that problem, but not DBMail. (That's the process SQL 2005 uses to send e-mails with sp_send_dbmail.)
It could, thus, work from an insert trigger. That trigger could also log that it was fired (in a separate table). Before it sends another e-mail, it could check the last log date and time, and quit if it was within the last hour. I'd recommend keeping the log table it checks down to one row, with just the last run time in it, for performance reasons.
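A sketch of that single-row-log variant, assuming a one-row table `dbo.LastAlert(LastSent datetime)` seeded with one row (hypothetical names):

```sql
-- Trigger that sends at most one mail per hour, tracked in a one-row table
CREATE TRIGGER trg_ErrorLog_Mail ON dbo.ErrorLog
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Only send if the last mail went out more than an hour ago
    IF NOT EXISTS (SELECT 1 FROM dbo.LastAlert
                   WHERE LastSent > DATEADD(HOUR, -1, GETDATE()))
    BEGIN
        UPDATE dbo.LastAlert SET LastSent = GETDATE();

        -- Asynchronous: queues the mail and returns immediately
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'AlertProfile',
            @recipients   = 'dba@example.com',
            @subject      = 'ErrorLog alert';
    END;
END;
```

Keeping the check table to a single row means the EXISTS probe stays cheap even under the heavy-insert load the trigger is meant to survive.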
You'd definitely want to load test such a trigger, by playing back a trace of a heavy traffic load most likely.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
January 30, 2009 at 12:09 pm
Before my rather pointless post above, I was also thinking: insert a record into a log table with the last inserted date from the trigger, run the job hourly, check if there's a new record, and if yes, send a mail and truncate the table. Tada. Oh, and do an uncommitted read from the trigger against the log table to check if there's already an entry.
See, there is a professional side to me too. 😛
Max
February 2, 2009 at 12:28 am
Thanks for all your input.
I do indeed use the Database Mail feature via sp_send_dbmail in SQL 2005, so it's really handled asynchronously.
Lock contention doesn't impact me; the table is only used to track some errors. It is only filled if users already HAVE an issue, so it basically doesn't matter if locks there impact performance. Even if I lost some information in that table it wouldn't hurt me.
thx and regards
Andreas