September 22, 2011 at 10:10 am
I have a SQL Server 2005 instance with a database containing a table that changes automatically (new rows are added frequently). I would like to detect the changes/inserts and automatically send an email to some recipients.
If I were to add a second layer of complexity:
Instead of having the application SQL Server do the emailing (which can tax the server), I would like another machine (a Windows XP PC set up as a web server) to do that. The machine has an ODBC connection to the SQL Server and can query the table successfully. Is this feasible, or is it too complicated?
I may be combining two problems in one post. Feel free to answer either of the two questions.
Thanks
September 22, 2011 at 10:13 am
I'd like to suggest some reading in Books Online. Specifically, read about Database Mail. You can set up a trigger that sends emails. This is where I would start; then come back and ask more detailed questions if things don't make sense.
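For example, once a Database Mail profile has been configured, a quick test looks something like this (the profile name and recipient below are placeholders, not values from this thread):
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'Default Mail Account',  -- assumed/placeholder profile name
    @recipients   = 'alerts@somedomain.com', -- placeholder address
    @subject      = 'Database Mail test',
    @body         = 'If this arrives, Database Mail is configured correctly.'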
September 22, 2011 at 10:19 am
#1 I wouldn't send the email from the trigger.
#2 I would schedule a job to find the rows to process and run it every x minutes.
#3 If you're willing to create a whole new application to send mail, you might as well just edit the one you have and send the mail after the insert is done.
Sending mail, as far as I know, is not going to kill your SQL Server; unless you're talking about 100K+ emails a day, I don't think you need to worry about a separate mail server.
September 22, 2011 at 10:26 am
The trigger could populate a queue table that is then processed by a scheduled job. This would keep the trigger from sending the emails directly, and it eliminates writing a process to determine what was inserted or updated.
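A rough sketch of that shape, assuming a hypothetical source table dbo.AlarmTable with an AlarmText column (the real table and column names belong to the OP's application and aren't known here), with placeholder mail profile and recipient:
-- Queue table populated by the trigger; the job stamps rows as it sends them
CREATE TABLE dbo.AlarmEmailQueue (
    QueueID     INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    AlarmText   VARCHAR(4000) NOT NULL,
    QueuedAt    DATETIME NOT NULL DEFAULT GETDATE(),
    ProcessedAt DATETIME NULL
)
GO
-- The trigger only records what was inserted; it never calls Database Mail itself
CREATE TRIGGER TR_AlarmTable_Queue ON dbo.AlarmTable FOR INSERT
AS
BEGIN
    SET NOCOUNT ON
    INSERT INTO dbo.AlarmEmailQueue (AlarmText)
    SELECT AlarmText FROM INSERTED
END
GO
-- A SQL Agent job scheduled every x minutes runs something like this,
-- emailing each unprocessed row and then marking it as sent
DECLARE @id INT, @text VARCHAR(4000)
SELECT TOP 1 @id = QueueID, @text = AlarmText
FROM dbo.AlarmEmailQueue WHERE ProcessedAt IS NULL ORDER BY QueueID

WHILE @id IS NOT NULL
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Default Mail Account',  -- placeholder profile
        @recipients   = 'alerts@somedomain.com', -- placeholder recipient
        @subject      = 'New alarm',
        @body         = @text

    UPDATE dbo.AlarmEmailQueue SET ProcessedAt = GETDATE() WHERE QueueID = @id

    SELECT @id = NULL, @text = NULL
    SELECT TOP 1 @id = QueueID, @text = AlarmText
    FROM dbo.AlarmEmailQueue WHERE ProcessedAt IS NULL ORDER BY QueueID
END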
September 22, 2011 at 10:44 am
The "new rows added frequently" part bothers me a bit. Do you really want to send an email for each row? How do you want to do that? From my own perspective, I know I don't want to get a bunch of emails all the time about data additions.
CEWII
September 22, 2011 at 11:38 am
Ok. Will do
September 22, 2011 at 11:41 am
Elliott Whitlow (9/22/2011)
The "new rows added frequently" part bothers me a bit. Do you really want to send an email for each row? How do you want to do that? From my own perspective, I know I don't want to get a bunch of emails all the time about data additions. CEWII
Yes. It's required that every new row gets emailed out. The new rows are alarm conditions in the building that must be communicated to people.
September 22, 2011 at 11:56 am
here's a tested, working model you could use;
it takes multiple inserts into consideration and sends the email from within the trigger.
So if, say, 5 rows were inserted in a single INSERT command, a column I picked out of the results gets concatenated together and becomes part of the email body.
CREATE TABLE WHATEVER(
    WHATEVERID INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    DESCRIP    VARCHAR(30)
)
GO
CREATE TRIGGER TR_WHATEVER_NOTIFICATIONS
ON WHATEVER FOR INSERT
AS
BEGIN
    SET NOCOUNT ON
    --gather the information, making sure you get it from the INSERTED virtual table, and not the full table
    DECLARE @CAPTUREDSTRING VARCHAR(MAX)
    --In this example I want a comma-delimited list of important facts about what was inserted.
    --using the XML technique to make my comma-delimited string.
    SELECT @CAPTUREDSTRING = [Skills]
    FROM (
        SELECT TOP 1
            GETDATE() AS TheDate,
            STUFF(( SELECT ',' + DESCRIP
                    FROM INSERTED s2
                    --WHERE s2.WHATEVERID = s1.WHATEVERID --- must match GROUP BY below
                    WHERE 1 = 1
                    ORDER BY DESCRIP
                    FOR XML PATH('')
                  ),1,1,'') AS [Skills]
        FROM INSERTED s1
        GROUP BY s1.WHATEVERID --- without GROUP BY multiple rows are returned
        ORDER BY s1.WHATEVERID) myAlias
    --now email the results.
    DECLARE @body1 VARCHAR(4000)
    SET @body1 = 'New Item Notification on the Whatever Table '
        + CONVERT( VARCHAR( 20 ), GETDATE(), 113 )
        + '
<P> The following new items were inserted into the table:<P>'
        + @CAPTUREDSTRING
        + '
'
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Default Mail Account',
        @recipients   = 'lowell@somedomain.com',
        @subject      = 'New Item Notification',
        @body         = @body1,
        @body_format  = 'HTML'
END --TRIGGER
GO
INSERT INTO WHATEVER(DESCRIP)
SELECT 'APPLES' UNION
SELECT 'ORANGES' UNION
SELECT 'BANANAS' UNION
SELECT 'GRAPES' UNION
SELECT 'CHERRIES' UNION
SELECT 'KIWI'
Lowell
September 22, 2011 at 11:59 am
Also, in Books Online, read about Service Broker. This would eliminate the need for a scheduled job and still have the processing of emails completed asynchronously from the inserts/updates on the table.
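For anyone who wants to see the shape of that, here is a minimal sketch (not tested against the OP's database): it reuses the WHATEVER table from the earlier example, uses placeholder queue/service names and mail settings, and assumes Service Broker is enabled on the database (ALTER DATABASE ... SET ENABLE_BROKER). Housekeeping such as poison-message handling and ending the initiator side of each conversation is left out.
-- Plumbing: message type, contract, queues, and services
CREATE MESSAGE TYPE [//Alarm/NewRow] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//Alarm/Contract] ([//Alarm/NewRow] SENT BY INITIATOR);
CREATE QUEUE dbo.AlarmInitiatorQueue;
CREATE QUEUE dbo.AlarmTargetQueue;
CREATE SERVICE [//Alarm/Initiator] ON QUEUE dbo.AlarmInitiatorQueue;
CREATE SERVICE [//Alarm/Target]    ON QUEUE dbo.AlarmTargetQueue ([//Alarm/Contract]);
GO
-- Activated procedure: drains the target queue and sends the emails asynchronously
CREATE PROCEDURE dbo.ProcessAlarmQueue
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @handle UNIQUEIDENTIFIER, @msgtype SYSNAME, @payload XML, @body VARCHAR(4000);
    WHILE 1 = 1
    BEGIN
        SET @handle = NULL;
        WAITFOR (
            RECEIVE TOP (1)
                @handle  = conversation_handle,
                @msgtype = message_type_name,
                @payload = CAST(message_body AS XML)
            FROM dbo.AlarmTargetQueue
        ), TIMEOUT 1000;
        IF @handle IS NULL BREAK;

        IF @msgtype = '//Alarm/NewRow'
        BEGIN
            SET @body = CAST(@payload AS VARCHAR(4000));
            EXEC msdb.dbo.sp_send_dbmail
                @profile_name = 'Default Mail Account',  -- placeholder profile
                @recipients   = 'alerts@somedomain.com', -- placeholder recipient
                @subject      = 'New alarm',
                @body         = @body;
        END
        END CONVERSATION @handle;
    END
END
GO
ALTER QUEUE dbo.AlarmTargetQueue
    WITH ACTIVATION (STATUS = ON,
                     PROCEDURE_NAME = dbo.ProcessAlarmQueue,
                     MAX_QUEUE_READERS = 1,
                     EXECUTE AS OWNER);
GO
-- The insert trigger only queues a message; the emailing happens asynchronously
CREATE TRIGGER TR_WHATEVER_SB ON WHATEVER FOR INSERT
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @h UNIQUEIDENTIFIER, @msg XML;
    SET @msg = (SELECT DESCRIP FROM INSERTED FOR XML PATH('Alarm'), ROOT('Alarms'), TYPE);

    BEGIN DIALOG CONVERSATION @h
        FROM SERVICE [//Alarm/Initiator]
        TO SERVICE   '//Alarm/Target'
        ON CONTRACT  [//Alarm/Contract]
        WITH ENCRYPTION = OFF;

    SEND ON CONVERSATION @h MESSAGE TYPE [//Alarm/NewRow] (@msg);
END
GO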
September 22, 2011 at 12:09 pm
Lynn Pettis (9/22/2011)
Also, in Books Online, read about Service Broker. This would eliminate the need for a scheduled job and still have the processing of emails completed asynchronously from the inserts/updates on the table.
Lynn,
I'm with you; I would rather use SB than schedule a job that runs often. Since I'm a little paranoid, I usually schedule a job anyway, but only every hour or two, just in case.
CEWII
September 22, 2011 at 2:19 pm
Thanks for the code. I am looking at it. One of the requirements is to have one email for each event.
There probably needs to be some sort of buffering mechanism to accommodate multiple rows being inserted very quickly.
Lowell (9/22/2011)
here's a tested, working model you could use; it takes multiple inserts into consideration and sends the email from within the trigger.
So if, say, 5 rows were inserted in a single INSERT command, a column I picked out of the results gets concatenated together and becomes part of the email body.
September 22, 2011 at 2:23 pm
techzone12 (9/22/2011)
Thanks for the code. I am looking at it. One of the requirements is to have one email for each event. There probably needs to be some sort of buffering mechanism to accommodate multiple rows being inserted very quickly.
It might be part of your process already; when an event occurs, does one row get inserted with a single INSERT INTO... command?
All my example does is handle the scenario where three events, for example, are inserted in a single INSERT...SELECT FROM statement.
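To make that distinction concrete with the WHATEVER table above (the alarm descriptions here are made up):
-- One event per statement: the trigger fires once for this single row
INSERT INTO WHATEVER (DESCRIP) VALUES ('SMOKE DETECTOR ZONE 4')

-- Several events in one statement: the trigger fires once for the whole batch,
-- so the INSERTED virtual table holds all three rows and they share one email
INSERT INTO WHATEVER (DESCRIP)
SELECT 'DOOR FORCED OPEN' UNION ALL
SELECT 'WATER LEAK BASEMENT' UNION ALL
SELECT 'HIGH TEMPERATURE SERVER ROOM'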
Lowell
September 22, 2011 at 2:29 pm
I don't know how the rows are getting inserted; they are part of an application (not mine) that I am trying to build upon. Thanks
Lowell (9/22/2011)
techzone12 (9/22/2011)
Thanks for the code. I am looking at it. One of the requirements is to have one email for each event. There probably needs to be some sort of buffering mechanism to accommodate multiple rows being inserted very quickly.
It might be part of your process already; when an event occurs, does one row get inserted with a single INSERT INTO... command?
All my example does is handle the scenario where three events, for example, are inserted in a single INSERT...SELECT FROM statement.
September 23, 2011 at 11:34 am
For the record, Database Mail is built on Service Broker technology. In the past we needed to create the "insert into queue table + job to look for records" architecture...not any more...and creating a Service Broker solution to handle emailing would be redundant.
To the OP, if you need to send an email for every row added to that table, you're safe using Database Mail in a trigger; sending an email using Database Mail is analogous to inserting a row into a user-defined queue table. Sending email from a trigger is an atypical configuration, but there is nothing technically wrong with it.
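As one possible sketch (untested here) of the strict one-email-per-row behaviour, a per-row variant of the earlier concatenating trigger could loop over the INSERTED virtual table and call sp_send_dbmail once per new row; the profile name and recipient are placeholders:
CREATE TRIGGER TR_WHATEVER_PER_ROW
ON WHATEVER FOR INSERT
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @descrip VARCHAR(30), @body VARCHAR(4000)

    -- Walk the INSERTED virtual table so a multi-row insert still produces one email per row
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR
        SELECT DESCRIP FROM INSERTED ORDER BY WHATEVERID
    OPEN c
    FETCH NEXT FROM c INTO @descrip
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @body = 'Alarm raised at '
                  + CONVERT(VARCHAR(20), GETDATE(), 113)
                  + ': ' + @descrip
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'Default Mail Account',  -- placeholder profile
            @recipients   = 'alerts@somedomain.com', -- placeholder recipient
            @subject      = 'New Item Notification',
            @body         = @body
        FETCH NEXT FROM c INTO @descrip
    END
    CLOSE c
    DEALLOCATE c
END
GO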
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato