Challenge with trigger to send email, including attached file. @query needs variables...

  • The code below works as it is today. The problem is that I want to attach to the email the last 25 transactions that have been performed within that crib (location), for the specific item. This is essentially a history lookup that gets attached to the email as a CSV, saving our staff from going and finding the exact same information when the email is sent.

    When I had the commented-out code enabled, my trigger fired on the event as expected, but the @query SELECT statement appeared to freeze the entire system, and I had to find the process in Activity Monitor on our server and kill it. Once killed, things resumed functioning. I commented out the query and the mail section, retested, and it fired fine again. So the problem is somewhere within the @query I built.

    All the reading I have been finding points towards using stored procedures for this type of function, which I have no experience with, on top of the fact that this is a trigger, so user input is not possible; it only fires when a business function occurs within our software. The trigger fires when the table is updated and quantity on hand is equal to 0. (Please note I am actually quite a novice with SQL, in my opinion, and the trigger work I have been doing is a way for me to expand my knowledge.)

    Anyway, thoughts/suggestions/insight would be greatly appreciated. (Note that the database entries and email entries in my code are valid; I just sanitized them for posting here.)

    USE [MyDatabase]

    GO

    /****** Object: Trigger [dbo].[TRG_Custom-StockedOut_NormalBin] Script Date: 1/27/2015 9:00:29 AM ******/

    SET ANSI_NULLS ON

    GO

    SET QUOTED_IDENTIFIER ON

    GO

    ALTER Trigger [dbo].[TRG_Custom-StockedOut_NormalBin] ON [dbo].[TRANS] for Insert, Update NOT FOR REPLICATION As

    DECLARE @TransNo INT

    DECLARE @cribbin VARCHAR(15)

    DECLARE @TransDate DATETIME

    DECLARE @Item VARCHAR(30)

    DECLARE @Description VARCHAR(50)

    DECLARE @Crib INT

    DECLARE @BinQty INT

    DECLARE @Qty INT

    DECLARE @Min INT -- hyphens are not valid in T-SQL variable names

    DECLARE @Max INT

    DECLARE @Critical INT

    DECLARE @OnOrder INT

    DECLARE @Supplier VARCHAR(12)

    DECLARE @BinType INT

    DECLARE @Type VARCHAR(5)

    DECLARE @EmployeeID VARCHAR(15)

    DECLARE @EmployeeName VARCHAR(50)

    DECLARE @SiteID VARCHAR(12)

    DECLARE @SiteName VARCHAR(50)

    DECLARE @Body NVARCHAR(550)

    Declare @Subject VarChar (100)

    DECLARE @Stopped VarChar(10)

    DECLARE @StopReason VARCHAR(50)

    DECLARE @StopDate VARCHAR(10)

    DECLARE @query_result_separator CHAR(1) = char(9);

    DECLARE @query NVARCHAR(500)

    -- Get the needed pieces out of the transaction when a bin hits its critical point

    SELECT @TransNo = TransNumber,

    @Cribbin = Cribbin,

    @Item = Item,

    @Crib = Crib,

    @Qty = Quantity,

    @Transdate = TransDate,

    @BinQty = BinQty,

    @Type = TypeDescription,

    @EmployeeID = employee

    from Inserted

    SELECT @EmployeeName = Name FROM Employee WHERE Employee.ID = @EmployeeID

    SELECT @SiteID = SiteID FROM Crib WHERE Crib = @Crib

    SELECT @SiteName = SiteDescription FROM SiteProfile WHERE SiteID = @SiteID

    SELECT @Description = Description1, @supplier = VendorNumber FROM Inventry WHERE Inventry.ItemNumber = @Item

    SELECT @Min = OverrideOrderPoint, @Max = Maximum, @Critical = CriticalPoint, @OnOrder = OnOrder, @BinType = BinType FROM Station WHERE Station.Cribbin = @Cribbin

    SELECT @Stopped = StopOrdering, @StopReason = UDFAPCloseReason, @StopDate = UDFAPCloseDate FROM STATION WHERE Station.Cribbin = @Cribbin

    IF Update (Cribbin)

    BEGIN

    IF @BinType in (2,3) or @Type NOT IN ('ISSUE','ADJUS','COUNT','RETNW')

    Return

    IF @Critical IS NULL

    Return

    IF @BinQty = 0

    Begin

    Select @Subject = CAST('CRIT ALERT/STOCK OUT @ ' + @SiteName + ' (' + @Description + ')' AS VARCHAR(100))

    Select @Body = CAST(

    'Investigate the following ASAP, as a stock out has been detected.' + Char(13) + Char(13) + Char(13) + Char(13) +

    'Crib-Bin: ' + @CribBin + Char(13) +

    'Item Number: ' + @Item + Char(13) +

    'Description: ' + @Description + Char(13) +

    'Maximum: ' + Convert(varchar(10),@Max) + Char(13) +

    'Min: ' + Convert(varchar(10),@Min) + Char(13) +

    'Qty on Hand: ' + Convert(varchar(10),@BinQty) + Char(13) +

    'On Order: ' + Convert(varchar(10),@OnOrder) + Char(13) +

    'Quantity Changed: ' + Convert(varchar(10),@Qty) + Char(13) +

    'Trans Type: ' + @Type + Char(13) +

    'Employee: ' + @EmployeeName + Char(13) +

    'Stopped: ' +

    CASE

    WHEN @Stopped = 1 THEN 'Yes - ' + @StopReason + ' (Date: ' + @StopDate + ')'

    Else 'No'

    END

    + Char(13) + Char(13) + Char(13) + 'The attached file contains up to 25 of the last transactions performed on this item.'

    AS NVARCHAR(550))

    /** Commented out for bug fixing

    SELECT @query = '

    SELECT TOP(25)

    t.CribBin AS "Bin",

    t.Employee AS "EmpID",

    e.Name AS Name,

    t.Item AS "Item Number",

    i.Description1 AS "Description",

    t.TransDate AS "Date",

    t.TypeDescription AS "Type",

    t.Quantity AS "Qty Changed",

    t.BinQty AS "Bin Qty"

    FROM trans AS t

    INNER JOIN inventry AS i

    ON t.item = i.ItemNumber

    INNER JOIN employee AS e

    ON e.id = t.Employee

    WHERE t.Crib = ' + CAST(@Crib AS VarChar(10)) + ' AND t.Item = ''' + @Item + ''' ORDER BY t.Transdate DESC'

    **/

    EXEC msdb.dbo.sp_send_dbmail

    @Profile_name = 'myprofile',

    @Recipients = 'our@email.com',

    @Body = @Body,

    @Subject = @Subject,

    @importance = 'High';

    /** Commented out for bug fixing

    @execute_query_database = 'MyDatabase',

    @query = @query,

    @query_attachment_filename = 'Last25Trans.csv',

    @query_result_separator = @query_result_separator,

    @attach_query_result_as_file = 1,

    @query_result_no_padding= 1

    **/

    END

    END

  • first a redesign; there is no reason to put this in a trigger. if the trigger fails, your data disappears because the transaction gets rolled back.

    the second issue is permissions: every end user who is able to Insert, Update must also be a user in the msdb database, and also be in the DatabaseMailUserRole.

    the third is design: this is just a notification that goes out via email, so timing is not of the essence. it does not have to be done the absolute millisecond the quantity goes to zero, since you are only notifying people via email, and not kicking off ordering functionality or something.

    instead, create a query that detects whether your quantity has fallen to zero.

    UPDATE(Cribbin) only tells you whether the column was included in the insert or update statement, not that its value changed.
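    for example, a sketch of testing for an actual value change by joining inserted to deleted (using TransNumber as the key is an assumption; substitute your real key, and note that deleted is empty for plain inserts):

    ```sql
    -- detect rows where Cribbin really changed value, not just appeared in the statement
    IF EXISTS (
        SELECT 1
        FROM inserted i
        INNER JOIN deleted d ON d.TransNumber = i.TransNumber
        WHERE ISNULL(i.Cribbin, '') <> ISNULL(d.Cribbin, '')  -- value actually changed
    )
    BEGIN
        -- the column really changed; do the work here
        PRINT 'Cribbin changed';
    END
    ```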

    then you create a scheduled job that runs that query every x minutes.

    that job has to check if it already sent an email for that specific outage (don't want to send the same email every x minutes, right?)

    then, since the job runs under the context of sa, the permissions issue is resolved.

    an example of the detection script:

    select * from [dbo].[TRANS] T1

    WHERE T1.BinQty = 0

    AND NOT EXISTS(SELECT * FROM EMailLog E WHERE E.PRIMARYKEY = T1.PRIMARYKEY AND DateNotified > DATEADD(dd,-2,getdate()) )
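    and after the job sends an email for a row it found, it records that fact, so the NOT EXISTS above filters it out on the next run (EMailLog and its columns here are just a sketch):

    ```sql
    -- hypothetical log table the job writes to after sp_send_dbmail succeeds
    INSERT INTO EMailLog (PRIMARYKEY, DateNotified)
    VALUES (@PK, GETDATE());  -- @PK = key of the stocked-out row just emailed
    ```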

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. with your description of the problem, we can provide a tested, verifiable solution to your question! asking the question the right way gets you a tested answer the fastest way possible!

  • first a redesign; there is no reason to put this in a trigger. if the trigger fails, your data disappears because the transaction gets rolled back.

    Actually, this trigger is going to eventually expand to combine some other functions (including writing data to another table) for parsing in a 3rd party app, which is managed in real time. Once our criteria are in place, I am not worried about data loss, as the parent application that writes this data has a processing agent that posts data. If the data fails to post (a validation action), it writes an error log entry. Data loss is not an issue I am concerned with.

    the second issue is permissions: every end user who is able to Insert, Update must also be a user in the msdb database, and also be in the DatabaseMailUserRole.

    There are no permission issues being experienced. The trigger works flawlessly with the @query portions commented out. I want to add the top(25) results of the query to the email as an attachment. I have used similar attach processes in scheduled agent jobs, but they did not require variables within their @query, therefore they functioned without any problems.

    this is just a notification that goes out via email, so timing is not of the essence. it does not have to be done the absolute millisecond the quantity goes to zero, since you are only notifying people via email, and not kicking off ordering functionality or something.

    Without getting into the details of our business process, this is something that is actually needed on the event. This is a single central database with over 60 systems pointing at it, monitored by a team of people that react to standard events. Running out of inventory is considered a cardinal sin, and therefore must be acted upon urgently. Because we have 60+ systems pointing at it, the @crib variable and the @sitename references are actually what help determine who the specific event is for. These emails are often shared with the customer, to discuss alternatives, address whether they have had an unpredicted increase in business, etc. Notification must be on the singular event AND maintain customer-exclusive data. A query run on a schedule would not provide the required reactive response.

    The key issue is that this trigger is not able to set the @query variable, as I have shown, and therefore locks up when trying to run. I need it to be able to set @query to match the criteria I laid out, so my email not only sends the message body information, but also includes an attached file with the last 25 transactions that have occurred on that item. This allows us to isolate problem employees consuming inventory at abnormal rates (customer staff), or trends in increased business usage that we can then address with the customer's production planners to increase supported stock levels. This is all about vendor-managed customer inventory. High service level requirements.

  • The SELECT query will be blocked because you still have a transaction open when it is executed by Database Mail.

    I second Lowell's idea of a job, or maybe you could use Service Broker when the email conditions are met. We had to use Service Broker in a trigger in our OLTP database because of performance problems.

  • I am not familiar with service broker etc. My experience with SQL is slowly growing as I work through this.

    Could you elaborate, or provide a reference to relevant material that I could read?

  • Trelin (1/27/2015)


    first a redesign; there is no reason to put this in a trigger. if the trigger fails, your data disappears because the transaction gets rolled back.

    Actually, this trigger is going to eventually expand to combine some other functions, (including writing data to another table) for parsing in a 3rd party app - which is managed real time. Once our criteria are in place, I am not worried about data loss, as the parent application that writes this data actually has a processing agent, that posts data. If the data fails to post - (validation action), it writes an error log entry. Data loss is not an issue I am concerned with.

    What have you learned, read, looked at that caused you to come to the conclusion that a trigger is the best way to do this?

    the second issue is permissions: every end user who is able to Insert, Update must also be a user in the msdb database, and also be in the DatabaseMailUserRole.

    There are no permission issues being experienced. The trigger works flawlessly with the @query portions commented out. I desire adding the attached top(25) results of the query to the email, and I have used similar attach processes in scheduled agent jobs, but they did not require variables within their @query, therefor they functioned without any problems.

    The reason this is working is that you are either testing with a high level of permissions, or the login that the application is using has a very high level of permissions. Not a good practice in either case.

    this is just a notification that goes out via email, so timing is not of the essence. it does not have to be done the absolute millisecond the quantity goes to zero, since you are only notifying people via email, and not kicking off ordering functionality or something.

    Without getting into the details of our business process, this is something that is actually needed on the event. This is a single central database with over 60 systems pointing at it, monitored by a team of people that react to standard events. Running out of inventory is considered a cardinal sin, and therefore must be acted upon urgently. Because we have 60+ systems pointing at it, the @crib variable and the @sitename references are actually what help determine who the specific event is for. These emails are often shared with the customer, to discuss alternatives, address whether they have had an unpredicted increase in business, etc. Notification must be on the singular event AND maintain customer-exclusive data. A query run on a schedule would not provide the required reactive response.

    If running out of inventory is the cardinal sin, wouldn't it make more sense to define a minimum quantity, and when that is hit, start the process to trigger re-ordering of the item?

    From what you are saying, zero puts the users out of business until more stock comes in. Don't let that happen.

    The key issue is that this trigger is not able to set the @query variable, as I have shown, and therefore locks up when trying to run. I need it to be able to set @query to match the criteria I laid out, so my email not only sends the message body information, but also includes an attached file with the last 25 transactions that have occurred on that item. This allows us to isolate problem employees consuming inventory at abnormal rates (customer staff), or trends in increased business usage that we can then address with the customer's production planners to increase supported stock levels. This is all about vendor-managed customer inventory. High service level requirements.

    Locks will likely only be the tip of your problems.

    At minimum, use the trigger to insert a record into a table that says "do something" when another process looks at it.
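    As a sketch, the trigger body could shrink to a single insert into a work table (StockOutQueue and its columns are hypothetical names):

    ```sql
    -- slimmed-down trigger: record the event, let a job do the emailing
    INSERT INTO dbo.StockOutQueue (TransNumber, QueuedDate, Processed_YN)
    SELECT TransNumber, GETDATE(), 0
    FROM inserted
    WHERE BinQty = 0;
    ```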

    When you call sp_send_dbmail, any latency from the time the proc is called until the mail is actually processed by the SMTP server will cause your code to wait.

    To the users, this will appear to be an application hang.

    To learn more on Service Broker, go here:

    http://www.google.com

    Type in Service Broker.

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/

  • What have you learned, read, looked at that caused you to come to the conclusion that a trigger is the best way to do this?

    Fair. I came here looking for help, I should not assume I am right.

    The reason for this is working for is because you are either testing this with a high level of permissions, or the login that the application is using has a very high level of permissions. Not a good practice in either case.

    When I am working on the triggers, I am doing so with a high level of access to the database in question. The software itself also runs at a high level - but that is unfortunately by design. The application handles security against the database internally. I am not going to say it is right or wrong, but that is how it works.

    If running out of inventory is the cardinal sin, wouldn't it make more sense to define a minimum quantity, and when that is hit, start the process to trigger re-ordering of the item?

    From what you are saying, zero puts the users out of business until more stock comes in. Don't let that happen.

    All these things exist already, but inventory does not just appear. If an item normally has 1 piece a week consumed, and an employee takes a burst run at it and pulls all 10 on-hand pieces, we want to know. Our other processes automatically order it, and we have been handling those actions for our customers. I just want the specific incident of zero to trigger deeper and immediate investigation.

    At minimum, use the trigger to insert a record into a table that says "do something" when another process looks at it.

    When you call sp_send_dbmail, any latency from the time the proc is called until the mail is actually processed by the SMTP server will cause your code to wait.

    To the users, this will appear to be an application hang.

    Good to know. I was not aware a hang in SMTP would delay sp_send_dbmail; I was under the impression that the process was handed off. Other areas where I use this have not had any performance issues that we noticed, so I did not make the correlation. Fortunately, the way our remote systems work, they store and send when the employee is complete, like a checkout record, which means they have already walked away. They will not see the latency. Our management team will, however, so that is good to know.

    To learn more on Service Broker, go here:

    http://www.google.com

    Type in Service Broker.

    I stared at that screen for a good ten minutes. You forgot to say press search or enter. 😛

    (Yes I already did this before you typed your reply, I just simply asked because sometimes people have resources that they have found very useful, and if they had one to share, I would have appreciated reading it. That is all.)

  • Sending mail from code in the application is, most of the time, problematic. A typical approach is to insert a message into a queue table, and have a separate job process this queue.

    As an example, this was the method of a previous client. The application took a request from a client, and notified the proper technician via email. When they moved the database servers to another location, the app appeared to hang until the mail was processed.

    Two days of blaming the database.

    Read the Stairways series on this site.

    And, seriously, how did you come up with this architecture? The original intent of triggers was to enforce referential integrity. That's it.

    EVERY app that I have encountered that rely upon triggers to enforce business logic are some of the worst performing and troublesome applications I have had to deal with. There is no exception for me. If logic is encapsulated in a trigger, there is trouble.

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/

  • Michael L John (1/27/2015)


    Sending mail from code in the application is, most of the time, problematic. A typical approach is to insert a message into a queue table, and have a separate job process this queue.

    As an example, this was the method of a previous client. The application took a request from a client, and notified the proper technician via email. When they moved the database servers to another location, the app appeared to hang until the mail was processed.

    Two days of blaming the database.

    Read the Stairways series on this site.

    And, seriously, how did you come up with this architecture? The original intent of triggers was to enforce referential integrity. That's it.

    EVERY app that I have encountered that rely upon triggers to enforce business logic are some of the worst performing and troublesome applications I have had to deal with. There is no exception for me. If logic is encapsulated in a trigger, there is trouble.

    The more I read on this, and think about what you said, the more I want to come up with a way to do this (and a few other actions that happen due to triggers) separately.

    Would this be an appropriate path, or are there pitfalls I should worry about?

    1: Determine my events that I would like emails on, and build a table that contains the following:

    - Subject

    - Body

    - IncludeCSVQuery (True/False)

    - Query (select statement that I want to use in the @query field)

    - Sent (True/False)

    - SentDateTime

    2: Create a SQL Agent job that runs every minute and accesses all records on that table where sent = false... (From what I am reading, I would need to use a cursor, or some logic to step through the query results.) What I am concerned with is: can I store a functional select statement in a field that @query can pull?

    3: Once the mail is sent, insert the datetime, and flip the sent flag to true. Step to next record that is unsent...

    4: This table would be populated by triggers, that insert the subject/body/query that I want to utilize into the email table based on whether or not I want a query, or just a message to be sent (not all incidents would include a query).
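    Something like this is what I have in mind for the table from step 1 (types and sizes are just placeholders):

    ```sql
    CREATE TABLE dbo.EmailQueue (
        EmailQueueID    INT IDENTITY(1,1) PRIMARY KEY,
        Subject         VARCHAR(100)  NOT NULL,
        Body            NVARCHAR(MAX) NOT NULL,
        IncludeCSVQuery BIT           NOT NULL DEFAULT 0,  -- True/False
        Query           NVARCHAR(MAX) NULL,   -- select statement for @query
        Sent            BIT           NOT NULL DEFAULT 0,
        SentDateTime    DATETIME      NULL
    );
    ```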


  • Trelin (1/28/2015)


    Michael L John (1/27/2015)


    Sending mail from code in the application is, most of the time, problematic. A typical approach is to insert a message into a queue table, and have a separate job process this queue.

    As an example, this was the method of a previous client. The application took a request from a client, and notified the proper technician via email. When they moved the database servers to another location, the app appeared to hang until the mail was processed.

    Two days of blaming the database.

    Read the stairways series on this sight.

    And, seriously, how did you come up with this architecture? The original intent of triggers was to enforce referential integrity. That's it.

    EVERY app that I have encountered that rely upon triggers to enforce business logic are some of the worst performing and troublesome applications I have had to deal with. There is no exception for me. If logic is encapsulated in a trigger, there is trouble.

    The more I read on this, and think about what you said, the more I want to come up with a way to do this (and a few other actions that happen due to triggers) separately.

    Would this be an appropriate path, or are there pitfalls I should worry about?

    1: Determine my events that I would like emails on, and build a table that contains the following:

    - Subject

    - Body

    - IncludeCSVQuery (True/False)

    - Query (select statement that I want to use in the @query field)

    - Sent (True/False)

    - SentDateTime

    2: Create a SQL Agent job that runs every minute and accesses all records on that table where sent = false... (From what I am reading, I would need to use a cursor, or some logic to step through the query results.) What I am concerned with is: can I store a functional select statement in a field that @query can pull?

    3: Once the mail is sent, insert the datetime, and flip the sent flag to true. Step to next record that is unsent...

    4: This table would be populated by triggers, that insert the subject/body/query that I want to utilize into the email table based on whether or not I want a query, or just a message to be sent (not all incidents would include a query).


    I would slim down what the trigger does. Simply insert the primary key of the offending record into the table, and have your job do the work.

    In the job, I would create a loop (I can't believe I am recommending a loop!) to do the work

    There are tons of ways to do this, but something like this:

    DECLARE @Primary_Key_Field int = NULL

    Begin Try

    SELECT TOP 1 @Primary_Key_Field = Primary_Key_Field FROM Some_Table WHERE Processed_YN = 0 ORDER BY Primary_Key_Field

    WHILE @Primary_Key_Field IS NOT NULL Begin

    BEGIN TRAN

    --Do the work

    --If you get to this point, everything worked, so update the processed flag

    UPDATE Some_Table SET Processed_YN = 1, Processed_Date = GETDATE() WHERE Primary_Key_Field = @Primary_Key_Field

    COMMIT TRAN

    --Reset the variable, or the loop will never end once the last row is processed

    SET @Primary_Key_Field = NULL

    SELECT TOP 1 @Primary_Key_Field = Primary_Key_Field FROM Some_Table WHERE Processed_YN = 0 ORDER BY Primary_Key_Field

    End

    End Try

    Begin Catch

    --Error handling here.

    IF @@TRANCOUNT > 0 Begin

    ROLLBACK TRAN

    End

    End Catch

    Why are you storing the query? There are probably a finite number of queries that are required. Create stored procs and pass in the appropriate values. Take advantage of the database engine.
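    For example, a sketch of such a proc (names assumed from the trigger earlier in the thread):

    ```sql
    CREATE PROCEDURE dbo.usp_Last25Trans
        @Crib INT,
        @Item VARCHAR(30)
    AS
    BEGIN
        SET NOCOUNT ON;
        -- pass the values in as parameters; no string concatenation or quoting issues
        SELECT TOP (25) t.CribBin, t.Employee, t.Item, t.TransDate,
               t.TypeDescription, t.Quantity, t.BinQty
        FROM dbo.TRANS AS t
        WHERE t.Crib = @Crib AND t.Item = @Item
        ORDER BY t.TransDate DESC;
    END
    ```

    The job could then hand sp_send_dbmail something like @query = 'EXEC dbo.usp_Last25Trans @Crib = 1, @Item = ''ABC''' instead of storing a full select statement per row.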

    Same with the subject. Build it on the fly. If you want an historical record, it's already stored in the msdb database.

    All that being said, this really looks like something that service broker was made for.

    Google this

    Jonathan Kehayias service broker

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/

  • Michael,

    Thank you for the extra information. I have a lot of avenues to consider now, and I just need to wrap my head around some of the concepts and figure out which one I can best implement. I like the idea of service broker, I just do not have a clear enough understanding of it.

    I did comment on injecting the data into a table, mostly for follow-up action later. Our main inventory management application alerts us of events we should be concerned with, but I trend the events (which are manually acted upon). We currently insert that data into a table that the main app does not touch (in the same database, to simplify reporting). We have an in-house-built application that accesses that table and allows our staff to record their actions against each entry. We can then determine what course of action was taken.

    E.g., we received an alert that inventory was getting low, and noticed that we had ordered inventory previously but it had yet to be received. Manual investigation determined that it was shipped to the site, and we had to remind the customer to receive it. We record this action based on a few true/false actions, and then I can report on how many times we ship to a site and have to remind the customer, which is reviewed with performance metrics at regular meetings. (This is a really simplified summary, but just to give you the idea why storing the data is important.)

    What was a "how do I make this trigger work" has evolved into a complete revisit of the structure in place. I now just have to wrap my head around it and make sure I go down a path I am able to support, and that my team can work underneath.
