Exposing Result Sets to SMTP in SSIS

  • I am trying to send the result set of a query via an email IF the result set is not an empty set. Essentially, I have a couple of Execute SQL tasks that check for missing info that may cause a data integration problem upstream. I would like to put in conditional Send Mail tasks that will kick off if the parent Execute SQL task returns a result set that is not an empty set. I would also like to place that result set into the body of the email (formatting is not that important - it's going to IT folks only).

    Anyone have any ideas for this one?

  • Hope this will help:

    Create a new Execute SQL task as the first task in the control flow.

    SQLStatement: Select count(*) .......(rest of your query)..

    ResultSet: Single row

    On the Result Set tab, map the result to a new variable (varRowCount).

    Connect the green arrow to the next task. Right click the arrow and select 'Edit...'

    In the Precedence Constraint Editor, select 'Expression' as the evaluation operation. The expression is: @varRowCount > 0
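    For example, the check query might look like this (the table and column names are hypothetical; adapt them to your own schema):

    ```sql
    -- Hypothetical check: count staging accounts with no match in the ERP master.
    -- A non-zero count means the Send Mail task should fire.
    SELECT COUNT(*)
    FROM staging.Accounts AS s
    LEFT JOIN erp.Accounts AS e
        ON e.AccountCode = s.AccountCode
    WHERE e.AccountCode IS NULL;
    ```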

  • Make sure you name the result set 0 (that's zero)

  • Thanks very much!

    Now, any idea on how to get an actual result set into the body of an email? These would be simple result sets...like 3 rows showing missing accounts which could be emailed to an ERP admin for setup.

  • Try this:

    Create a variable with datatype object and name varResult

    In the general tab set ResultSet to 'Full result set'

    In the Result Set tab, add a result set named '0' and in the Variable Name column select varResult.

    In the Send Mail Task, go to the Expressions tab, click the ellipsis (...) and select the property 'MessageSource'. Click its ellipsis, and in the Expression Builder drag varResult into the expression box.

    Please let me know whether this works. At the moment I've got no means to test this myself.

  • I have had really good luck using sp_send_dbmail when I need to send query results via email. You can use T-SQL in SSIS; just make sure you grab the correct task.

    Of course, for your solution you'd want to make the email depend on the result set being non-empty, with an IF statement.

    You can even format the email with some html if you'd like, but it's not necessary.

    This page has all the info you need to use it:

    http://msdn2.microsoft.com/en-us/library/ms190307.aspx
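    As a sketch, the T-SQL might look like this (it assumes Database Mail is already configured; the profile name, table, and recipient are placeholders):

    ```sql
    -- Only send mail when the check finds rows; names below are hypothetical.
    IF EXISTS (SELECT 1 FROM staging.MissingAccounts)
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'ITAlerts',
            @recipients   = 'erp-admin@example.com',
            @subject      = 'Missing accounts detected',
            @query        = 'SELECT AccountCode, AccountName
                             FROM staging.MissingAccounts',
            @attach_query_result_as_file = 0;
    END
    ```

    The @query result is embedded in the message body; set @attach_query_result_as_file = 1 if you'd rather send it as an attachment.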

  • Thanks!

  • I am trying to send the entire result set from the SQL task to the email task using the steps mentioned in this thread. I cannot assign a variable of datatype Object in the expression window for the Send Mail task. Is there something I am missing? Can only string variables be assigned to MessageSource? If so, how does one convert a variable defined as Object to a string?

    Thank you!

  • Call me pragmatic, but why can you not export the result sets to a file with a normal data flow task? Stick a Multicast in there, which aggregates and sends the count of the results to a variable.

    Now add an expression on your control flow that invokes the send mail task only if the count is greater than zero. Add the file as an attachment.

    Tada

  • pduplessis,

    Would it be possible to explain in detail how to do the aggregate with the Multicast task and use the variable in the Send Mail task?

    Thanks!

  • hey rupalee,

    a) Add a variable with a scope of package. For argument's sake, let's call the variable varRowCount.

    b) In the data flow, add your OLE DB source.

    c) Add a Multicast, and drag the output of your OLE DB source to the Multicast.

    d) Add a destination. For argument's sake, let's make this an Excel destination.

    e) Drag the first output from the Multicast onto your Excel destination.

    f) Add an Aggregate transformation and drag the second output from the Multicast here. In the Aggregate, count the number of rows. For argument's sake, let's call the count xfmCount.

    g) Add a Script Component, and make sure you configure it as a destination. Be sure to add the variable you created in point a as a ReadWrite variable.

    After hours of searching, I came across the correct script to use (so this entitles me to at least one free beer if we were to ever cross paths...)

    Imports System
    Imports System.Data
    Imports System.Math
    Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
    Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            ' 1. Local variable declaration
            Dim RowCount As Integer

            ' 2. Set the variable from the input row
            RowCount = CInt(Row.xfmCount)

            ' 3. Lock the package variable for writing
            Dim vars As IDTSVariables90
            Me.VariableDispenser.LockOneForWrite("varRowCount", vars)

            ' 4. Set the package variable, and unlock it
            vars("varRowCount").Value = RowCount
            vars.Unlock()
        End Sub

    End Class

    ***NOTE: Microsoft does state that it is bad practice to set a variable during a transformation, as it will be set for every row that passes through the pipe. However, in this case, because we are aggregating and guaranteeing that a single row passes through, it's fine.

    TADA

    You now have the rowcount as a variable

    h) In your control flow, put a conditional expression on the precedence constraint leading to the Send Mail task:

    @varRowCount != 0

    In the Send Mail task, you can attach the spreadsheet that you created in point d.

    Trust this helps

    ~PD

  • psduplessis,

    thank you for the detailed solution.

    I also tried using the Row Count transformation instead of the Aggregate, and it seems to work too. I did not need a Script Component with Row Count, as I can assign the count directly to a variable.

  • Oops, my bad. Forgot about the rowcount transformation.

    Where I had to use the Aggregate was when I had to decide whether a run balances or not, based on the row count as well as a sum of one of the fields. In that case it made more sense to use the Aggregate.

    But for your purposes, the Row Count transformation is perfect.

    ~PD

  • Hey David,

    It is actually not that difficult.

    a) Add a variable called varCurrentDateTime, with a variable type of datetime.

    b) Add an Execute SQL task pointing to any old OLE DB connection, with a SQL statement of 'Select getdate()'. Assign the value to the varCurrentDateTime variable.

    c) I am assuming you are going to loop over your Products table or something. Please don't shoot me for the incorrect assumption, but for argument's sake, let's say you are... Add an Object variable which will hold all your product names. Let's call it varProductName.

    d) Add an Execute SQL task pointing to the OLE DB connection that selects your products from the Products table (select top 10 works for me...). Assign the result set to varProductName.

    e) Add a Foreach Loop with an ADO enumerator which will loop over your Object variable.

    f) Add a variable called varFullDateTimeFileConnectionString with a type of string.

    g) Add a Script task inside the Foreach Loop. Make sure all the variables (folder name, current date and time, product name) are set as ReadOnly, with varFullDateTimeFileConnectionString set as ReadWrite.

    h) Good luck on the .NET script. If you get stuck, please let me know. You will have to cast everything as strings, and at the very last step set your DTS variable equal to the "massaged" value

    i) Add the data flow. On the text file destination's Expressions, make the ConnectionString point to the variable created in point f. This sets your file's connection string at runtime.

    Good luck

    ~PD

  • Oops!!!! Wrong thread. Please ignore
