Save the result of a lookup task to a variable

  • Guys,

I have a dataflow task that creates a header record with a batch ID as an identity column.

    I have another dataflow task that then does a lookup for this batch ID (using the filename being processed) and writes batch records with it to a separate table.

    I would like to remember this batch ID so that I can pass it back to the user as part of an email, without having to do another lookup.

    My current solution is to multicast the lookup output, aggregate on MAX(batchID) and write the result to an ADO recordset object. My email task then sits inside a Foreach loop which iterates the recordset into an integer variable, which can then be added to the email expression. It works, but it seems a bit cumbersome. Does anyone have any tidier solutions?

  • Can you describe in a bit more detail how the batch Id is generated/retrieved?

    Is there only ever one batch Id per package execution?

    The absence of evidence is not evidence of absence.
    Martin Rees

    You can lead a horse to water, but a pencil must be lead.
    Stan Laurel

  • Phil,

    Thanks once again for coming to my rescue.

Yes, there will only be one batch per package execution.

create table dbo.batchHeader (batchID int identity(1,1), description varchar(200), filename varchar(200))

    create table dbo.batchLine (batchID int, partno int)

The import file is a pipe-delimited text file:

    123456|the batch description

    123457|the batch description

    ...

A dataflow task adds the filename as a derived column and aggregates the data (group by description and filename) to get one row, then inserts the values into the batchHeader table.

    A new dataflow task then takes the same file and does a lookup on filename and description to return the batchID (description and filename together are unique). This is then added to the dataflow so that it can be written into the batchLine table.

    The results are:

    BatchHeader: BatchID = 555 (autoallocated), Description = "the batch description", Filename = "Importfile_ABC.txt"

    BatchLine: BatchID = 555, Partno = 123456

    BatchLine: BatchID = 555, Partno = 123457

    ...

Does that help?

Within the dataflow task, I have retrieved the created header batchID, but I now need to put it into an email on the control flow. The lookup value is lost on completion of the dataflow task unless I put it in a recordset, which works, but feels untidy as I know there will only ever be one value; using a Foreach loop just to assign it to an integer variable seems like an unnecessary step.

  • aaron.reese (2/11/2013)



    Within the dataflow task, I have returned the created header batchID, but I now need to put this into an email on the control flow - but the lookup value is lost on completion of the dataflow task unless I stick it in a recordset which works, but feels untidy as I know there will only ever be one value - to then use a forEach loop to assign it to an integer variable seems like an unnecessary step.

    Hmm, not sure that I understand 100%, but maybe this would work.

Within your FOREACH loop, before doing any data flows, could you use an Execute SQL Task to insert the batch header row? If so, you could return the inserted ID directly to an SSIS variable:

insert into dbo.SomeTable (col1, col2)

    select 'x', 'y'

    select scope_identity()

    This only works if one file corresponds with one batch, of course.
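If a result set is preferred over a follow-up SELECT, the OUTPUT clause can return the new ID from the insert statement itself. A minimal sketch against the batchHeader table defined earlier (the literal values are just illustrative):

```sql
-- Returns the generated batchID as a one-row result set, which an
-- Execute SQL Task can capture into an SSIS variable via its Result Set page.
INSERT INTO dbo.batchHeader (description, filename)
OUTPUT inserted.batchID
VALUES ('the batch description', 'Importfile_ABC.txt');
```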


  • Phil,

That would work, but we try to avoid inline SQL in our packages: where we need to use SQL to upsert data, it should go through a stored procedure.

    The data that I would need to pass to the sproc is still going to be derived from the flat file, so I would still have to aggregate the data down to a single record to provide the values to the sproc.

    I think I will leave it as-is. It works, and has the benefit that if the import file has different values in the batch description, I can create two batches from the same file and the part numbers end up in the right batches.
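For anyone who does go the stored-procedure route, a minimal sketch of such a procedure (the name usp_CreateBatchHeader is hypothetical; the tables are from the DDL above):

```sql
-- Hypothetical wrapper: inserts the header row and hands the new
-- batchID back through an OUTPUT parameter. An Execute SQL Task can
-- call it as "EXEC dbo.usp_CreateBatchHeader ?, ?, ? OUTPUT" and map
-- the third parameter (Direction = Output) to an Int32 package variable.
CREATE PROCEDURE dbo.usp_CreateBatchHeader
    @description VARCHAR(200),
    @filename    VARCHAR(200),
    @batchID     INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.batchHeader (description, filename)
    VALUES (@description, @filename);

    SET @batchID = SCOPE_IDENTITY();
END
```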
