August 24, 2010 at 7:08 am
Hello all,
I have found myself in a situation where it would be very useful to execute a stored procedure for some of the rows in a data flow to retrieve some specific values.
I have been unsuccessful in achieving that, and from looking around it simply seems not to be possible in SSIS.
This question is meant to find out whether anyone has worked out how to achieve this. It is not the success/completion value that I am interested in, but the returned columns (a single-row result set). No matter what I try, I cannot get the output columns from the sp mapped into the "execute oleDb" component.
Please note that I am not posting this "to get the job done"; there are several ways to achieve the goal (a Script Component, a more complex data flow, another auxiliary data flow, a staging table...), and for whatever I needed it, the work is already done.
I stress this because people tend to say "you could do A, B or C instead", but all I want to know is whether the tool can or cannot do it. I don't want people to waste their time on that 😉
Thanks in advance,
Francisco Isla.
August 25, 2010 at 1:24 am
I guess I am out of luck then? :crying:
August 25, 2010 at 1:54 am
Francisco
I'm not aware of an Execute OLEDB task. Are you referring to the OLE DB Source? If so, what happens if you choose the SQL command data access mode and type "EXEC MyProc @MyParam = 'MyParam'"?
John
August 25, 2010 at 2:37 am
My fault. I meant to say the "OLE DB Command" transformation. It is frequently used for update statements on an individual-row basis.
Francisco.
August 25, 2010 at 2:57 am
Ah yes, the OLE DB Command transformation. I don't use this for the very reason you mentioned - it acts on each row individually and therefore isn't very efficient. But, as I understand it, you need to type EXEC MyProc ?, ? in the SqlCommand field and then map each of your input columns to one of the parameters to the stored procedure.
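For example, the pattern looks something like this (procedure, table, and column names are made up for illustration):

```sql
-- Hypothetical procedure for an OLE DB Command transformation.
-- In the transformation, set SqlCommand to:
--   EXEC dbo.MyProc ?, ?
-- then, on the Column Mappings tab, map one input column of the
-- data flow to each parameter. The command runs once per row.
CREATE PROCEDURE dbo.MyProc
    @CustomerId INT,
    @OrderDate  DATETIME
AS
BEGIN
    UPDATE dbo.Orders
    SET OrderDate = @OrderDate
    WHERE CustomerId = @CustomerId;
END
```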
John
August 25, 2010 at 3:00 am
John,
The output of the SP is the problem. Say the procedure returns a single row with two columns, like lastId int and nextId int.
That is what I am struggling to map back into the data flow.
Regards,
Francisco.
August 25, 2010 at 3:05 am
As for being inefficient, yes it is, but that depends on your case/purpose.
If, however, you only need to apply that sp to 100 rows of a DF of 20 million rows, then that is the "cheapest" way to achieve some very real goals. 😉
Now, if I could only map the return columns, I would be very happy. 😀
Francisco.
August 25, 2010 at 3:17 am
Francisco
I'm not sure I agree with you on that - if you have an operation that can be done on 100 rows at once then that's going to be more efficient than doing it on individual rows 100 times. I do understand that there are operations that can't be handled like that, so maybe your case falls into that category.
Anyway, you did say you weren't interested in other ways of doing it, so have you looked at this?
It suggests using output parameters in your stored procedure definition. I don't know whether that means the OLE DB Command transformation doesn't support result sets.
John
August 25, 2010 at 4:55 am
Yes, you are right; 99% of the time, set-based or in-memory pipeline processing is the way to go. Sometimes, though, for performance reasons the counter-intuitive approach wins. I use raw files a lot for speed (they are an order of magnitude faster at data retrieval than OLE DB selects), and (as usual) I need to preserve their order from flow to flow. So when data flows get heavy, I find it better to treat exceptions in a way that looks costly but in fact avoids things such as re-sorting the data or using unnecessary staging tables. The speed improvement can easily be tenfold in some scenarios. 😉 It is good to remember that these are exceptions to the rule, not rules in themselves.
On a more agreeable note... yes! 😀 Creating output parameters in the procedure definition works! It is quite confusing/messy at best, though... No output columns are generated for any of the output parameters. Instead, on the Column Mappings tab, all parameters (input AND output) are listed as "Available destination columns", and all columns in the data flow are shown as "Available input columns". :crazy:
That means that, if a parameter is flagged as an output, the "destination column" in fact acts as a (post-execute) source, and the "input column" becomes the destination, its value being effectively overwritten in the pipeline by the component.
As a side effect of these mechanics, if your DF does not already contain the columns (not my scenario), you will have to create them beforehand with a Derived Column component.
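To illustrate (all names are invented), a procedure with output parameters and the matching SqlCommand might look like this:

```sql
-- Hypothetical procedure exposing its results as OUTPUT parameters
-- instead of a result set.
CREATE PROCEDURE dbo.GetIds
    @Key    INT,
    @LastId INT OUTPUT,
    @NextId INT OUTPUT
AS
BEGIN
    SELECT @LastId = MAX(Id) FROM dbo.SomeTable WHERE KeyCol = @Key;
    SET @NextId = @LastId + 1;
END

-- SqlCommand in the OLE DB Command transformation:
--   EXEC dbo.GetIds ?, ? OUTPUT, ? OUTPUT
-- On the Column Mappings tab, map the data flow's Key column to @Key,
-- and map existing pipeline columns (e.g. LastId, NextId) to the two
-- OUTPUT parameters; the component overwrites their values per row.
```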
"I don't know whether that means the OLE DB Command transformation doesn't support result sets."
It doesn't (what a pity!)... and if you cannot modify your sp, you will have to wrap it in another one and expose the necessary values as output parameters.
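For instance, a wrapper along these lines (purely illustrative names) can capture the original procedure's single-row result set into output parameters via INSERT ... EXEC:

```sql
-- Hypothetical wrapper: dbo.OriginalProc returns a single-row result
-- set (LastId, NextId) that the OLE DB Command cannot consume directly.
CREATE PROCEDURE dbo.OriginalProcWrapper
    @Key    INT,
    @LastId INT OUTPUT,
    @NextId INT OUTPUT
AS
BEGIN
    DECLARE @t TABLE (LastId INT, NextId INT);

    -- Capture the result set of the unmodified procedure.
    INSERT INTO @t (LastId, NextId)
    EXEC dbo.OriginalProc @Key;

    -- Hand the values back as output parameters.
    SELECT @LastId = LastId, @NextId = NextId FROM @t;
END
```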
Thank you very much John!
Francisco.
August 25, 2010 at 5:14 am
Interesting thread guys, thanks for all the info.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin