June 18, 2007 at 9:58 am
I have some complicated data flows with many conditional split transformations. Each decision made in the conditional splits results in some data needing to be updated, inserted, or deleted.
I'm surprised not to find an insert/update/delete task available in the data flow window. I guess I'm looking for the equivalent of the "Execute SQL" task that is available in the control flow window.
I feel like I'm missing something. Do I need to jump out of the data flow back into the control flow every time I want to insert/update/delete data?
I know I can insert data using an OLE DB destination, but it's a terminator. No records come out the other side except the records that fail the insert and travel down the error output.
Am I thinking about this the wrong way?
Thanks in advance,
daustinash
June 18, 2007 at 10:13 am
You _can_ use the OLE DB Command component, but if you're dealing with many rows it will be slow, as SQL handles the rows one at a time.
If you do have many rows (>1000), I would say use a SQL Server destination on each update or delete path and then, outside the data flow, use a SQL task to join the tables and do the delete etc. It is a much faster option.
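As a sketch of that set-based approach (the table and column names here are assumptions for illustration, not from the original thread): the data flow bulk-loads the keys of rows to delete into a staging table, and an Execute SQL task then runs something like:

```sql
-- Delete target rows whose keys were bulk-loaded into the staging table
DELETE c
FROM dbo.Customer AS c
INNER JOIN dbo.Customer_Deletes AS d
    ON d.CustomerID = c.CustomerID;

-- Clear the staging table for the next run
TRUNCATE TABLE dbo.Customer_Deletes;
```

One round trip deletes all staged rows at once, instead of one statement per row.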
Cheers, Crispin
I can't die, there are too many people who still have to meet me!
It's not a bug, SQL just misunderstood me!
June 18, 2007 at 10:25 am
Thanks Crispin. I overlooked the fact that the OLE DB Command would execute SQL. It makes sense though. What else would it be for?
I hear you about it being slow though. It may not matter since the data actions are technically decided row by row anyway. It will never be operating on large sets, but it will be executed many times.
I'll return to this post and update as warranted.
June 18, 2007 at 10:34 am
Hold yer horses.....
Yes, SSIS does indeed deal with each row one at a time. Each row in a buffer that is... it is designed to do this and is fast at doing it.
But SQL is a very slow beast when it comes to row-based operations. Do not do it.
Example:
Running 1000 updates one row at a time (i.e. using the OLE DB Command) will be a LOT slower than bulk loading 1000 rows into a table and then doing a join update between the two tables.
Well, maybe not at exactly 1000 rows, but that is roughly where the performance difference starts to show. Anything greater will be slow.
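For the update case, the join update described above might look like this (the table and column names are hypothetical, chosen only to illustrate the pattern):

```sql
-- Apply all staged changes in one set-based statement,
-- joining the bulk-loaded staging table to the target table
UPDATE c
SET c.Email  = u.Email,
    c.Status = u.Status
FROM dbo.Customer AS c
INNER JOIN dbo.Customer_Updates AS u
    ON u.CustomerID = c.CustomerID;
```

The optimizer can then use a single join plan for the whole batch, which is what makes this so much faster than issuing one `UPDATE` per row from the OLE DB Command.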
Play around with the different methods and see which works for you.
Cheers, Crispin
I can't die, there are too many people who still have to meet me!
It's not a bug, SQL just misunderstood me!