October 21, 2009 at 1:51 pm
Hi Pals,
How can I import data from a CSV file and load it into a database table?
I want to use a Script Component (as a source), and I will pass the destination table name in a variable.
That means I don't know the output columns.
So how can I insert data into the output buffer in the Script Component?
Is there any good solution for this?
Thanks in advance.
October 22, 2009 at 6:46 am
I think you'll have to do it all in a script component. Read the file and then do the insert into the destination.
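Something along these lines inside a control-flow Script Task would do it. This is only a rough sketch: the variable names are placeholders to adjust to your package, it assumes the first row of the file holds column names matching the destination table, and SSIS 2005 would need the VB.NET equivalent.
[code]
// Rough sketch of a Script Task (C#) that reads a quoted CSV and inserts it
// row by row. Variable names are placeholders. Add a reference to
// Microsoft.VisualBasic for TextFieldParser.
public void Main()
{
    string filePath  = Dts.Variables["User::SourceFilePath"].Value.ToString();
    string tableName = Dts.Variables["User::DestTableName"].Value.ToString();
    string connStr   = Dts.Variables["User::DestConnectionString"].Value.ToString();

    using (var parser = new Microsoft.VisualBasic.FileIO.TextFieldParser(filePath))
    using (var conn = new System.Data.SqlClient.SqlConnection(connStr))
    {
        parser.SetDelimiters(",");
        parser.HasFieldsEnclosedInQuotes = true;   // copes with embedded commas and quotes
        conn.Open();

        // Assumes the first row holds column names that match the destination table.
        // Table and column names come straight from the inputs here; validate them in real use.
        string[] header = parser.ReadFields();
        string cols = "[" + string.Join("],[", header) + "]";
        string vals = "@p0";
        for (int i = 1; i < header.Length; i++) vals += ",@p" + i;
        string insertSql = "INSERT INTO " + tableName + " (" + cols + ") VALUES (" + vals + ")";

        while (!parser.EndOfData)
        {
            string[] fields = parser.ReadFields();
            using (var cmd = new System.Data.SqlClient.SqlCommand(insertSql, conn))
            {
                for (int i = 0; i < header.Length; i++)
                    cmd.Parameters.AddWithValue("@p" + i, (object)fields[i] ?? DBNull.Value);
                cmd.ExecuteNonQuery();
            }
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
[/code]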
Jack Corbett
Consultant - Straight Path Solutions
Check out these links on how to get faster and more accurate answers:
Forum Etiquette: How to post data/code on a forum to get the best help
Need an Answer? Actually, No ... You Need a Question
October 22, 2009 at 6:51 am
Couldn't you just use a flat file source in the data flow task, and use a script task in the control flow before the data flow task to set the connection string of the flat file connection manager?
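If the file layout is fixed and only the path changes, that Script Task can be as small as this. A sketch only: the connection manager name "SourceCSV" and the variable User::SourceFilePath are placeholders.
[code]
// Repoints an existing flat file connection manager at a new file
// before the data flow runs. Names below are placeholders.
public void Main()
{
    string newPath = Dts.Variables["User::SourceFilePath"].Value.ToString();
    Dts.Connections["SourceCSV"].ConnectionString = newPath;  // for a flat file CM this is just the path
    Dts.TaskResult = (int)ScriptResults.Success;
}
[/code]
An expression on the connection manager's ConnectionString property would achieve the same thing without any code.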
--------
[font="Tahoma"]I love deadlines. I like the whooshing sound they make as they fly by. -Douglas Adams[/font]
October 23, 2009 at 3:59 am
Hi,
Thanks for the replies.
No, I can't use a simple Flat File Source because the file is a CSV whose text columns are qualified with double quotes ("), and the values themselves contain commas and double quotes.
So I am not getting the correct columns during the transformation.
The reason is that I was asked to create a generic package that accepts a destination table name and a source file path (.csv) and inserts the data into the destination table. That means I don't know the exact columns in the source or the destination.
Can you suggest a solution?
November 16, 2009 at 10:08 pm
a2zwd (10/23/2009)
Hi, thanks for the replies.
No, I can't use a simple Flat File Source because the file is a CSV whose text columns are qualified with double quotes ("), and the values themselves contain commas and double quotes.
So I am not getting the correct columns during the transformation.
The reason is that I was asked to create a generic package that accepts a destination table name and a source file path (.csv) and inserts the data into the destination table. That means I don't know the exact columns in the source or the destination.
Can you suggest a solution?
That presents some problems. A dynamic source file and destination table are no problem in general. Where it gets tricky is that the columns in the data flow pipeline are fixed at design time, meaning they can't change at run time, either in count, in name, or in type.
With that said, you could build a .NET application that dynamically builds a package for a particular combination and then executes it, but that would be painful.
It is almost easier to write a .NET app to load the data.
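If you go that route, a bare-bones loader could look something like this. Just a sketch: it assumes the first row of the CSV holds column names that match the destination table, and it takes the file path, table name, and connection string as command-line arguments.
[code]
// Minimal stand-alone CSV loader sketch. All names and arguments are placeholders.
using System.Data;
using System.Data.SqlClient;
using Microsoft.VisualBasic.FileIO;   // add a reference to Microsoft.VisualBasic

class CsvLoader
{
    static void Main(string[] args)
    {
        string filePath  = args[0];   // path to the .csv file
        string tableName = args[1];   // destination table
        string connStr   = args[2];   // destination connection string

        var table = new DataTable();

        using (var parser = new TextFieldParser(filePath))
        {
            parser.TextFieldType = FieldType.Delimited;
            parser.SetDelimiters(",");
            parser.HasFieldsEnclosedInQuotes = true;   // copes with embedded commas and quotes

            // Header row defines the columns; everything is loaded as text.
            foreach (string col in parser.ReadFields())
                table.Columns.Add(col, typeof(string));

            while (!parser.EndOfData)
                table.Rows.Add(parser.ReadFields());
        }

        using (var conn = new SqlConnection(connStr))
        using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = tableName })
        {
            conn.Open();
            // Map by name so the column order in the file does not have to match the table.
            foreach (DataColumn col in table.Columns)
                bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);
            bulk.WriteToServer(table);
        }
    }
}
[/code]
It pulls the whole file into memory as text before the bulk copy, so very large files would need batching, but SqlBulkCopy will be much faster than row-by-row inserts.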
CEWII