May 1, 2008 at 11:19 pm
How do I loop through all the files, and which approach is the better way to proceed? Please provide any sample code. Users will provide data on a daily basis, and I need to insert/update the data from 100 txt files into 150 tables, with validations.
May 2, 2008 at 8:25 am
Assuming the files are all formatted the same way:
1. Add a Foreach Loop Task
2. In the collection section specify "Foreach File Enumerator"
3. Specify the folder where the files will exist and any file name qualifiers (prefix, suffix), e.g. *.txt
4. Select "Fully qualified" as the retrieved file name format
5. In the Variable Mappings section, create a package variable to store the current file name.
Within the Foreach Loop container:
1. Add a Data Flow Task to read in one of the source files and populate the destination table
2. Add an expression to the source file's connection manager to override its ConnectionString property with the package variable above.
This should read every matching source file in the folder and run the data flow on each one. You will need similar logic to determine which destination table you are inserting into; there is a rough sketch of the overall pattern below.
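In case it helps to see the pattern end to end, here is a minimal Python sketch of what the Foreach Loop plus data flow ends up doing: enumerate the *.txt files in a folder, work out a destination table for each one, and load it. The folder path, connection string, tab delimiter, and the table_for rule are placeholders I made up, not anything from your environment; in the package itself the loop, the expression, and the data flow replace all of this code.

```
import csv
import glob
import os

import pyodbc  # assumes the pyodbc package and an ODBC driver are available

SOURCE_DIR = r"C:\Import"   # the folder the Foreach Loop would scan (placeholder)
CONN_STR = "DRIVER={SQL Server};SERVER=.;DATABASE=Staging;Trusted_Connection=yes"

def table_for(file_path):
    # Placeholder rule: "Customers.txt" loads dbo.Customers. With 100 files
    # going into 150 tables you will likely need a proper lookup instead.
    return "dbo." + os.path.splitext(os.path.basename(file_path))[0]

with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()
    # Equivalent of the Foreach File Enumerator with the *.txt qualifier,
    # returning fully qualified file names.
    for file_path in glob.glob(os.path.join(SOURCE_DIR, "*.txt")):
        table = table_for(file_path)
        with open(file_path, newline="") as f:
            reader = csv.reader(f, delimiter="\t")   # assumes tab-delimited files
            header = next(reader)                    # assumes a header row
            placeholders = ", ".join("?" for _ in header)
            insert_sql = f"INSERT INTO {table} ({', '.join(header)}) VALUES ({placeholders})"
            cursor.executemany(insert_sql, list(reader))   # the "data flow" step
        conn.commit()
        print(f"Loaded {file_path} into {table}")
```

Validations and update-versus-insert handling would go inside that loop, which in SSIS maps to extra transformations in the data flow.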
Hope this helps!!!
May 5, 2008 at 6:20 am
Hi, thanks for your reply.
I don't have an idea of how the OLE DB Destination will work, since it needs a different table for each txt file. How do I create the connection manager for OLE DB, when the destination asks for a single table or view name?
Thanks & Regards,
Kishore
May 5, 2008 at 9:38 am
Assuming the database tables all have the same format, you will need to initially point to one of these tables in your data flow. Once you have completed this, you can create another package variable with the destination table name.
Then go back into the OLE DB Destination in the data flow and change the "Data access mode" to "Table name or view name variable". Point it to the destination package variable you created above. You will obviously need to set this variable to the correct table name on each iteration of the Foreach Loop (a rough sketch of that step is below).
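Just to show that per-iteration step in isolation, here is a rough sketch (in Python, only to illustrate the logic) of turning the current file name into the destination table name. In the package this would be an expression or a Script Task that sets the table-name variable, and the mapping below is invented for the example, not your real file/table list.

```
import os

# Hypothetical mapping; file and table names do not have to match.
FILE_TO_TABLE = {
    "customers.txt": "dbo.Customers",
    "orders.txt": "dbo.OrderHeader",
    "orderlines.txt": "dbo.OrderDetail",
}

def destination_table(current_file):
    """Return the table the current file should load, or fail loudly if unknown."""
    key = os.path.basename(current_file).lower()
    try:
        return FILE_TO_TABLE[key]
    except KeyError:
        raise ValueError(f"No destination table configured for {key}")

# The value the table-name variable would take for one iteration:
print(destination_table(r"C:\Import\Orders.txt"))   # -> dbo.OrderHeader
```

Keeping that mapping in a small control table rather than hard-coding it makes it easier to maintain 150 destinations.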
Good luck!