June 14, 2009 at 10:10 am
I am developing an SSIS package to import a number of CSV files into my database.
Each CSV file name is the same as the database table name I am trying to insert into.
I have a parent package that loops through the CSV files in a given folder.
I would like to call a generic child package that dynamically maps the CSV columns to the correct database table schema.
I would need to change the mappings at runtime for each CSV file being processed.
Can this be done?
Any help appreciated.
Regards,
Paul.
August 6, 2009 at 1:17 pm
Check out CozyRoc Data Flow Task Plus. It includes support for dynamic data flows and may help in your scenario. We would be glad to hear your feedback.
August 6, 2009 at 3:00 pm
wadep,
So far, my understanding is that there are no dynamic connections for multiple file schemas. You define the schema on the connection, and that connection is used for that file. The same applies to the SQL table.
I also don't think you will be allowed to install a custom third-party task on a QA or PROD box; maybe you can in test.
The only way I can think of is to define a connection for each CSV file type. So let's say you have 5 CSV files with different schemas; you then create five different connections, one per file, based on its schema. During the load, you can switch to the appropriate connection string based on the file name.
I can't think of any way to do this for the SQL tables, unless you have a separate data flow task for each file type, each already mapped to its particular table, and then use expressions to control which path runs.
Let me know if you find a different way to do this.
Thanks
August 6, 2009 at 3:09 pm
Gkhadka,
You would be amazed how many big companies are CozyRoc SSIS+ customers. We are quickly establishing ourselves as the go-to provider for third-party SSIS components and technology. The CozyRoc components save time and money, which is especially important in these times.
We will be happy to hear your feedback regarding the improved Data Flow Task Plus.
August 6, 2009 at 3:43 pm
CozyRoc,
I have no doubt that it is indeed a great tool, and I have always wished that my company (or any place I have been or will be) would allow me to use all of its products. Life would be easier, I agree.
I always download it on my dev machine and it helps me a lot.
But I have never used it in QA or PROD, as company policy doesn't permit it.
All I can say is: I am not against it, I want to use it, but the fact is I am not in a position (at least for now) where I can change the rules or make policy.
Thank you
August 6, 2009 at 8:01 pm
Gkhadka (8/6/2009)
CozyRoc, I have no doubt that it is indeed a great tool, and I have always wished that my company (or any place I have been or will be) would allow me to use all of its products. Life would be easier, I agree.
I always download it on my dev machine and it helps me a lot.
But I have never used it in QA or PROD, as company policy doesn't permit it.
All I can say is: I am not against it, I want to use it, but the fact is I am not in a position (at least for now) where I can change the rules or make policy.
Thank you
I have heard this before. My point was that you were talking as if most other people or companies are in your position. And this is clearly not the case.
August 6, 2009 at 9:11 pm
Have you considered using BCP? That is a great tool. You can call it from within an SSIS package for each of your source files if required.
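A minimal call might look something like this (server, database, table, and path are just placeholders, and it assumes a comma-delimited file with a header row):
REM Sketch only: load Customers.csv into MyDatabase.dbo.Customers (all names are placeholders)
REM -T trusted connection, -c character mode, -t, comma delimiter, -F 2 starts at row 2 to skip the header
bcp MyDatabase.dbo.Customers in "C:\Import\Customers.csv" -S MYSERVER -T -c -t, -F 2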
August 6, 2009 at 9:25 pm
How are you going to do the mapping of source to destination columns in BCP?
August 6, 2009 at 9:39 pm
CozyRoc (8/6/2009)
Paul_J (8/6/2009)
Have you considered using BCP? That is a great tool. You can call it from within an SSIS package for each of your source files if required.
How are you going to do the mapping of source to destination columns in BCP?
How about using a format file (which can be generated in XML/plain format and manipulated if necessary).
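Roughly along these lines (table, file, and server names are placeholders): generate the format file once from the destination table, tweak it if needed, then reference it on the load.
REM Generate an XML format file from the destination table's schema (placeholder names throughout)
bcp MyDatabase.dbo.Customers format nul -c -t, -x -f "C:\Import\Customers.xml" -S MYSERVER -T
REM Load the CSV using that format file; -F 2 skips the header row
bcp MyDatabase.dbo.Customers in "C:\Import\Customers.csv" -f "C:\Import\Customers.xml" -S MYSERVER -T -F 2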
August 7, 2009 at 2:57 pm
Dynamically building out the BCP command and calling it from SSIS is as good an option as any. Any other way you go, you are bound to needing a unique data flow for each destination table. You could build a package for each destination type and have a master package loop through the directory, determine the filename/destination table, and call the appropriate child package. This would keep the data flows for your destinations isolated, so making changes would be easy and adding new file types/destinations would also be straightforward.
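As a rough sketch of the BCP route (server, database, and folder are placeholders), assuming each CSV is named after its destination table, the command could live in a batch file that an Execute Process Task calls, or be assembled as an expression inside the package:
REM For each CSV in the folder, bcp it into the table that matches the file name
REM %%~nF expands to the file name without path or extension, e.g. Customers.csv -> Customers
for %%F in (C:\Import\*.csv) do bcp MyDatabase.dbo.%%~nF in "%%F" -S MYSERVER -T -c -t, -F 2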
August 7, 2009 at 4:19 pm
John Rowan (8/7/2009)
Dynamically building out the BCP command and calling it from SSIS is as good an option as any. Any other way you go, you are bound to needing a unique data flow for each destination table. You could build a package for each destination type and have a master package loop through the directory, determine the filename/destination table, and call the appropriate child package. This would keep the data flows for your destinations isolated, so making changes would be easy and adding new file types/destinations would also be straightforward.
John,
Data Flow Task Plus can build the data flow dynamically, on the fly, at runtime. You can provide external mapping metadata to control how the source columns get mapped to the destination, but this is not mandatory. If the source and destination columns match by name, the component will connect them automatically. BCP doesn't provide an easier solution; you are still obligated to do the mapping either manually or through programming. And, by the way, BCP doesn't offer the rich transformation capabilities that SSIS has.
I hope you have a chance to try this new and improved functionality. It is in its beta testing period, so you can influence the end product. We have designed the new component so that it does not require programming skills for the broadest possible range of scenarios.
August 10, 2009 at 9:54 am
That's great. I'm sure it works just fine, but the fact is that this problem is easily solved with tools and techniques already built into SSIS and SQL Server.
I think one of the reasons so many professionals use this site is because it is, for the most part, free from advertising. I suggest that if you want to gain exposure for your product, you work with Steve to get a product review done on your product and published on SSC as an article.
August 10, 2009 at 10:23 am
John Rowan (8/10/2009)
That's great. I'm sure it works just fine, but the fact is that this problem is easily solved with tools and techniques already built into SSIS and SQL Server.
I think one of the reasons so many professionals use this site is because it is, for the most part, free from advertising. I suggest that if you want to gain exposure for your product, you work with Steve to get a product review done on your product and published on SSC as an article.
John,
I don't want to be disrespectful, but this problem is NOT easily solved with the existing tools and techniques. The fact that the SSIS data flow is static is one of the most often raised issues in the forums. So to say this problem is already easily solved means either you don't have a good understanding of the SSIS framework, or you have an issue with CozyRoc trying to be helpful and offer solutions.
Based on what you have said in your second paragraph, I would say it is the latter.
August 10, 2009 at 11:04 am
It may just mean that my definition of 'easy' differs from yours. My post prior to your request that I try your product and provide feedback offered two separate solutions to the problem, both of which I would consider easy.
Would I use BCP for this? Probably not. Would I go to my boss and tell him that I can't do this without purchasing a third-party component? Definitely not!
Personally, I would go with a solution where the data flow for each file type is managed in an individual SSIS package, and have a master package read through the directory and call the appropriate child package. This isolates each data flow and allows for ease of maintenance, data flow customization based on file type, and scalability.