September 21, 2012 at 12:15 pm
Hello,
In a large database I wish to extract text files with the data from individual tables, as well as generate the schema for each table in Excel format. This is to enable a client that is no longer working with our company to get their data.
So, I envision this as an SSIS package that first gathers a collection of tables to export. Then a For Loop or Foreach Loop container repeats the data-flow operations: the current table name is put in a variable, a query similar to "select [field1], [field2], etc from @currenttable" is created, and the output is sent to a file. Then a second query is dynamically created (using the INFORMATION_SCHEMA.COLUMNS view) so that the package can also create an Excel file with schema information.
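Roughly, I picture the dynamic SQL being built something like this. This is only a sketch on my part; the @currenttable variable and the example table name are assumptions, and in the real package the table name would come from the loop variable:

-- Sketch only: build "SELECT [col1], [col2], ... FROM [table]" for the table
-- currently held in the loop variable (shown here as a T-SQL variable for illustration).
DECLARE @currenttable sysname = N'table1';
DECLARE @collist nvarchar(max), @sql nvarchar(max);

SELECT @collist = STUFF((
        SELECT N', ' + QUOTENAME(COLUMN_NAME)
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_NAME = @currenttable
        ORDER BY ORDINAL_POSITION
        FOR XML PATH(N'')), 1, 2, N'');

SET @sql = N'SELECT ' + @collist + N' FROM ' + QUOTENAME(@currenttable) + N';';
-- EXEC sp_executesql @sql;   -- the data-dump query whose output goes to the flat file

-- Second query: schema information for the Excel output.
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = @currenttable
ORDER BY ORDINAL_POSITION;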
The only problem I see with this is how to dynamically alter the columns in the connection manager associated with the table's data dump. I've read a lot of websites that discuss dynamically specifying columns in flat file destinations; e.g.:
http://www.sqlservercentral.com/Forums/Topic628424-147-1.aspx
However, I haven't seen anyone explain how to dynamically fill the columns for a flat file destination connection manager.
Maybe the only way it can be done is via a script task. Does anyone have an example of a script task that does everything that a flat file destination connection manager does?
thanks!
September 24, 2012 at 8:03 am
I've just encountered this thread:
http://www.sqlservercentral.com/Forums/Topic457155-148-1.aspx
This mirrors my question. Apparently, SSIS's requirement that columns be defined in advance precludes using it to dump many tables from a single database for a single client.
September 24, 2012 at 12:07 pm
Please check these articles:
http://msbi2012.blogspot.in/2012/09/expression.html
and
http://msbi2012.blogspot.in/2012/09/dynamic-creation-of-file-from-database.html
They will help you a lot.
September 26, 2012 at 6:46 am
Hi Sumit,
Thanks for the reply. However, what you are doing is different from what I wish to do.
You know in advance the fields you are selecting from the person table, whereas I am hoping to use SSIS to dump the contents of a variety of tables with different structures.
Let's say I have a database with many tables that share a common field, e.g., vendor_id. If I know in advance that I want the data for vendor_id = 444, then I can write a query that puts the name of each table that has a vendor_id field into a temp table (for example, table1, table2, ..., where each has a vendor_id field).
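Something along these lines (a rough sketch; the temp table name is just for illustration):

-- Sketch: collect the name of every table that has a vendor_id column.
SELECT TABLE_SCHEMA, TABLE_NAME
INTO #tables_to_dump
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME = N'vendor_id';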
Using this temp table (of tables to dump), I wish to use a Foreach Loop container to output the contents of every single table where vendor_id = 444. I'd like to have a generic flat file destination connection manager and dynamically load the columns for each new table before dumping its contents.
Unfortunately, SSIS doesn't allow this: it needs the columns for the flat file destination connection manager defined in advance.
Considering how common the task of making SQL dumps is, I'm surprised that SSIS doesn't have a solution for dynamically loading columns into flat file connection managers.
If I'm mistaken about anything please let me know. Again, thanks for the good information. 🙂
September 26, 2012 at 3:57 pm
I assume that the columns you are referring to vary from table to table, correct?
If this is the case, I don't think SSIS can handle dynamic "formation" of columns. You could try setting DelayValidation to True, but I don't think that will work either.
Programmatically modifying the package XML might do the trick, though.
September 27, 2012 at 6:27 am
quasar_phoenicis, yes, I was trying to find an SSIS solution in which the column definitions change from table to table.
Too bad SSIS doesn't have a mechanism to handle this kind of data dump. I don't think what I wish to do is all that unique.
Thanks for the reply.
May 14, 2016 at 7:58 am
Did you solve this? I'm surprised SSIS cannot do that - hoping something new came up since your question, but I'm stuck.