SQL Server 2014 and VS 2013 - Data Flow task, read CSV file and insert data into SQL table

  • Hello everyone,

I was assigned a work item in which I have a Data Flow task inside a For Each Loop container in the control flow of an SSIS package. The For Each Loop container reads the CSV files from a specified location one by one and populates a variable with the current file name. Note that the tables into which I would like to push the data from each CSV file have the same names as the CSV files.

    In the Data Flow task, I have a Flat File source component that uses the above variable to read the data of the current file. Now here my question comes: how can I move the data to the destination SQL table using the same variable?

    I've tried to set up the OLE DB destination component dynamically, but it executes correctly only the first time; it does not change the column mappings to match the second CSV file. There are around 50 CSV files, each with a different set of columns, and these files need to be migrated to SQL tables in the most efficient way.

Does anybody know the best way to set up the Data Flow task for this requirement?

Also, I cannot use the Bulk Insert task here because we would like to keep a log of corrupted rows.

    Any help would be much appreciated. It's very urgent.

  • ankithimmatlalshah (4/29/2015)


Does anybody know the best way to set up the Data Flow task for this requirement?

    SSIS does not support dynamic column mapping at runtime. If all of the table structures are different, you need 50 data flows.

    The absence of evidence is not evidence of absence.
    Martin Rees

    You can lead a horse to water, but a pencil must be lead.
    Stan Laurel

  • Hello Phil,

    Thank you for the quick response.

    Yes, the table structure is different for each SQL table.

Is there any way to achieve this dynamically using a Script Component in the Data Flow task?

    Thanks,

    Ankit

  • ankithimmatlalshah (4/29/2015)


Is there any way to achieve this dynamically using a Script Component in the Data Flow task?

    No. You cannot change the metadata of the data flow at runtime.

An alternative is to use a Script Task (in the control flow) with the .NET SqlBulkCopy class.
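    A minimal sketch of that approach (the folder path, connection string, and the naive comma-split CSV parser below are illustrative assumptions, not something from this thread):

    ```csharp
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class CsvBulkLoader
    {
        static void Main()
        {
            // Hypothetical folder and connection string -- adjust to your environment.
            string folder = @"C:\CsvFiles";
            string connStr = @"Data Source=.;Initial Catalog=TargetDb;Integrated Security=SSPI;";

            foreach (string file in Directory.GetFiles(folder, "*.csv"))
            {
                // The table name matches the CSV file name, as described in the thread.
                string tableName = Path.GetFileNameWithoutExtension(file);
                DataTable table = LoadCsv(file);

                using (var bulkCopy = new SqlBulkCopy(connStr))
                {
                    bulkCopy.DestinationTableName = tableName;
                    // Map by column name so each file's column set is handled on its own.
                    foreach (DataColumn col in table.Columns)
                        bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
                    bulkCopy.WriteToServer(table);
                }
            }
        }

        // Naive CSV reader: first row is the header; all values are loaded as strings,
        // which SQL Server converts on insert where the target types allow it.
        static DataTable LoadCsv(string path)
        {
            var table = new DataTable();
            string[] lines = File.ReadAllLines(path);
            foreach (string header in lines[0].Split(','))
                table.Columns.Add(header.Trim());
            for (int i = 1; i < lines.Length; i++)
                table.Rows.Add(lines[i].Split(','));
            return table;
        }
    }
    ```

    Because the metadata lives in the DataTable rather than in a Data Flow, one loop handles all 50 files with their differing column sets. Quoted fields or embedded commas would need a real CSV parser rather than String.Split.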

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Hello Koen,

    Thank you for the response.

So you mean: use a Script Task that picks up the .csv files one by one from the specified folder and uses .NET SqlBulkCopy to push the data to the SQL tables. Is my understanding correct?

    How can we extract the corrupted rows with this approach?

    Can you provide any links for .NET SqlBulkCopy?

  • No idea about the corrupted rows.

Don't know if it's possible. Maybe you can check out BCP.

First result from Google for the SqlBulkCopy class:

    SqlBulkCopy Class
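    On the corrupted-rows question from earlier: one possible sketch (again with assumed names and a naive comma-split parser) is to validate each line before it reaches the DataTable and collect the rejects for a log file:

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.IO;

    static class CsvValidator
    {
        // Splits a CSV file into a DataTable of well-formed rows and a list of
        // rejected lines (wrong field count), which can be written to a reject log.
        public static (DataTable Good, List<string> Bad) Load(string path)
        {
            var table = new DataTable();
            var bad = new List<string>();
            string[] lines = File.ReadAllLines(path);

            string[] headers = lines[0].Split(',');
            foreach (string h in headers)
                table.Columns.Add(h.Trim());

            for (int i = 1; i < lines.Length; i++)
            {
                string[] fields = lines[i].Split(',');
                if (fields.Length == headers.Length)
                    table.Rows.Add(fields);
                else
                    bad.Add($"Line {i + 1}: {lines[i]}");  // keep for the error log
            }
            return (table, bad);
        }
    }
    ```

    The Bad list can then be written out with File.WriteAllLines to a per-file reject log before SqlBulkCopy.WriteToServer is called. Rows that pass this check but fail type conversion on the server side would still need a try/catch around the bulk copy itself.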

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP
