Loop through tables and transfer

  • Hi all,

I am trying to loop through a list of tables (stored in a table in a SQL Server database) and transfer each table from one server to another. So far, this is what I have done:

    Create a table on source database with list of tables to transfer

    In SSIS 2005:

    Create object variable called Batch1Tables

    Create string variable called Batch1TableName

    Source and destination OLE DB connections (SQL Server 2005)

    Process:

    Load table names from table into variable Batch1Tables

Foreach Loop (ADO enumerator) through those tables, putting each table name into Batch1TableName

    Truncate table in destination

Data flow task has an OLE DB Source and an OLE DB Destination, both using "Table name or view name variable", which then uses Batch1TableName.

In order to get this working I had to set the variable's initial value to the first table name.

    When I run the package, the first loop through works, but the second table fails with the error:

    "The external metadata column collection is out of synchronization with the data source columns. The column "REFC" needs to be added to the external metadata column collection."

    "The column "LNAM" needs to be added to the external metadata column collection." ... this is repeated for all the first tables columns.

It seems to me that because I created the data flow with the variable set to a particular table, the mappings were made for that table, and on the second table those columns are no longer valid.

    Does anyone know a way of clearing the mappings and automatically redoing them at runtime? Has anyone done something like this before?

    Many thanks.

  • The data flow component does not work well in dynamic situations: you need to find another way.

    One is the use of dynamic SQL to do all of the DDL and DML you need.

Another may be the use of the Transfer SQL Server Objects task.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Thanks for the quick response Phil.

I initially tried using the Transfer SQL Server Objects task, but unfortunately it wouldn't work with the databases being in compatibility mode (which we need for the time being).

Will I not have the same problem using dynamic SQL? I'll give that a go, although it might be a little messy when looking at it in a few months' time (when I've forgotten what it is doing)!

  • I don't think it should be that messy - depending on your coding skills, of course 🙂

For each table in [array]
    create destination table
    copy data across
Next

Simple :-D
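Phil's pseudocode might be fleshed out like this (a Python sketch purely to show the shape of the dynamic-SQL loop; the table names, the `SourceDb` database name and the helper function are all hypothetical, and in the real package the strings would be fired with an Execute SQL Task):

```python
# Hypothetical table list; in the real package this would come from the
# metadata table on the source server.
tables = ["dbo.Customers", "dbo.Orders"]

def build_statements(table):
    """Build the DDL/DML pair for one table (sketch only)."""
    # Create an empty copy of the source table if it doesn't exist yet...
    create_sql = (
        f"IF OBJECT_ID('{table}') IS NULL "
        f"SELECT * INTO {table} FROM SourceDb.{table} WHERE 1 = 0;"
    )
    # ...then copy the data across.
    copy_sql = f"INSERT INTO {table} SELECT * FROM SourceDb.{table};"
    return create_sql, copy_sql

for t in tables:
    create_sql, copy_sql = build_statements(t)
    print(create_sql)
    print(copy_sql)
```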


You've hit the nail on the head with "depending on your coding skills"!

    Thanks again, I will give it a go.

  • Phil Parkin (3/10/2011)


    I don't think it should be that messy - depending on your coding skills, of course 🙂

For each table in [array]
    create destination table
    copy data across
Next

Simple :-D

We can achieve the same thing using a Script Task: just fetch the table name, prepare the dynamic SQL and fire it. Work done. Don't you think it will work? :-)

    __________________________________________
    ---------------------------------------------------
    Save our mother Earth. Go Green !!!

  • Ok so from the postings:

    loop through the tables doing:

1. Script Task to create dynamic SQL to create the table if it doesn't exist on the destination.

2. Run the SQL against the destination OLE DB connection.

What I am not sure about now is how to get the data from source to destination. Are we saying do dynamic SQL which does an "insert into dest select * from source", and if so, will I not have to have a linked server on the source?

  • Andrew Williams (3/10/2011)


    Ok so from the postings:

    loop through the tables doing:

1. Script Task to create dynamic SQL to create the table if it doesn't exist on the destination.

2. Run the SQL against the destination OLE DB connection.

What I am not sure about now is how to get the data from source to destination. Are we saying do dynamic SQL which does an "insert into dest select * from source", and if so, will I not have to have a linked server on the source?

    For that solution, yes, you would need a linked server.

    It would be easier if you used a statement like this:

    SELECT * INTO myServer2.mySchema2.myDestinationTable FROM myServer1.mySchema1.mySourceTable

That way the destination table will be created for you (but you'll need to make sure it doesn't exist yet).
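Koen's SELECT ... INTO statement, together with the "make sure it doesn't exist yet" guard, could be generated per table like this (a Python string-building sketch; all object names are placeholders taken from the example above):

```python
def build_select_into(dest, source):
    """Drop the destination if it already exists, then let
    SELECT ... INTO recreate it with the source table's structure."""
    return (
        f"IF OBJECT_ID('{dest}') IS NOT NULL DROP TABLE {dest};\n"
        f"SELECT * INTO {dest} FROM {source};"
    )

sql = build_select_into(
    "myServer2.mySchema2.myDestinationTable",
    "myServer1.mySchema1.mySourceTable",
)
print(sql)
```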

    Just to mess with your head, there's still another solution:

Since you already have the connection strings stored in the metadata table, you can create a script component that uses the DataReader class and the SqlBulkCopy class to transfer the data. The first one reads the data from the source; the second one bulk inserts it into the destination. It is a bit of (advanced) .NET coding. You can find an example of this solution in the excellent book SSIS - Problem Design Solution.
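The DataReader/SqlBulkCopy pattern itself is .NET-specific, but the read-stream-then-bulk-write shape can be illustrated with Python's stdlib sqlite3 standing in for the two servers (a toy sketch only, not the actual .NET API; table and column names are made up):

```python
import sqlite3

# Two in-memory databases stand in for the source and destination servers.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")

src.execute("CREATE TABLE t (id INTEGER, name TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
dst.execute("CREATE TABLE t (id INTEGER, name TEXT)")

# "DataReader" side: stream rows from the source...
reader = src.execute("SELECT id, name FROM t")
# ..."SqlBulkCopy" side: batch-insert the stream into the destination.
dst.executemany("INSERT INTO t VALUES (?, ?)", reader)
dst.commit()

print(dst.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # → 2
```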

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Hi Andrew,

If you can use third-party solutions, check the commercial CozyRoc Data Flow Task Plus. It is an extension of the standard Data Flow Task and is designed to solve exactly the type of requirement you have. It can set up the metadata of the source and destination components dynamically at runtime. The solution doesn't require programming skills.

    ---
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/

  • Andrew Williams (3/10/2011)


    Ok so from the postings:

    loop through the tables doing:

1. Script Task to create dynamic SQL to create the table if it doesn't exist on the destination.

2. Run the SQL against the destination OLE DB connection.

What I am not sure about now is how to get the data from source to destination. Are we saying do dynamic SQL which does an "insert into dest select * from source", and if so, will I not have to have a linked server on the source?

    Hi Andrew

Instead, you can use these steps. It's simple enough and doesn't need any complex process:

1. Drag and drop a Script Task.

2. Create source and destination ADO.NET connections (or access them through the connections collection).

3. Access the source table name and build an "insert into table select * from [database].dbo.table".

4. Execute this SQL string against the destination server. All you need is to write pure C# code, nothing else.

If I am correct, you can perform the whole operation inside the Script Task; no need for any source and destination components. But again, I am talking about the specific requirement of moving data.

The Script Task is useful in cases where we are unable to perform the required task using the existing tasks available in SSIS, but it can be done easily using C# and ADO.NET.

    Hope it will help.
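The four steps above can be sketched as a single function. Here the executor is stubbed out so the sketch stays self-contained (the `SourceDb` database name, the table list and the stub are all assumptions; a real Script Task would run each string through an ADO.NET SqlCommand.ExecuteNonQuery on the destination connection):

```python
def transfer_tables(table_names, execute_on_destination):
    """Steps 3 and 4 of the post: for each table, build the
    cross-database INSERT and fire it against the destination."""
    fired = []
    for table in table_names:
        sql = f"INSERT INTO {table} SELECT * FROM SourceDb.dbo.{table}"
        execute_on_destination(sql)
        fired.append(sql)
    return fired

# Stub executor standing in for the ADO.NET command; it just records
# the statements instead of running them.
log = []
transfer_tables(["Customers", "Orders"], log.append)
print(log)
```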


  • Anjan Wahwar (3/10/2011)


Instead, you can use these steps. It's simple enough and doesn't need any complex process:

1. Drag and drop a Script Task.

2. Create source and destination ADO.NET connections (or access them through the connections collection).

3. Access the source table name and build an "insert into table select * from [database].dbo.table".

4. Execute this SQL string against the destination server. All you need is to write pure C# code, nothing else.

If I am correct, you can perform the whole operation inside the Script Task; no need for any source and destination components. But again, I am talking about the specific requirement of moving data.

The Script Task is useful in cases where we are unable to perform the required task using the existing tasks available in SSIS, but it can be done easily using C# and ADO.NET.

Wouldn't it be better to use the SqlBulkCopy class in C#? It reduces the logging overhead, so it will certainly run faster.


  • Thanks for all your answers, I really appreciate it.

I could create a stored procedure to do a SELECT * INTO, but I feel the transferring of data in SSIS is quicker.

    So my question is, if I do it in a script task, will that be the same as doing it in a stored proc (in terms of speed)?

If I use the SqlBulkCopy class in C#, will that give me the speed an SSIS transfer usually gives me?

    Thanks again.

  • That is not an easy question to answer.

SqlBulkCopy is a bulk operation, but so is the OLE DB Destination with Fast Load in the data flow. Speed can depend on batch sizes and other configuration (such as triggers, constraints and indexes). However, the data flow cannot be made dynamic, while the SqlBulkCopy class can.

    SELECT INTO in a stored procedure can also be dynamic through the creation of dynamic SQL.

If the recovery model of the destination database is simple or bulk-logged (which it should be if you are transferring a lot of rows), then it is also a bulk-logged operation (see the logging behaviour in this article: http://msdn.microsoft.com/en-us/library/ms188029.aspx).

    So I really can't answer your question 🙂

    You should test it out for your specific situation.

    Or just develop the solution in the technology you feel most at home with.


  • Or just develop the solution in the technology you feel most at home with.

    Hmm, risky recommendation. Not sure I could solve this one using my iPod 🙂


  • Phil Parkin (3/11/2011)


    Or just develop the solution in the technology you feel most at home with.

    Hmm, risky recommendation. Not sure I could solve this one using my iPod 🙂

    I shall let this one slip 'cause it's Friday 😀

