Deploy package to multiple servers

  • Hi,

    We currently have several servers running SSIS packages that build tables for a data warehouse. These packages were created once, deployed to each server, and have since been extended server by server. As part of an optimization effort we want to unify the packages, so that we only have to maintain and upload a single package.

    Our first idea was to copy the T-SQL into tables that are copied from one source to all servers overnight and to execute the SQL from a stored procedure. But this would require re-creating most of the package content (there are some data flow tasks as well), keeping the column limitations (i.e. nvarchar(max)) in mind.

    The best way would be some kind of broadcast to all servers when saving / deploying a package.

    Any ideas?

    thanks in advance

    Martin

  • I might be missing something, but is there a reason you can't deploy to ONE server (an SSIS server) and dynamically change the sources/targets, instead of deploying the same package to multiple places?

  • I have not found any broadcast mechanisms in SSIS. We have packages that deploy the same data to multiple servers by running the same package but overriding the connection string properties.

  • @ Nevyn: I've already thought about that, but how would I do this?

    @ EricEyster: We also have some packages that cycle through different connections, executing the steps and overriding the connection. But in my case, we want to run the packages every day at the same time on servers in different time zones ...

  • What version of SSIS?

  • It is 2008 R2

  • Well, it is tough to get too specific without knowing what your source/destination servers are or the particulars of your package.

    But you can define expressions on a connection manager and use them to dynamically switch the source and/or destination server. If the server uses Windows authentication, you can put an expression on just the ServerName property (assuming the same account has access to all servers). If not, you can put an expression on the whole connection string.

    You reference a variable inside that expression, and then you either pass that variable in when executing the package (you would need separate jobs set up for each set of target servers), or you create a database table on the SSIS server with a list of source/destination servers (and maybe the time zone for those servers, if needed).
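
    For example, the expression on the connection manager's ConnectionString property could be built from package variables roughly like this (the variable names and provider string are only placeholders, not from the original posts):

      "Data Source=" + @[User::TargetServer] + ";Initial Catalog=" + @[User::TargetDatabase]
          + ";Provider=SQLNCLI10.1;Integrated Security=SSPI;"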

  • I think I've got it.

    I would just create a table on each server containing login information. From that table I would dynamically build the connection strings for my packages, which are all uploaded to one "master" server.

    When creating a job on a server, I will have to refer to the packages stored on the master server. Right?

    But ...

    I still have some data flow tasks. How do those behave? Is the data flow's data first transferred to the master server because the package is stored there? Or does my server just fetch the package from there?

  • Not exactly.

    I'm talking more about having the master server be where you schedule and manage all the jobs. Any data flows would pass through the SSIS server's memory before being written to the destination. Any T-SQL commands would run on the server you connect to.

    In the case of the data flow tasks, this method is not the best one for us. Our data flow tasks are exactly the ones where the most data is transferred, so the jobs would take a very long (too long) time.

    I was already thinking about a batch job that uploads the packages to multiple servers using DTUTIL.exe.

    But at the moment I have no clue how to do that.

    So far, thanks for your replies.

  • Out of interest, how many servers?

    Regarding DtUtil, what is your issue exactly? Syntax? Cross-domain issues? How to run it? Something else?
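
    If it is just the syntax, a single copy of a package from the file system into the MSDB package store on one target server looks roughly like this (the server, path and package names are placeholders):

      dtutil /FILE "\\MASTERSERVER\Packages\LoadWarehouse.dtsx" /DestServer TARGETSERVER /COPY SQL;LoadWarehouse /Quiet

    A batch file could simply repeat that call once per target server, or loop over your list of servers.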


  • Hi Phil,

    27 at the moment, and the number is increasing.

    The most comfortable solution would be a table stored on a master server, from which the batch file fetches all the information about which packages should be copied to which servers. There are 2 or 3 servers that receive slightly modified packages.

  • How's your PowerShell? That would be one way.

  • Never used it. But if you think it will solve my problem, I will 😉

  • Certainly a good opportunity for you to learn. I am no PoSh guru by any means, but I think it will do the job very nicely.

    If you want a free PoSh IDE to help you along the way, there is one here which seems decent enough to me.
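
    As a rough sketch of that idea, a PowerShell script could read the package-to-server mapping from a table on the master server and push each package out with DtUtil. Everything here (server names, database, table and column names, paths) is made up for illustration:

      # Connect to the master server and read which package goes to which target server.
      # The table name and columns are hypothetical - adjust to your own deployment table.
      $conn = New-Object System.Data.SqlClient.SqlConnection `
          "Server=MASTERSERVER;Database=SSISAdmin;Integrated Security=SSPI;"
      $conn.Open()

      $cmd = $conn.CreateCommand()
      $cmd.CommandText = "SELECT PackagePath, PackageName, TargetServer FROM dbo.PackageDeployment"
      $reader = $cmd.ExecuteReader()

      while ($reader.Read()) {
          $packagePath  = $reader["PackagePath"]     # e.g. \\MASTERSERVER\Packages\LoadWarehouse.dtsx
          $packageName  = $reader["PackageName"]
          $targetServer = $reader["TargetServer"]

          # Copy the .dtsx file into the MSDB package store of the target server.
          & dtutil /FILE "$packagePath" /DestServer "$targetServer" /COPY "SQL;$packageName" /Quiet

          if ($LASTEXITCODE -ne 0) {
              Write-Warning "Deploying $packageName to $targetServer failed (dtutil exit code $LASTEXITCODE)."
          }
      }

      $reader.Close()
      $conn.Close()

    The 2 or 3 servers that need slightly modified packages could be handled by pointing their rows at a different .dtsx file.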
