February 23, 2009 at 4:31 pm
Hi all,
I searched through the forum a bit, but I couldn't find anything concrete that would help me out.
Some background:
At my company, my team is responsible for managing about 110 instances (about 85 of them production). We have our own administrative server where we keep track of the servers we support, the instances, databases, etc. I have an SSIS package that will loop through instances, update connection managers, and execute statements (e.g. each morning we collect DB size information, last backup date/time, etc., all through an SSIS package via a scheduled job).
Current situation:
I recently put together a few packages: one for a weekly reindex (rebuild or reorganize), and one for DBCC CHECKDB (full and physical-only). Within the packages, I write the status (DBCC failed, succeeded, etc.) to a local copy of a database called "DBA", which I then fold into my daily/weekly data-gathering pull. An SSRS report then caps off the full solution to show us which DBs haven't had a DBCC check run recently, which ones failed, and so on.
The problem:
Is there an easy way to deploy the package to a large number of systems and update the connection manager within the package along the way? I've looked at dtutil, but that only deploys the package; I'd need the connection manager updated within the package before it gets deployed. Really, I'd like to adapt my existing SSIS package with the ForEach container to loop through each 2005 instance we have, drop the package if it exists, and re-deploy the updated package.
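Roughly the kind of thing I'm picturing for that loop, if it helps clarify the question (an untested sketch using the managed DTS runtime, Microsoft.SqlServer.ManagedDTS, instead of dtutil; the server names, file path, and connection manager name are made up):

' Untested sketch: load the master copy of a package from disk, repoint its
' connection manager at the target instance, and save it to that instance's
' MSDB store. All names and paths below are made up.
Imports Microsoft.SqlServer.Dts.Runtime

Module DeployToInstances
    Sub Main()
        Dim app As New Application()
        Dim targets() As String = New String() {"SRV01\PROD1", "SRV02"}  ' would come from our admin server

        For Each instanceName As String In targets
            Dim pkg As Package = app.LoadPackage("C:\SSIS\WeeklyReindex.dtsx", Nothing)

            ' Update the connection manager before the package is deployed.
            pkg.Connections("Instance_Connection").ConnectionString = _
                "Data Source=" & instanceName & ";Initial Catalog=master;Provider=SQLNCLI;Integrated Security=SSPI;"

            ' Save the customized copy into MSDB on the target instance.
            app.SaveToSqlServer(pkg, Nothing, instanceName, Nothing, Nothing)
        Next
    End Sub
End Module

It looks like Application.ExistsOnSqlServer and RemoveFromSqlServer would handle the drop-if-exists piece, but I haven't tried any of this yet.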
Sorry for the long post, but I figured all of the information would be relevant to anyone trying to help.
Thanks in advance,
Steve
February 26, 2009 at 7:33 am
Is there a reason you can't use SSIS configurations in the package? If you save the connection string information in a configuration XML file, you can deploy the file to each instance with separate connection information on each. You can also use BIDS to create a deployment manifest that will deploy the configuration file along with all of your packages. You would just copy the deployment package to each instance, execute it, and then update the config file.
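If you ever need to drive this from code instead of letting dtexec pick the file up, the managed runtime can apply a configuration file as well. A rough, untested sketch (package and config file names are made up):

' Untested sketch: apply an instance-specific XML configuration file to a
' package at run time and then execute it. Normally dtexec's /ConfigFile
' switch (or a fixed config path on each box) does this for you.
Imports Microsoft.SqlServer.Dts.Runtime

Module RunWithConfig
    Sub Main()
        Dim app As New Application()
        Dim pkg As Package = app.LoadPackage("C:\SSIS\DBCC_CheckDB.dtsx", Nothing)

        ' Pull the connection strings (and anything else configured) from the local file.
        pkg.ImportConfigurationFile("C:\SSIS\Config\DBCC_CheckDB.dtsConfig")

        Dim result As DTSExecResult = pkg.Execute()
        Console.WriteLine(result.ToString())
    End Sub
End Module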
February 26, 2009 at 2:42 pm
Thanks for the suggestion! I'll take a look at package configurations; I haven't done too much with them. I'm assuming I could loop through a list of servers and, within a Foreach Loop, have the first step update the configuration and have the connection managers read from it.
At the least, it's something to try that I haven't really looked at.
Steve
March 2, 2009 at 11:24 am
After looking over package configurations, they were useful to learn, but they would probably be a better fit for another situation.
However, I ran into issues with generalizing the location of the XML file on clusters and named instances.
The solution that I came up with, for anyone who is interested:
1. Created the SSIS package with a connection manager named "Instance_Connection"
2. Created a package variable called "instance_name" with package scope.
3. Created a Script Task that uses the contents of the "instance_name" variable to set the instance connection dynamically at run time (this could probably also be done via an expression on the connection manager); see the sketch after this list.
4. In the job step that calls the package, added a Set Values entry for \Package.Variables[instance_name].Value and set it to the name of the target instance.
5. When the job runs, the value supplied in the "Set Values" step updates the package variable, which in turn updates the connection manager within the package, so the rest of the package runs against the instance provided.
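The Script Task in step 3 boils down to a few lines like the following (a sketch from memory rather than the exact code; it assumes "instance_name" is listed in the task's ReadOnlyVariables and that "Instance_Connection" is an OLE DB connection manager; the catalog and provider shown are just examples):

Public Sub Main()
    ' Instance name supplied by the job's "Set Values" entry (step 4).
    Dim instanceName As String = CStr(Dts.Variables("instance_name").Value)

    ' Rebuild the connection string around the instance that was passed in.
    Dts.Connections("Instance_Connection").ConnectionString = _
        "Data Source=" & instanceName & ";Initial Catalog=master;Provider=SQLNCLI;Integrated Security=SSPI;"

    Dts.TaskResult = Dts.Results.Success
End Sub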
This lets me create "generic" packages that I want to run on all of my instances, use dtutil to push the packages out, and then create the jobs to run them. Each job on each system obviously needs to set the instance_name variable to the correct value.
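For the dtutil push, something along these lines should work, looped over the same instance list we keep on the admin server (untested; the file path and the MSDB folder/package names are made up):

' Untested sketch: copy the same .dtsx into MSDB on each target instance by
' shelling out to dtutil. Instance list, file path, and destination path are
' made up for the example.
Imports System.Diagnostics

Module PushPackages
    Sub Main()
        Dim targets() As String = New String() {"SRV01\PROD1", "SRV02"}

        For Each instanceName As String In targets
            Dim args As String = _
                "/FILE ""C:\SSIS\WeeklyReindex.dtsx"" " & _
                "/DestServer """ & instanceName & """ " & _
                "/COPY SQL;""Maintenance\WeeklyReindex"" " & _
                "/QUIET"  ' /QUIET suppresses the overwrite confirmation prompt

            Using proc As Process = Process.Start("dtutil.exe", args)
                proc.WaitForExit()
            End Using
        Next
    End Sub
End Module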
Hopefully someone benefits from this info.
Steve