June 12, 2012 at 4:12 am
I was wanting some advice. We have a senior DBA at the company who believes that having a separate instance for all SSIS packages (i.e. just Integration Services installed, no DB engine, etc.) is better practice than having SSIS installed on all SQL instances, each of which then executes the packages required for that instance's databases.
Can I have your opinions please? I can see the pros and cons of both, but would like some advice from people with more experience than myself.
June 12, 2012 at 4:19 am
SSIS is not instance-aware; it is server-aware. When you connect to an SSIS server in SSMS, the service is only used to query MSDB and the file system to list the packages stored there, so you will need to make sure the SSIS XML configuration file has multiple entries pointing at the various file systems and MSDB databases of the servers where your SSIS packages are stored.
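For example, the service configuration file (MsDtsSrvr.ini.xml by default) can expose several stores at once; the folder and server names below are just placeholders:

  <?xml version="1.0" encoding="utf-8"?>
  <DtsServiceConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema"
                           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <StopExecutingPackagesOnShutdown>true</StopExecutingPackagesOnShutdown>
    <TopLevelFolders>
      <!-- MSDB package stores on two different servers (placeholder names) -->
      <Folder xsi:type="SqlServerFolder">
        <Name>MSDB - ServerA</Name>
        <ServerName>ServerA</ServerName>
      </Folder>
      <Folder xsi:type="SqlServerFolder">
        <Name>MSDB - ServerB</Name>
        <ServerName>ServerB</ServerName>
      </Folder>
      <!-- A file-system package store -->
      <Folder xsi:type="FileSystemFolder">
        <Name>File System</Name>
        <StorePath>..\Packages</StorePath>
      </Folder>
    </TopLevelFolders>
  </DtsServiceConfiguration>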
You do not need the SSIS service installed to execute SSIS packages.
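For instance, a job step or a command line can call dtexec directly; the package names, path, and server here are made up:

  rem Run a package stored on the file system (hypothetical path)
  dtexec /FILE "C:\Packages\LoadSales.dtsx"

  rem Run a package stored in MSDB on a remote server (hypothetical names)
  dtexec /SQL "\LoadSales" /SERVER "ServerA"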
So in essence, the SSIS service is only there for SSMS clients. The only benefit of having it installed on every server is that you don't have to mess about with the XML configuration file to see where packages are.
You also introduce a single point of failure for SSIS packages if you store them in the file system on that one server, and if you store them in MSDB you will need to install the DB engine on that one server as well. So if you go down the single-server route, make sure you save all your solutions and back up the file system / MSDB.
Personally, I see no advantage in doing this: when you execute a package, the local server still issues dtexec on the local host and references the remotely stored package, so you're just introducing more network traffic into the environment.
June 12, 2012 at 1:10 pm
I come down on the opposite side of the fence from Anthony on this, for a few reasons.
First is memory usage. SSIS runs in its own memory space outside the server cache, so on systems where the database engine itself is heavily loaded, large SSIS processes can end up starved for memory. Pushing SSIS off to its own node (box, VM, whatever) can mitigate these problems as well as give you more control.
Second is organization. If you know that all the jobs on a particular server are maintenance-based, then only the DBAs will ever need access to them. However, once you start mixing in data jobs and the like, you need to give devs production access for troubleshooting, and the problem with that is that SQL Agent and SSIS both require fairly high-level access. Giving them rights on a dedicated SSIS machine is much less intrusive.
Also under organization: yes, you have a single point of failure, but you ONLY have a single point of failure. If that machine stops emailing for some reason, you will usually know pretty quickly. Scrolling a job list on a single machine for all known data issues is much simpler than testing logins (and chasing the additional failures, email being something SQL 2K5 is known to have issues with) on every machine and figuring out what else has gone horribly wrong.
Continuing with organization, it makes QA and production rollouts a lot easier to have everything centralized. Part of this, again, is security. Another part is that the SQL Agents on your main machines, the ones doing maintenance, can have higher-level access than the one running off your dedicated SSIS instance/node. You can directly control how much damage a mistake from dev can do when the SSIS SQL Agent login only has datareader, while the local maintenance SQL Agent has DBO so it can run reindexes without worrying.
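As a rough sketch of that split (the login and database names here are made up), the account the SSIS box's jobs connect as can be held to read-only on the product databases:

  -- Hypothetical login used by jobs running on the dedicated SSIS server
  CREATE LOGIN [DOMAIN\SSIS_SQLAgent] FROM WINDOWS;
  GO
  USE ProductDB;  -- made-up product database
  GO
  CREATE USER [DOMAIN\SSIS_SQLAgent] FOR LOGIN [DOMAIN\SSIS_SQLAgent];
  -- Read-only membership: a mistaken package can read but not change anything
  EXEC sp_addrolemember 'db_datareader', 'DOMAIN\SSIS_SQLAgent';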
Finally, all those data-manipulation packages are in a single place. When I make schema modifications (or similar) to some product, I have a single place to go to review the packages that may touch it. In a free-for-all where SSIS packages are stored and run from all different machines, I'd need to check every one in case something was pulling from or pushing to a foreign source or destination.
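That review can even be a quick query against the package store. A sketch, assuming MSDB storage on SQL 2008 (on 2005 the table is sysdtspackages90 instead, and the table name being searched for is made up):

  -- Find MSDB-stored packages whose definition mentions a given table
  SELECT [name]
  FROM msdb.dbo.sysssispackages
  WHERE CAST(CAST(packagedata AS varbinary(max)) AS varchar(max))
        LIKE '%dbo.Orders%';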
There are a lot of pluses to a centralized SSIS server, to my mind. It's not a good idea in all cases, of course, but if you're maintaining 20+ servers and want to be able to control security access and memory issues, it's very useful.
The only con I have is that since the memory/transformation work is occurring on another machine, it's one more network transfer that has to happen at either the source or the destination.
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA