October 10, 2019 at 6:21 pm
We're tinkering with an approach where we pull DEV vs. PROD environment variables from a database, rather than having one *.dtsx file with hard-coded values for dev and a separate one for prod. So, mostly, our connection managers use the Expression Builder and point to variables to populate ServerName and InitialCatalog, and those variables are overwritten at runtime with values based on whether the *.dtsx is run from a dev or prod environment.
In order to compile our *.dtsx files, we populate these variables with hard-coded values (e.g., sInstanceName = "DEV01"). What seems to be happening is that even though we're successfully pulling the updated values from the database, the connection managers appear to be verifying connectivity prior to the step where we overwrite the variables underpinning the ServerName and InitialCatalog properties.
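For reference, the kind of per-environment lookup we're describing could be sketched like this (the table and column names here are made up for illustration, not our actual schema):

```sql
-- Hypothetical per-environment config table
CREATE TABLE dbo.EnvironmentConfig (
    EnvironmentName varchar(10)  NOT NULL,  -- 'DEV' or 'PROD'
    SettingName     varchar(50)  NOT NULL,  -- e.g. 'sInstanceName'
    SettingValue    varchar(200) NOT NULL,
    CONSTRAINT PK_EnvironmentConfig PRIMARY KEY (EnvironmentName, SettingName)
);

-- At runtime, the package pulls the values for whichever environment it's in
SELECT SettingName, SettingValue
FROM dbo.EnvironmentConfig
WHERE EnvironmentName = 'PROD';
```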
This overall design model works in our dev environment, as we're hard-coding the dev values into the *.dtsx file. When we move the *.dtsx project to prod, though, the jobs are failing on a database connectivity check. And if you check the SQL Server error log in the development db environment, you can see the failed login from the production SQL Agent service account.
Is there a way to defer this db check at runtime until after the variables are retrieved from the database?
Thanks,
--=Chuck
October 10, 2019 at 6:36 pm
Your approach seems more cumbersome than necessary. There's rarely any need to hard-code environment-specific parameters - just configure them to pick up the values of SSISDB environment variables.
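In SSISDB that looks roughly like the following (folder, project, and parameter names here are placeholders — substitute your own):

```sql
-- Create a PROD environment and a variable in it
EXEC SSISDB.catalog.create_environment
    @folder_name = N'ETL', @environment_name = N'PROD';

EXEC SSISDB.catalog.create_environment_variable
    @folder_name = N'ETL', @environment_name = N'PROD',
    @variable_name = N'ServerName', @data_type = N'String',
    @sensitive = 0, @value = N'PROD01';

-- Reference the environment from the project...
DECLARE @ref_id bigint;
EXEC SSISDB.catalog.create_environment_reference
    @folder_name = N'ETL', @project_name = N'MyProject',
    @environment_name = N'PROD', @reference_location = N'R',
    @reference_id = @ref_id OUTPUT;

-- ...and bind the project parameter to the environment variable
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type = 20, @folder_name = N'ETL', @project_name = N'MyProject',
    @parameter_name = N'ServerName', @parameter_value = N'ServerName',
    @value_type = 'R';
```

Deploy once, and each environment supplies its own values — no recompiling, no string literals in the package.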
The absence of evidence is not evidence of absence.
Martin Rees
You can lead a horse to water, but a pencil must be lead.
Stan Laurel
October 10, 2019 at 10:24 pm
I probably described this poorly; "hard-coding" was the best term I could come up with. I suppose saying that we "put in a string literal" works as well.
But what ended up fixing our issue is the "DelayValidation" property, which has to be enabled on items that use a connection manager, in order to keep SSIS from attempting to validate against the string literals before those values are overwritten by the initial Script Task in the package.
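For anyone hitting the same thing, the overwrite step looks roughly like this. This is only a sketch: it runs inside the SSIS Script Task host (so it isn't standalone code), and the "ConfigDB" connection manager, table, and variable names are hypothetical, not our actual ones. With DelayValidation = True on the downstream tasks and connection managers, the expression-driven connection strings aren't validated until after this task has run:

```csharp
// Inside the Script Task's ScriptMain (assumes an ADO.NET connection manager
// named "ConfigDB" and ReadWriteVariables such as User::sInstanceName --
// all hypothetical names used for illustration)
public void Main()
{
    var conn = (System.Data.SqlClient.SqlConnection)
        Dts.Connections["ConfigDB"].AcquireConnection(Dts.Transaction);

    using (var cmd = new System.Data.SqlClient.SqlCommand(
        "SELECT SettingName, SettingValue FROM dbo.EnvironmentConfig " +
        "WHERE EnvironmentName = @env", conn))
    {
        cmd.Parameters.AddWithValue("@env",
            Dts.Variables["User::sEnvironment"].Value);

        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // Overwrite the string-literal defaults compiled into the package
                Dts.Variables["User::" + rdr.GetString(0)].Value = rdr.GetString(1);
            }
        }
    }

    Dts.Connections["ConfigDB"].ReleaseConnection(conn);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```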
Cumbersome is definitely one way to look at it, but we have our jobs catalogued in a system built in-house, and we wanted these values to be controllable by someone other than a person with access to the SQL Agent. Admittedly, they may not be comfortable changing some of the values, but we store others, like "email contact", which they can change without any major issues.
--=Chuck
November 8, 2019 at 3:13 pm
Just as a quick note, we do something similar, but we use the configuration manager to store the server and initial catalog values.
That way, it doesn't matter which server we're pointed at in SSDT, as the values are always valid.
We have DEV and PROD environments; we just change the config manager to run tests and deploy to the correct server as necessary.
Works well for us.