January 31, 2017 at 11:07 am
Hi Jeff,
Thank you so much for this post and the script. It saved me tons of time, for sure.
I had to make two small changes to your script, as it wasn't working quite the way I needed:
- Instead of several PRINTs, I collect the generated SQL in a temporary table and SELECT it at the end. This gets around the PRINT size limit; just copy the result and paste it into a new query window. (A minimal sketch of the pattern is below.)
- When the environment variable type is "nvarchar", declare the local variable in the script as "nvarchar(1000)".
Please see attached my version of your script.
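For the general shape of that first change, here is a minimal, untested sketch of the temp-table pattern; the #script table, its columns and the placeholder INSERT statements are illustrative names only, not the attached script itself:
IF OBJECT_ID('tempdb..#script') IS NOT NULL DROP TABLE #script;
CREATE TABLE #script (line_id int IDENTITY(1,1), sql_text nvarchar(max));
-- Append each generated fragment instead of PRINTing it
INSERT INTO #script (sql_text) VALUES (N'DECLARE @ReturnCode int;');
INSERT INTO #script (sql_text) VALUES (N'-- ...generated EXEC statements go here...');
-- Return everything at the end; copy the sql_text column into a new query window
SELECT sql_text FROM #script ORDER BY line_id;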
Thank you and keep up the great work,
Miguel Oliveira
March 24, 2017 at 7:00 am
I had to make a correction, too. I just want to emphasize that using nvarchar without a length leads to the configured values being truncated to a single character, so this is not just an improvement but a necessity.
(This is the SELECT statement that starts at line 49 of the original script.)
SELECT [env_var] = @tab + @tab + ', @' + ev.name + ' ' + ev.base_data_type
+ CASE ev.type WHEN 'String' THEN ' (1000) ' ELSE '' END
+'= N''' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ''''
, [name] = ev.name
INTO #env_var
FROM [SSISDB].[catalog].[folders] f
INNER JOIN [SSISDB].[catalog].[environments] e ON f.folder_id = e.folder_id
INNER JOIN [SSISDB].[internal].[environment_variables] ev ON e.environment_id = ev.environment_id
WHERE (f.name = @folder) AND (e.name = @env);
April 3, 2017 at 1:38 pm
Thanks, Jeff! This is a great article! Question, though. Once you have duplicated your environment variables across servers, you still need to map each of your project/package parameters to those environment variables. Do you know of an easy way to script out and copy the mappings of projects to environment variables? I have dozens of projects, each with a dozen or more mappings to environment variables. These projects need to be migrated to a new server. I would like to avoid the tedious task of remapping all of the variables if possible. Thanks!
June 2, 2017 at 12:35 pm
ichv - Friday, March 24, 2017 7:00 AM
You can simplify this further by just using sql_variant since that is what the stored procedures expect anyway. The procedure as-written also does not support environment variables with spaces in their names. This will fix both.
SELECT [env_var] = @tab + @tab + ', @' + REPLACE([ev].[name], ' ', '_') + ' sql_variant = N'''
+ ISNULL(CONVERT(varchar(MAX), [ev].[value]), '<REPLACE_ME>') + '''',
[name] = [ev].[name]
INTO [#env_var]
FROM [ssisdb].[catalog].[folders] [f]
INNER JOIN [ssisdb].[catalog].[environments] [e] ON [f].[folder_id] = [e].[folder_id]
INNER JOIN [ssisdb].[internal].[environment_variables] [ev] ON [e].[environment_id] = [ev].[environment_id]
WHERE ( [f].[name] = @folder ) AND ( [e].[name] = @env );
The code that generates the procedure calls will have to be changed also:
+ @tab + ', @value=@' + REPLACE([ev].[name], ' ', '_') + @cr + @tab + @tab + ', @data_type=N''' + [ev].[type] + '''' + @cr + @tab
March 2, 2018 at 5:57 am
I enjoyed reading this post again.
It's somewhat disappointing that we're still in a position where it's nearly impossible to have SSISDB environment variables and their mappings to project parameters scripted in such a way that they can run as part of a CI process ... an SSISDB post-deployment script.
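At a minimum, the environment-creation step can be made rerunnable so a generated script does not fail on a second deployment. A minimal sketch, assuming @folder and @env are the same variables used in the script above:
IF NOT EXISTS (SELECT 1
               FROM [SSISDB].[catalog].[environments] e
               INNER JOIN [SSISDB].[catalog].[folders] f ON f.folder_id = e.folder_id
               WHERE f.name = @folder AND e.name = @env)
BEGIN
    -- Create the environment only if it does not already exist
    EXEC [SSISDB].[catalog].[create_environment]
         @folder_name = @folder,
         @environment_name = @env;
END;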
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
March 2, 2018 at 10:04 am
Hmm... in my last job where I did SSIS work, we had .dtsConfig files with database connections for all our database servers and databases, one .dtsConfig file per database. They're stored in the same local folder (say, C:\SSISConfigs), with the same file names, on all the SSIS servers - dev, IT, prod, etc. So the config files for AdventureWorks were named just that - "AdventureWorks.dtsConfig", not "AdventureWorksDev.dtsConfig". The .dtsConfig file name needs to be the same regardless of environment, since the package can't selectively load from different .dtsConfig files on its own; the important information (server name, actual database name, trusted security vs. username/password, etc.) is within the .dtsConfig file anyway.
As developers, we all have our local copies (for the dev and test servers, of course) in the same folder. The server gods do not grant developers file system access to the SSIS servers, so any secret sauce stored in those .dtsConfig files remains protected, at least from us - developers can't copy the prod versions to their local computers. We have an in-house application that we use to push our SSIS packages (by way of a proxy account) to our SSIS servers. [Yes, a malevolent SSIS developer could deploy a package to prod that copies the prod .dtsConfig files off that SSIS server...]
As SSIS package developers, we also use flavors of those .dtsConfig files locally, stored in the same local directory structure. By doing this, we can seamlessly deploy SSIS packages to different servers, whether deployed to the file system or the SSIS database, and it all Just Works with zero post-deployment configuration, since the SSIS packages look for .dtsConfig files in the same directory structure. (Remember, SSIS packages don't really do relative folder paths... for anything. That's why BIDSHelper has its "fix relative paths" functionality.)
Our deploy tool also freed the system admins from having to set up their system jobs to get the package configurations right in the DTEXEC command lines, since the packages themselves already had that built in.
Since Visual Studio caches the configuration file information when the package is initially opened anyway, I could live with the "open package with dev flavor of .dtsConfig files, make changes, close project, copy IT/QA/etc. .dtsConfig files to C:\SSISConfigs, reopen project & package" cycle.
Previous experience working with system admins to get production configurations just right was always a frustrating time sink for all involved, especially when trying to write the deployment/configuration steps for the system admins to follow.
YMMV.
March 2, 2018 at 10:10 am
corey lawson - Friday, March 2, 2018 10:04 AM
OK, but what point are you trying to make?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
March 2, 2018 at 10:18 am
Corey, I understand what you're saying. However, the scenario you're describing uses configuration files; SSIS environments and environment variables, under the project deployment model, were meant to be an improvement over using config files. Can you still deploy your packages the old way and use configuration files? Yes, that works perfectly fine, but there are several advantages to using the project deployment model and SSIS environments/variables.
Thanks,
Jeff
March 2, 2018 at 10:20 am
Thanks Phil! I'm glad you enjoyed it.
March 2, 2018 at 2:55 pm
Absolutely great scripts. However, if we add the following code, we can also script out the references from project and package parameters to environment variables:
SET @sql += @tab + '/*************************************************************' + @cr;
SET @sql += @tab + @tab + 'References Package Parameters to Environment Variables' + @cr;
SET @sql += @tab + '**************************************************************/' ;
PRINT @sql;
/* Generate the parameter reference creation */
DECLARE @parameter_id bigint; -- holds the current row's id in the PRINT loop below
SELECT [cmd] = @tab + 'RAISERROR(''Referencing variable: ' + p.parameter_name + ' ...'', 10, 1) WITH NOWAIT;' + @cr
+ @tab + 'EXEC @ReturnCode = [SSISDB].[catalog].[set_object_parameter_value]' + @cr
+ @tab + @tab + '@object_type=' + CONVERT(nvarchar(5), p.object_type) + @cr
+ @tab + @tab + ', @parameter_name=N''' + p.parameter_name + '''' + @cr
+ @tab + @tab + ', @object_name=N''' + p.object_name + '''' + @cr
+ @tab + @tab + ', @folder_name=N''' + f.name + '''' + @cr
+ @tab + @tab + ', @project_name=N''' + p2.name + '''' + @cr -- @project_name is required by set_object_parameter_value
+ @tab + @tab + ', @value_type=N''' + p.value_type + '''' + @cr
+ @tab + @tab + ', @parameter_value=N''' + p.referenced_variable_name + '''' + @cr
+ @tab + 'IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback;' + @cr
, [parameter_id] = p.parameter_id
INTO #cmd1
FROM [SSISDB].[internal].[object_parameters] p
INNER JOIN [SSISDB].[internal].[projects] p2 ON p.project_id = p2.project_id
INNER JOIN [SSISDB].[internal].[folders] f ON p2.folder_id = f.folder_id
WHERE p.value_type = 'R' AND f.name = @folder;
/* Print out the generated parameter reference calls */
WHILE EXISTS (SELECT TOP 1 1 FROM #cmd1)
BEGIN
SELECT TOP 1 @sql = cmd, @parameter_id = parameter_id FROM #cmd1 ORDER BY parameter_id;
PRINT @sql;
DELETE FROM #cmd1 WHERE parameter_id = @parameter_id;
END;
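The project-to-environment references themselves could be generated in a similar way. The following is an untested sketch that reads [SSISDB].[catalog].[environment_references] and emits catalog.create_environment_reference calls; it assumes the same @tab, @cr, @ReturnCode and @folder variables as above, and @ref_id would need to be declared as bigint once at the top of the generated script:
SELECT [cmd] = @tab + 'EXEC @ReturnCode = [SSISDB].[catalog].[create_environment_reference]' + @cr
+ @tab + @tab + '@folder_name=N''' + f.name + '''' + @cr
+ @tab + @tab + ', @project_name=N''' + p.name + '''' + @cr
+ @tab + @tab + ', @environment_name=N''' + r.environment_name + '''' + @cr
+ @tab + @tab + ', @reference_type=''' + r.reference_type + '''' + @cr
-- absolute ('A') references also need the folder that holds the environment
+ CASE WHEN r.reference_type = 'A' THEN @tab + @tab + ', @environment_folder_name=N''' + r.environment_folder_name + '''' + @cr ELSE '' END
+ @tab + @tab + ', @reference_id=@ref_id OUTPUT;' + @cr
FROM [SSISDB].[catalog].[projects] p
INNER JOIN [SSISDB].[catalog].[folders] f ON f.folder_id = p.folder_id
INNER JOIN [SSISDB].[catalog].[environment_references] r ON r.project_id = p.project_id
WHERE f.name = @folder;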
Thank You
April 19, 2018 at 5:38 am
jordan.jeffrey - Friday, March 2, 2018 7:35 PM
Mohammad,
Thanks for the reply and I appreciate the code improvement. I'll have to see if I can incorporate your suggestion and update the scripts in the article. You are, of course, free to use and modify the scripts to fit your needs.
Thanks,
Jeff
I faced exactly the same issue with the string variables being truncated. I have not tested all the scenarios, but from what I see, the stored procedure call expects the value as sql_variant, so it might make better sense to define all environment variable values as sql_variant.
So instead of:
SELECT [env_var] = @tab + @tab + ', @' + ev.name + ' ' + ev.base_data_type + '= N''' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + '''' ...
I would suggest:
SELECT [env_var] = @tab + @tab + ', @' + ev.name + ' sql_variant= N''' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ''''
I have not tested data types other than strings. The other option is to declare a long enough (n)varchar variable.
HTH
May 7, 2018 at 7:13 am
Dave Pendleton - Friday, June 2, 2017 12:35 PM
Using that version, all integer variables declared as sql_variant get assigned a value like sql_variant = N'4', and consequently calling create_environment_variable with @value as the parameter results in a conversion error (the input value is not compatible with the data type of the 'Int32').
The solution is either to manually edit all the int variable declarations in the scripted output and remove the N'' quoting (sql_variant = 4), or perhaps to improve ichv's version so that each string variable is declared with its original length rather than giving all of them a length of 1000:
SELECT [env_var] = @tab + @tab + ', @' + ev.name + ' ' + ev.base_data_type + CASE ev.type
WHEN 'String' THEN '('+CAST(LEN(ISNULL(CONVERT(VARCHAR(MAX), ev.value), '<REPLACE_ME>')) AS VARCHAR)+')'
ELSE '' END +'= N''' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ''''
, [name] = ev.name
INTO #env_var
FROM [SSISDB].[catalog].[folders] f
INNER JOIN [SSISDB].[catalog].[environments] e ON f.folder_id = e.folder_id
INNER JOIN [SSISDB].[internal].[environment_variables] ev ON e.environment_id = ev.environment_id
WHERE (f.name = @folder) AND (e.name = @env);
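Another option, sketched here and untested, is to keep the sql_variant declaration but emit the literal with a cast that matches the variable's type, so strings stay quoted while Int32/Int64/Boolean values keep a matching base type. The CASE below only covers a few types and falls back to the quoted form for the rest, which is an assumption about how the stored values convert:
SELECT [env_var] = @tab + @tab + ', @' + REPLACE(ev.name, ' ', '_') + ' sql_variant = '
+ CASE ev.type
WHEN 'String' THEN 'N''' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ''''
WHEN 'Int32' THEN 'CONVERT(int, ' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ')'
WHEN 'Int64' THEN 'CONVERT(bigint, ' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ')'
WHEN 'Boolean' THEN 'CONVERT(bit, ' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + ')'
ELSE 'N''' + ISNULL(CONVERT(varchar(max), ev.value), '<REPLACE_ME>') + '''' -- other types left quoted
END
, [name] = ev.name
INTO #env_var
FROM [SSISDB].[catalog].[folders] f
INNER JOIN [SSISDB].[catalog].[environments] e ON f.folder_id = e.folder_id
INNER JOIN [SSISDB].[internal].[environment_variables] ev ON e.environment_id = ev.environment_id
WHERE (f.name = @folder) AND (e.name = @env);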
May 22, 2018 at 3:27 pm
Great way to script those environments. Thank you very much.
October 31, 2018 at 10:01 am