August 15, 2017 at 8:29 pm
I'm new to SSIS.
I'm trying not to be a help vampire 😉 http://www.skidmore.edu/~pdwyer/e/eoc/help_vampire.htm
I've read https://docs.microsoft.com/en-us/sql/integration-services/packages/execute-package-utility-dtexecui-ui-reference, https://docs.microsoft.com/en-us/sql/integration-services/packages/dtexec-utility, https://stackoverflow.com/questions/11874722/ssis-connection-not-found-in-package, https://jasonstrate.com/2011/01/01/31-days-of-ssis-the-introduction/, and most of http://sqlblog.com/blogs/jamie_thomson/archive/tags/ssis/default.aspx. Among many Google hits.
But I'm still struggling with dtexec.
I have an SSIS project with 35 packages, refreshing 35 tables across 5 source databases into 1 target database. They all do pretty much the same thing: create staging table, load staging table, create PK and indexes, drop target table, rename staging table/PK/indexes to target values.
I've defined all the connections at the project level - I'm not using package level connections at all.
For now, we will keep the packages at the file level. If I can convince my management that SSIS is the way to go then they'll have to convince IT to setup SSIS on our server so we can deploy and run them there.
When I try to run this via dtexec, I get these errors:
C:\>"C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /F "C:\Users\MyUserid\Documents\Visual Studio 2015\Projects\My Data Load\My Data Load\FACILITY.dtsx"
Microsoft (R) SQL Server Execute Package Utility
Version 14.0.0500.272 for 32-bit
Copyright (C) 2016 Microsoft. All rights reserved.
Started: 12:15:17 PM
Error: 2017-08-16 12:15:18.21
Code: 0xC001000E
Source: FACILITY
Description: The connection "{A2A0C236-E0F4-4F88-9EFC-8FB008FFED17}" is not found. This error is thrown by Connections collection when the specific connection element is not found.
End Error
Error: 2017-08-16 12:15:18.22
Code: 0xC001000E
Source: FACILITY
Description: The connection "{A2A0C236-E0F4-4F88-9EFC-8FB008FFED17}" is not found. This error is thrown by Connections collection when the specific connection element is not found.
End Error
Progress: 2017-08-16 12:15:18.28
Source: (DFT) Load Temp Table
Validating: 0% complete
End Progress
Error: 2017-08-16 12:15:18.28
Code: 0xC020801B
Source: (DFT) Load Temp Table (OLE_DST) Temp Table 1 [2]
Description: The runtime connection manager with the ID "{A2A0C236-E0F4-4F88-9EFC-8FB008FFED17}" cannot be found. Verify that the connection manager collection has a connection manager with that ID.
End Error
Error: 2017-08-16 12:15:18.28
Code: 0xC0047017
Source: (DFT) Load Temp Table SSIS.Pipeline
Description: (OLE_DST) Temp Table 1 failed validation and returned error code 0xC020801B.
End Error
Progress: 2017-08-16 12:15:18.28
Source: (DFT) Load Temp Table
Validating: 16% complete
End Progress
Error: 2017-08-16 12:15:18.28
Code: 0xC004700C
Source: (DFT) Load Temp Table SSIS.Pipeline
Description: One or more component failed validation.
End Error
Error: 2017-08-16 12:15:18.28
Code: 0xC0024107
Source: (DFT) Load Temp Table
Description: There were errors during task validation.
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 12:15:17 PM
Finished: 12:15:18 PM
Elapsed: 0.546 seconds
I see the connection managers at the root of my project directory.
1) What is the best-practice approach to telling dtexec to use the existing project-level connection managers? Yes, I did type dtexec /?, and I read the links above, but it wasn't clear to me. Do I have to convert these to package-level connections? I really don't want to do that for 35 packages!
2) Once I get this working in development, I'll need to point the target connection to our production database. What's the best-practice approach to do this? Environments, parameters, config files, command files, XML... it's still confusing to me.
Thanks...
August 16, 2017 at 5:56 am
If you are working with project-level parameters and connections, your packages need to be deployed to SSISDB.
Once you have this, use SSISDB environments to configure your connections per environment.
You'll also be able to set up a SQL Agent job to run a package more easily once they're deployed.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 17, 2017 at 2:36 am
Phil Parkin - Wednesday, August 16, 2017 5:56 AM
If you are working with project-level parameters and connections, your packages need to be deployed to SSISDB.
Once you have this, use SSISDB environments to configure your connections per environment.
You'll also be able to set up a SQL Agent job to run a package more easily once they're deployed.
Thanks Phil. I hope to get to the position where I can deploy my packages on our SQL Server. However, right now, I don't have a "...packages need to be deployed to SSISDB" environment to deploy to.
So...for now...I want to execute my packages via dtexec using the File System. According to what I've read online this should be possible.
From further Googling:
https://www.mssqltips.com/sqlservertip/2450/ssis-project-deployment-model-in-sql-server-2012-part-1-of-2/
and a closer read of:
https://docs.microsoft.com/en-us/sql/integration-services/packages/dtexec-utility#example
So, I tried:
"C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe" /Package FACILITY.dtsx /Project "bin\Development\My Data Load.ispac"
Microsoft (R) SQL Server Execute Package Utility
Version 14.0.0500.272 for 32-bit
Copyright (C) 2016 Microsoft. All rights reserved.
Started: 6:12:15 PM
Progress: 2017-08-17 18:12:16.42
Source: (DFT) Load Temp Table
Validating: 0% complete
End Progress
Progress: 2017-08-17 18:12:16.76
Source: (DFT) Load Temp Table
Validating: 16% complete
End Progress
Progress: 2017-08-17 18:12:16.76
Source: (DFT) Load Temp Table
Validating: 33% complete
End Progress
Progress: 2017-08-17 18:12:16.77
Source: (DFT) Load Temp Table
Validating: 50% complete
End Progress
Progress: 2017-08-17 18:12:16.78
Source: (DFT) Load Temp Table
Validating: 66% complete
End Progress
Progress: 2017-08-17 18:12:16.79
Source: (DFT) Load Temp Table
Validating: 83% complete
End Progress
Progress: 2017-08-17 18:12:16.79
Source: (DFT) Load Temp Table
Validating: 100% complete
End Progress
Error: 2017-08-17 18:12:16.91
Code: 0xC000F427
Source: (SCR) SQL Create Table
Description: To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition of Integration Services or higher.
End Error
Error: 2017-08-17 18:12:16.91
Code: 0xC000F427
Source: (SCR) SQL Post-Processing
Description: To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition of Integration Services or higher.
End Error
Warning: 2017-08-17 18:12:16.91
Code: 0x80019002
Source: (SEQC) Setup SQL code
Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (4) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
End Warning
Error: 2017-08-17 18:12:16.91
Code: 0xC000F427
Source: (SCR) SQL Copy
Description: To run a SSIS package outside of SQL Server Data Tools you must install Standard Edition of Integration Services or higher.
End Error
Warning: 2017-08-17 18:12:16.91
Code: 0x80019002
Source: FACILITY
Description: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (6) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
End Warning
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 6:12:15 PM
Finished: 6:12:16 PM
Elapsed: 1.56 seconds
Progress, but still frustration.
I've installed SSDT from https://msdn.microsoft.com/en-us/mt186501.aspx. Things work fine in the SSDT designer. Do I also need to have a local install of SSIS if I want to now use dtexec to run my File System packages outside SSDT?
August 17, 2017 at 5:45 am
Do you have SQL Server Developer Edition installed? That should be sufficient.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 17, 2017 at 5:48 am
I can see nothing in the links which you provided which states that ispac files (which are the result of building an SSIS project) can be deployed to the file system. Would you please highlight the specific text which leads you to believe that this is possible?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 17, 2017 at 5:55 am
Phil Parkin - Thursday, August 17, 2017 5:45 AM
Do you have SQL Server Developer Edition installed? That should be sufficient.
Hi Phil, back at you. Please provide link to the documentation which states that SSIS must be installed locally to run a File System Project Deployment Model.
August 17, 2017 at 6:07 am
Phil Parkin - Thursday, August 17, 2017 5:48 AM
I can see nothing in the links which you provided which states that ispac files (which are the result of building an SSIS project) can be deployed to the file system. Would you please highlight the specific text which leads you to believe that this is possible?
From https://docs.microsoft.com/en-us/sql/integration-services/packages/dtexec-utility#example, search on "Project Option".
Regardless, I'll just ask you & the forum: can I run a package, referencing project level connection managers and project level parameters, from the file system, using dtexec (post 2012), using the Project Deployment Model?
August 17, 2017 at 6:38 am
Scott In Sydney - Thursday, August 17, 2017 5:55 AM
Phil Parkin - Thursday, August 17, 2017 5:45 AM
Do you have SQL Server Developer Edition installed? That should be sufficient.
Hi Phil, back at you. Please provide link to the documentation which states that SSIS must be installed locally to run a File System Project Deployment Model.
I made no mention of SSIS. I asked about SQL Server. The minimum requirement for being able to run SSIS packages outside of SSDT is SQL Server Standard Edition.
Developer Edition and Enterprise Edition are both higher level and can also run SSIS packages outside of SSDT.
I will find the link, if you cannot find it yourself, should you desire proof.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 17, 2017 at 7:32 am
Scott In Sydney - Thursday, August 17, 2017 6:07 AM
Phil Parkin - Thursday, August 17, 2017 5:48 AM
I can see nothing in the links which you provided which states that ispac files (which are the result of building an SSIS project) can be deployed to the file system. Would you please highlight the specific text which leads you to believe that this is possible?
From https://docs.microsoft.com/en-us/sql/integration-services/packages/dtexec-utility#example, search on "Project Option".
Regardless, I'll just ask you & the forum: can I run a package, referencing project level connection managers and project level parameters, from the file system, using dtexec (post 2012), using the Project Deployment Model?
Here is a page for you to check.
Here is a screen shot taken from that page:
I think that is fairly conclusive.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 27, 2017 at 7:41 pm
Hi Phil,
Summary: I've got this working now.
Details: I'll provide some details in the chance someone in the future finds this post via search. Apologies for the length.
@Phil, first of all, I did see that document before posting. Perhaps I was dense - I thought it just documented executing deployed packages; I was looking for the complementary documentation for executing file system packages.
In any case, I conclude that SSDT provides a development environment where I can create and test packages, but there is no way I can execute those packages outside SSDT (i.e. via dtexec) unless I deploy them to SQL Server. IMO a pity - I think it would be nice to be able to use dtexec to run the packages from my local machine, rather than forcing me to deploy to SQL Server. Choice is a good thing.
However, I do see the advantages of deploying those packages to a central server, rather than having them remain on my desktop's C: drive. If I get hit by a bus (or my desktop dies), it's much easier for my colleagues to run the deployed packages.
I still don't understand the differences between deploying:
1) Database Engine... --> Integration Services Catalogs --> SSISDB --> Folder ...
2) Integration Services... --> Stored Packages --> File System ...
3) Integration Services... --> MSDB ...
I used #1. I'll describe my deployment below.
First my environment:
3 source databases:
Microsoft SQL Server 2008 (SP2) - 10.0.4000.0 (X64)
Microsoft SQL Server 2008 R2 (SP3) - 10.50.6000.34 (X64)
Microsoft SQL Server 2012 (SP3) (KB3072779) - 11.0.6020.0 (X64)
1 target database:
Microsoft SQL Server 2012 (SP3) (KB3072779) - 11.0.6020.0 (X64)
I also installed SQL Server 2014 Developer Edition on my Windows 7 desktop machine for testing and development purposes:
Microsoft SQL Server 2014 (SP2) (KB3171021) - 12.0.5000.0 (X64)
SSDT 17.2:
https://go.microsoft.com/fwlink/?linkid=852922
SSMS v17.2:
https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms
My SSDT project consists of 35 packages, all doing the same thing:
3 x Script tasks to read an external file of SQL code into a variable:
* Copy: SELECT statement to copy a subset of columns from source table. One of the columns is converted via HASHBYTES SHA1.
* Create: Create temporary target table (not in tempdb, it's named schema.zzz_tablename).
* Post Process: Create PK and indexes on temp table, drop target table, rename temp to target table, rename PK and indexes.
SQL Task: Create temp target table
Data Flow Task: Load temp table via OLEDB source, Balanced Data Distributor, OLEDB target(s). 4 "threads" are used in the BDD.
SQL Task: Post processing
The project has one project param:
The path to the scripts file location. Analogous to https://jasonstrate.com/2011/01/05/31-days-of-ssis-one-variable-to-rule-them-all-531/.
Each package has 9 package variables:
TableName = @[System::PackageName]
TableNameTemp = "zzz_"+ @[User::TableName]
TableNameTempQuoted = "[schema].["+ @[User::TableNameTemp]+"]" (this is probably overkill)
3 x Path to script files. This is an input variable for the C# script tasks.
3 x Returned SQL from script files
I'm wondering if these variables could be at the project level for easier project maintenance (see question below). But it works for now.
The first issue I ran into is I'd created the project for TargetServerVersion = SQL Server vNext.
Since I was deploying to my SQL Server 2014 test and development instance on my desktop, I cloned this project, changed TargetServerVersion to SQL Server 2014, and rebuilt the project.
Even after rebuild, each time I opened a package, I'd get a migration warning. I had to resave each package to get rid of the warning. IMO it would have been nice if SSDT automatically did that, perhaps when I rebuilt the project?
I did the same thing for later deployment to my target database SQL Server 2012.
Using RMB Project Name --> Deploy, I deployed the project to my test server on my desktop. I used the project deployment model.
All 35 packages were deployed to localhost\instance name --> Database Engine --> Integration Services Catalogs --> SSISDB --> Folder Name --> Projects --> Project Name --> Packages.
There was also an empty Environments folder under the Folder Name folder.
The first minor issue I ran into is that the project parameter was blank. However, this was easy to correct.
The next major issue is that the packages wouldn't run because the Balanced Data Distributor wasn't installed. However, https://www.microsoft.com/en-au/download/details.aspx?id=4123 and https://www.microsoft.com/en-au/download/details.aspx?id=30147 saved the day.
The next minor issue is I could not see my local 2014 Integration Services instance from SSMS 17.2.
I can't find the link I found previously, but the issue is trying to connect to Integration Services from a later version of SSMS. When I matched the versions of SSMS and SSIS, I could connect. Not that it matters, I'm deploying to the Database Engine, not to SSIS.
This also helped:
http://geekswithblogs.net/AskPaula/archive/2013/08/26/153844.aspx
Next, I created two environments: Development and Production. I made a minor mistake with the first two, so deleted them and created two more (more on that below).
The only variable I needed was the target database: Tgt_DBNAME pointing to dev in Development and prod in Production.
Next, I changed the project level target database connection manager to use the environment variable name for the InitialCatalog. Note: I did not need to change the complete connection string, which points to dev. Thus, the InitialCatalog Connection Manager property overrides the hard coded value in the connection string.
Via lots of trial and error, my final dtexec command string is:
dtexec.exe /server "COMPUTERNAME\INSTANCENAME" /isserver "\SSISDB\Folder Name\Project Name\Package Name.dtsx" - optionally followed by /env 1 or /env 2 (or no /env at all).
The dtsx suffix is required.
If I run this without an environment, the default development target database is used.
If I run this with /env 1 (development), the explicit development target database is used.
If I run this with /env 2 (production), the explicit production target database is used.
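Spelled out in full, the three variants above look roughly like this (a sketch only - the paths, server name, folder/project/package names are from my setup, and the environment reference IDs just happened to be 1 and 2; yours will differ):

```powershell
# Sketch: the three dtexec variants. All names/paths below are from my setup.
$dtexec = "C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe"
$pkg    = "\SSISDB\Folder Name\Project Name\Package Name.dtsx"

# 1) No environment: the default (development) target database is used
& $dtexec /Server "COMPUTERNAME\INSTANCENAME" /ISServer $pkg

# 2) Environment reference 1 = Development (explicit)
& $dtexec /Server "COMPUTERNAME\INSTANCENAME" /ISServer $pkg /Env 1

# 3) Environment reference 2 = Production (explicit)
& $dtexec /Server "COMPUTERNAME\INSTANCENAME" /ISServer $pkg /Env 2
```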
Using the Integration Services Dashboard, it is slightly clearer when an explicit environment is used. Otherwise, there is no indication of the target database, at least from the package overview using default logging. If an environment is used, the environment's InitialCatalog parameter is listed, indicating the target database.
Finally, I wrote a quick Powershell script to loop over my 35 packages, calling the dtexec command for each of them (or a specified subset of packages via the command line).
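For anyone searching later, the driver script is nothing fancy - roughly this (package names and paths are placeholders for my setup; note that with /ISServer the execution is queued on the server and dtexec returns immediately, so the exit code only tells you whether the execution started, unless it is made synchronous):

```powershell
# Sketch of the driver script; package names are placeholders for my 35 tables.
$dtexec   = "C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe"
$folder   = "\SSISDB\Folder Name\Project Name"
$packages = @("FACILITY", "TABLE2", "TABLE3")   # ...the full list of 35 in reality

# Allow a subset to be passed on the command line, e.g. .\refresh.ps1 FACILITY TABLE2
if ($args.Count -gt 0) { $packages = $args }

foreach ($p in $packages) {
    Write-Host "Running $p..."
    & $dtexec /Server "COMPUTERNAME\INSTANCENAME" /ISServer "$folder\$($p).dtsx" /Env 1
    if ($LASTEXITCODE -ne 0) { Write-Warning "$p returned exit code $LASTEXITCODE" }
}
```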
For my final production environment, I need to get my DBA to agree to install the 2012 Balanced Data Distributor on my target database server (I'm assuming it works the same as the 2014 version).
Next up is to learn more about logging. For example, I haven't found how to list the total rows loaded into the target table from the BDD, only the row count from each BDD thread (unlike the SSDT development environment). In general, I find the default logging in the Integration Services Dashboard to be a lot of cruft with limited value. I'm sure that's due to my ignorance of logging configuration, but IMO it's a pity that's the default.
Some minor questions/comments:
1) When I deleted the two environments and recreated them, they now have the IDs 3 and 4, as listed in the properties window. However, using dtexec /env 3 or /env 4, I get the error "Description: The environment reference '3' is not associated with the project." It was only through trial and error/hacking around that I tried /env 1 and 2. IMO this is a bug.
2) In the past, when I've refreshed all 35 tables concurrently, it filled the transaction logs and/or tempdb, and the DBA needed to get involved. I assume SSIS packages are no different? However, when I execute a package via dtexec, it submits the request to SQL Server and "detaches". Is there a way to make dtexec wait while the package is executing? If so, I have some Powershell code that will allow concurrent processing with thread limiting. Otherwise, any other approaches to avoid filling the transaction log or tempdb? (Note: It may have just been the tempdb and not the transaction log? TBH I can't remember. If tempdb, I think it was all the concurrent index building that may have been the culprit. I'm aware of the SORT_IN_TEMPDB = OFF CREATE INDEX option, which may help here.)
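The thread-limiting approach I have in mind looks roughly like this (a sketch, assuming each dtexec call can be made to block until its package completes; names and paths are placeholders):

```powershell
# Sketch: run each package as a background job, at most $maxThreads at a time.
# Assumes the dtexec call blocks until its package completes (synchronous execution).
$maxThreads = 4
$dtexec     = "C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\DTExec.exe"
$folder     = "\SSISDB\Folder Name\Project Name"
$packages   = @("FACILITY", "TABLE2", "TABLE3")   # etc.

foreach ($p in $packages) {
    # Throttle: wait for a free slot before starting the next package
    while ((Get-Job -State Running).Count -ge $maxThreads) { Start-Sleep -Seconds 5 }
    Start-Job -Name $p -ScriptBlock {
        param($exe, $pkg)
        & $exe /Server "COMPUTERNAME\INSTANCENAME" /ISServer $pkg
    } -ArgumentList $dtexec, "$folder\$($p).dtsx"
}
Get-Job | Wait-Job | Out-Null                       # wait for the stragglers
Get-Job | ForEach-Object { "$($_.Name): $($_.State)" }
```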
3) We will only do this refresh on a manual basis - it has to occur after another manual process completes. And right now the frequency is twice a quarter. With that in mind, is there any advantage to configuring SQL Agent for manual job submission? Or is dtexec sufficient for this scenario?
4) For project maintenance, it may be easier to define some of the repetitive package level variables at the project level. However, if I run packages concurrently, could I encounter variable collision for project level parameters? IOW do packages run in their own memory space, or is the project level memory space shared between concurrently running packages?
Apologies for the length, but this was my attempt to give back to the community, in the off chance that someone comes across this post in the future and finds some of it useful.
Thanks again @Phil for your help!
August 30, 2017 at 6:43 am
I am sure that your long post will be informative for others.
I have some comments.
I'm wondering if these variables could be at the project level for easier project maintenance
This is not possible. Only parameters can be declared at project level.
1) When I deleted the two environments and recreated them, they now have the ID's 3 & 4, as listed in the properties window. However, using dtexec /env 3|4, I get the error "Description: The environment reference '3' is not associated with the project." It was only through trial and error/hacking around that I tried /env 1&2. IMO this is a bug.
This is no bug. An environment needs to be associated with an SSISDB project via the project's 'Configure' option. If you delete an environment, the association is lost. After recreating the environment, how is the system to know that the recreated environment is the same as the one which was deleted?
2) SSIS packages can be run synchronously, but I don't know how to do that using DTEXEC. Take a look here for an example. I execute many of my packages by calling a stored proc which makes use of this technique.
3) Don't understand this question
4) As stated above, only parameters can be at project level and these are read-only during execution. However, the fact that a project level parameter has value 'X' for one package which is executing does not mean that the value of the parameter has to be 'X' for all other packages in the project.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
August 31, 2017 at 4:08 pm
Just adding a bit here.
In order to execute a package synchronously from dtexec, it is necessary to supply a $ServerOption configuration parameter.
In this case it is supplied as follows:
/Parameter "$ServerOption::SYNCHRONIZED(Boolean)";True
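With that option set, dtexec blocks until the package finishes, so a caller can check the exit code afterwards. A PowerShell sketch (untested in this exact form - the quoting of the /Parameter argument may need tweaking; server and package names are from my demo):

```powershell
# Sketch: synchronous execution plus exit-code check. The single-quoted string
# keeps the whole semicolon-separated name;value pair as one dtexec argument.
$dtexec = "C:\Program Files\Microsoft SQL Server\120\DTS\Binn\dtexec.exe"
& $dtexec /Parameter '$ServerOption::SYNCHRONIZED(Boolean);True' `
          /Server "servername\SQL2014" `
          /ISServer "\SSISDB\test\demo\package.dtsx"
if ($LASTEXITCODE -eq 0) { "Package succeeded" }
else { "Package failed with code $LASTEXITCODE" }
```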
and to add a bit to others that may read this and may find this of interest.
I've created a small project to test with command line dtexec (attached)
3 connection managers (2 project level, 1 package level)
1 parameter (project level)
1 variable (package level)
Command line for SSISDB catalog deployment (split one option per line, but obviously to be executed all on one line):
"C:\Program Files\Microsoft SQL Server\120\DTS\Binn\dtexec.exe"
/Parameter "$ServerOption::SYNCHRONIZED(Boolean)";True
/server "servername\SQL2014"
/ISServer "\SSISDB\test\demo\package.dtsx"
/set "\Package.Variables[User::package_filename].Properties[Value]";"C:\Projects\demo\command_line_setvar.txt"
/parameter $Project::project_param;"C:\Projects\demo\command_line_setparameter.txt"
/parameter "$Project::CM.outfile_project.ConnectionString";"\"C:\Projects\demo\command_line_connection.txt\""
Command line for a package on the filesystem in an ispac:
"C:\Program Files\Microsoft SQL Server\120\DTS\Binn\dtexec.exe"
/project "c:\Projects\demo\demo\bin\Development\demo.ispac"
/package package.dtsx
/set "\Package.Variables[User::package_filename].Properties[Value]";"C:\Projects\demo\command_line_setvar.txt"
This is just to highlight that there are things that can still be done through the command line even if working with a project model.
Unfortunately, one of the biggest constraints of the project model is that, in order to use project-level objects, one is required to deploy to the SSIS Catalog.
Supplying a /parameter option requires it - and this is how parameters and/or project connections can be set.
Also, using the old config files is not possible with project deployment on the catalog (not sure about ispac). This is the major loss with the project model: not allowing config files and forcing the use of environments.