February 26, 2013 at 2:33 am
Hi folks,
This is my next challenge: I'm using an Execute SQL Task to run an update. The source and target tables are on the same server but in different databases, so I use the fully qualified name for the target table and point the task's connection at the source database. Now I need to parameterize both the source and the target for deployment flexibility. I can use a package configuration to store the connection string for the source, but the reference to the target is embedded in the SQL statement itself as [db_name].[owner].[tablename], and I know that table and column names cannot be supplied as variables.
Some people recommend using an expression to build the SQL statement, but that is not an option for me: the script is about 100 lines long and the lines themselves are very long, which would make the whole package hard to maintain.
Could anyone recommend another option?
Kind Regards
February 26, 2013 at 3:05 am
The only ways I can think of doing this are by generating and running dynamic SQL, either through a stored proc (passing the table names as variables) or a Script Task (again passing the table names as variables).
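For instance, the Execute SQL Task could just call the proc with parameter markers mapped to package variables. A minimal sketch, assuming an OLE DB connection and a hypothetical proc and parameter names:

EXEC dbo.UpdateTargetFromSource
     @SourceTable = ?,  -- mapped to an SSIS variable, e.g. User::SourceTable
     @TargetTable = ?;  -- mapped to User::TargetTable

The proc itself would then build and execute the actual UPDATE.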
Mack
February 26, 2013 at 3:26 am
Mackers (2/26/2013)
The only ways I can think of doing this are by generating and running dynamic SQL, either through a stored proc (passing the table names as variables) or a Script Task (again passing the table names as variables).
Mack
Or write all of the update logic in a single stored proc which encapsulates the cross-database logic and just call that - nothing dynamic required.
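Something like this, purely as a sketch (all database, table and column names below are invented, and I'm assuming the proc lives in the source database):

CREATE PROCEDURE dbo.UpdateCrmTarget
AS
BEGIN
    -- The three-part name of the target is fixed inside the proc,
    -- so the package just calls dbo.UpdateCrmTarget - nothing dynamic.
    UPDATE t
    SET    t.SomeColumn = s.SomeColumn
    FROM   TargetDb.dbo.TargetTable AS t
    JOIN   dbo.SourceTable          AS s
      ON   s.KeyColumn = t.KeyColumn;
END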
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
February 26, 2013 at 3:28 am
Just out of interest, how would that work if you wanted to change the database name?
Mack
February 26, 2013 at 3:35 am
Mackers (2/26/2013)
Just out of interest, how would that work if you wanted to change the database name?
Mack
Which database name?
Obviously, the stored proc has to live in a known location; if both the source and target databases are completely dynamic, it would have to go into a separate database that is known. It could then accept parameters - source and target database and table info - and generate and execute dynamic SQL to perform the update.
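A rough sketch of what that might look like, assuming the proc sits in a known utility database and that both tables use the dbo schema (all names invented):

CREATE PROCEDURE dbo.UpdateAcrossDatabases
    @SourceDb    sysname,
    @SourceTable sysname,
    @TargetDb    sysname,
    @TargetTable sysname
AS
BEGIN
    DECLARE @sql nvarchar(max);

    -- Assemble the three-part names from the parameters, then run the
    -- resulting UPDATE as dynamic SQL. QUOTENAME guards against injection.
    SET @sql = N'UPDATE t
                 SET t.SomeColumn = s.SomeColumn
                 FROM ' + QUOTENAME(@TargetDb) + N'.dbo.' + QUOTENAME(@TargetTable) + N' AS t
                 JOIN ' + QUOTENAME(@SourceDb) + N'.dbo.' + QUOTENAME(@SourceTable) + N' AS s
                   ON s.KeyColumn = t.KeyColumn;';

    EXEC sys.sp_executesql @sql;
END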
But it's difficult to judge whether that level of dynamism is required, without more info from the poster.
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
February 26, 2013 at 4:00 am
Good point Phil.
Maybe a script task would be a better option as it is database independent.
Mack
February 26, 2013 at 4:08 am
Hi all,
Thanks Phil and Mack for your quick answers.
In my scenario both options (stored procedure and Script Task) work fine.
I need to update an operational CRM (Dynamics - SQL DB) using data from an analytical CRM (SAS + SQL DB). Both databases are on the same server, so I just need to parameterize the target or the source. It is only to make deployment more flexible and is not a strong requirement. As I said, both solutions will work.
Kind Regards,