April 3, 2009 at 1:51 pm
I have a DTS package that needs to perform three major tasks:
- Task 1 is clearing out old data and importing new data.
- Task 2 is running a SQL script to insert the missing records from another database (they are not on the same server).
- Task 3 is validation and cleanup.
The problem I have is that I have to update Task 2 every week with new SQL to include the missing records.
Is there a way to make Task 2 read this script from the local hard drive automatically, so I do not have to update the DTS package every week?
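For context, a weekly "missing records" script of the kind described in Task 2 might look something like this. The linked-server, database, table, and column names below are hypothetical placeholders, not from the original post:

```sql
-- Hypothetical sketch: insert rows from a table on another server
-- (reached through a linked server, here called REMOTESRV) that do
-- not yet exist in the local table. All object names are placeholders.
INSERT INTO dbo.Customers (CustomerID, CustomerName)
SELECT r.CustomerID, r.CustomerName
FROM REMOTESRV.RemoteDB.dbo.Customers AS r
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.Customers AS l
    WHERE l.CustomerID = r.CustomerID
);
```

Because the specific rows change each week, a script like this is what keeps getting edited, which is the maintenance burden the question is about.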
April 3, 2009 at 2:16 pm
Are you trying to make each database identical to each other?
Is Log shipping not an option in your situation?
April 3, 2009 at 2:20 pm
SQL_Easy_btn? (4/3/2009)
Are you trying to make each database identical to each other?
Is Log shipping not an option in your situation?
I am not trying to make the databases identical. Only certain tables will get a set of data from DB1 and DB2.
Thanks for the quick response.
April 4, 2009 at 10:36 pm
If I have understood your problem properly, let me rephrase:
1. You want to run a script that is dynamic in nature.
2. You don't want to update your DTS / SSIS package every time.
I am leaving the rest of the details (steps 1 and 3) as they are.
So do this:
If it is a DTS package:
1. Create an "Execute Process Task".
2. In the property box of the "Execute Process Task", set:
Win32 Process: osql
Parameters: -E -S your_server_name -i "script_file_with_path" -o "output_file_with_path"
If it is an SSIS package:
1. Create an "Execute Process Task".
2. In the property box of the "Execute Process Task", set:
Executable: your_sql_installation_path\90\Tools\Binn\SQLCMD.EXE
Arguments: -E -S your_server_name -i "script_file_with_path" -o "output_file_with path"
Now you only need to change your script file, not the package.
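Put together, the SSIS version amounts to running a command line like the one below each time the task executes. The server name and file paths are placeholders; sqlcmd reads the script from disk with `-i` and writes results to a log with `-o`, while `-E` uses Windows (trusted) authentication:

```shell
:: Hypothetical example of the command the Execute Process Task issues.
:: Server name and paths are placeholders -- adjust to your environment.
"C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE" ^
  -E ^
  -S your_server_name ^
  -i "C:\Scripts\weekly_missing_records.sql" ^
  -o "C:\Scripts\weekly_missing_records.log"
```

Dropping a new `weekly_missing_records.sql` onto the drive each week is then the only change needed; the package itself stays untouched.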