January 28, 2005 at 3:49 pm
I have 7 *.xls files that get imported into 7 tables in SQL Server. Each table has its own DTS package to handle the import from Excel to SQL Server (so 7 DTS packages in all). Is there a way to run one job that executes all 7 DTS packages, and if so, how? Alternatively, is there another way to handle this data transfer? I understand there are scripts that can be written, batch processes that can be run, etc., but it would be great if someone could actually show me an example script. Thanks
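One script-based route is the `dtsrun` command-line utility that ships with SQL Server 2000: each package becomes one line of a batch file, which can then be scheduled as a single SQL Server Agent job (one CmdExec step). A minimal sketch, assuming the packages are saved on the server under the names shown (server and package names are placeholders; `/E` assumes Windows authentication, use `/U` and `/P` for SQL logins):

```bat
@echo off
REM run_imports.bat - executes the 7 DTS packages in sequence.
REM /S = server, /N = package name, /E = trusted (Windows) authentication.
dtsrun /S MYSERVER /E /N "ImportExcel1"
dtsrun /S MYSERVER /E /N "ImportExcel2"
dtsrun /S MYSERVER /E /N "ImportExcel3"
REM ...repeat one dtsrun line for each remaining package.
```

Because the lines run sequentially, a failed import stops later packages only if you check `%ERRORLEVEL%` after each call; otherwise they all run regardless.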
January 28, 2005 at 3:54 pm
A single DTS package can have multiple data transformations - you just string them together with workflows (or run them in parallel).
Here's an example: I have a package that imports 3 CSV files into a SQL Server staging area.
If you don't want to redo the 7 existing packages, you can group them in one new package using seven Execute Package tasks. This task type lets a package execute a sub-package.
January 31, 2005 at 9:41 am
Thanks for your inputs. I have now created a single DTS package that imports the 7 *.xls files into 7 tables in SQL Server. However, when I execute the package, the Excel data gets appended to the SQL tables, while I want the process to clean out the tables and then load the data. Please let me know how this can be done. Thanks.
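A common way to get clean-then-load behavior is to add an Execute SQL Task at the start of the package and draw an "On Success" workflow line from it to the first import step, so the tables are emptied before any transformation runs. A sketch of the task's SQL, with placeholder table names (substitute your 7 target tables):

```sql
-- Placed in an Execute SQL Task that runs before the Excel imports.
TRUNCATE TABLE dbo.ImportTable1;
TRUNCATE TABLE dbo.ImportTable2;
-- ...repeat for the remaining target tables.

-- Note: TRUNCATE TABLE fails if a foreign key references the table;
-- in that case use DELETE FROM dbo.ImportTable1; instead (slower, logged per row).
```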