Many tables...multiple paths

  • I am developing a DTS package that will import between twenty and eighty ODBC files into a database. There are three destination tables, and a portion of the file name dictates where the data should go. I have the script necessary to pick up the files based on the file name variation. My questions are:

    (1) Can a DTS package run more than one data pump task at a time when initiated by the same VBScript?

    (2) If it can't, how can I put in a delay to allow for the completion of the preceding task? The longest delay I would need is 30 seconds. I can see how "WAITFOR DELAY" would help, but I'd rather put the delay in the VBScript loop itself. Thanks much!
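For what it's worth, a delay in the calling VBScript can be done with `WScript.Sleep` rather than `WAITFOR DELAY`. A minimal sketch follows — the server name, package name, and the use of a trusted connection are all assumptions, so adjust to your environment. Note that `Package.Execute` in the DTS object model is synchronous, so an explicit sleep may turn out to be unnecessary:

```vbscript
' Hedged sketch: running a DTS package from VBScript with a pause
' between runs. Server and package names below are placeholders.
Option Explicit

Dim oPkg
Set oPkg = CreateObject("DTS.Package")

' 256 = DTSSQLStgFlag_UseTrustedConnection (Windows authentication)
oPkg.LoadFromSQLServer "MYSERVER", , , 256, , , , "MyImportPackage"

oPkg.Execute            ' synchronous: returns only when the package finishes
WScript.Sleep 30000     ' optional extra 30-second pause, in milliseconds

oPkg.UnInitialize
Set oPkg = Nothing
```

Because `Execute` blocks until the package completes, looping over your file-name variations and calling `Execute` once per file already runs them one at a time.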

  • As far as I know, DTS won't run parallel tasks if they use the same connection to the database.

    Let me see if I have your problem right: you have a VBScript that runs the same DTS package more than once?

    If that's the case, the best approach is probably to put all the logic inside a single DTS package and loop through your files, processing one at a time. You'd have an ActiveX Script task that gets all the file names, a data pump task, then another ActiveX Script task (and perhaps a Dynamic Properties task) to set the properties for the next data pump run, and finally an ActiveX Script task that loops back to the data pump with the new values set.

    By doing this you make sure that only one file is processed at a time, and you avoid any unnecessary delays in your execution.
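    The loop-back step described above is usually done by resetting the data pump step's execution status from inside an ActiveX Script task. A rough sketch, where the step name and the global variable tracking remaining files are assumptions:

    ```vbscript
    ' Hedged sketch of the loop-back ActiveX Script task. The step name
    ' "DTSStep_DTSDataPumpTask_1" and the "FilesRemaining" global
    ' variable are illustrative; use your package's actual names.
    Option Explicit

    Function Main()
        Dim oPkg
        Set oPkg = DTSGlobalVariables.Parent   ' the package this task runs in

        If DTSGlobalVariables("FilesRemaining").Value > 0 Then
            ' Rewind the data pump step so it executes again
            ' with the newly assigned source file
            oPkg.Steps("DTSStep_DTSDataPumpTask_1").ExecutionStatus = _
                DTSStepExecStat_Waiting
        End If

        Main = DTSTaskExecResult_Success
    End Function
    ```

    Setting a completed step back to `DTSStepExecStat_Waiting` is the standard trick for looping inside a single DTS package.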

    Hope it helps...

    Rayfuss.-

    http://www.e-techcafe.com

  • Default operation would be to have this run serially.  The VBScript would wait for the DTS package to complete before executing the next command.  Right??

  • Well, there seem to be a number of ways to do it. I've gone with importing the file names into a table in tempdb, rummaging through them for the ones I want, setting the file name as the source file in a data pump task, deleting the entry in the file-name table once it's imported, and then resetting one of the tasks in the package to "waiting"...at least that's where I'm at right now. It may be the long way to do it, but for now it looks like it will work. Thanks to all who read and posted!
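A sketch of the bookkeeping that approach implies — read the next file name from a work table in tempdb, point the source connection at it, then delete the row. The table, column, connection, and server names here are all assumptions:

```vbscript
' Hedged sketch: fetch the next pending file name from a tempdb work
' table, assign it as the pump's source, and remove the processed row.
' "ImportFiles", "FileName", "TextFileSource", and "MYSERVER" are
' placeholders for your actual object names.
Option Explicit

Function Main()
    Dim oConn, oRS, oPkg, sFile

    Set oConn = CreateObject("ADODB.Connection")
    oConn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;" & _
               "Initial Catalog=tempdb;Integrated Security=SSPI;"

    Set oRS = oConn.Execute("SELECT TOP 1 FileName FROM ImportFiles")
    If Not oRS.EOF Then
        sFile = oRS("FileName").Value

        ' Point the text-file source connection at the next file
        Set oPkg = DTSGlobalVariables.Parent
        oPkg.Connections("TextFileSource").DataSource = sFile

        ' Remove the entry now that it is queued for import
        oConn.Execute "DELETE FROM ImportFiles WHERE FileName = '" & _
                      Replace(sFile, "'", "''") & "'"
    End If

    oRS.Close
    oConn.Close
    Main = DTSTaskExecResult_Success
End Function
```

Deleting the row after assigning it, combined with resetting the pump step to "waiting", gives you a simple work queue that drains one file per loop iteration.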
