DTS Questions

  • I have done a fair bit of reading up over the last few days, and I have some basic DTS working. But what is the difference between this and writing a stored procedure to copy from one db to another - what is the advantage of DTS?

    More defined questions -

    I have a DTS package that moves data from dbA to dbB. Based on a value in dbB (where the package resides), I want the package to run or not (it will be scheduled to run every ten minutes).

    ie

    IF dbB value = 'yes'

    BEGIN

    Do DTS

    END

    ELSE

    BEGIN

    Do Nothing

    END

    Also, since I have multiple tables to copy across, should I have multiple Transform Data tasks within the one package? Is there a good link on setting up workflow between these multiple tasks?

    Thanks

  • First of all, you said you wanted to move dbA to dbB - did you mean all tables in dbA, or one table in dbA?

    If the table you want to transfer is exactly the same in both databases, you don't even need to create a DTS package; you can just use the bcp command.
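
    For what it's worth, a bcp round trip for one table might look like the sketch below. The server names (serverA, serverB) and table name (dbA.dbo.Orders) are just placeholders for illustration; -T uses a trusted connection and -n keeps the data in native format so the schemas must match:

    ```
    REM export from the source server, then load into the target
    bcp dbA.dbo.Orders out orders.dat -S serverA -T -n
    bcp dbB.dbo.Orders in orders.dat -S serverB -T -n
    ```

    Note that plain bcp can't filter rows, so this only fits the "copy the whole table as-is" case.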

  • Hi - multiple tables' data, based on criteria (ie only certain records).

    So some of the fields from tables A...C...F where ID = ? go into corresponding fields in the table(s) on the target db.

  • Adam,

    Are you asking if you can copy data between databases as easily with T-SQL in a stored procedure as with a DTS package? If so, I'd say "yes" if the databases are on the same server. But, if they're on different servers, I think DTS is easier to set up.
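
    To illustrate the same-server case: when both databases live on one server, a plain INSERT...SELECT with three-part names does the copy, no DTS needed. Table and column names here (dbo.Orders, OrderID, Amount) are made up for the example:

    ```
    -- copy selected rows from dbA into dbB on the same server
    INSERT INTO dbB.dbo.Orders (OrderID, Amount)
    SELECT OrderID, Amount
    FROM dbA.dbo.Orders
    WHERE OrderID = @ID;
    ```

    Across servers you'd need a linked server and four-part names (e.g. serverA.dbA.dbo.Orders), which is where DTS usually ends up being less hassle to set up.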

    If you want to put the DTS package in a scheduled job and execute it based on a value returned by a query, I'd say put the query in the first job step and the DTSRUN command in the second step. The first step would look something like this:

    if (select ID from dbB.dbo.table) <> 'yes'

        raiserror('skip package run', 15, 1) -- raising an error fails the job step

    Set the on success/failure flow for the step so it goes to the next step (the DTSRUN step) on success but quits the job on error.
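
    The second step would then just be a CmdExec step running DTSRUN. Something like the line below, where serverB and MyPackage are placeholder names; /E uses a trusted connection and /N names the package:

    ```
    DTSRUN /S serverB /E /N "MyPackage"
    ```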

    You'll need a separate Transform Data task for each table. You can either link them with 'on success' or 'on completion' workflow so they execute serially, or leave them unlinked and let them execute in parallel.

    Greg

  • Thanks Greg - I'll have a further look based upon your advice.

    No doubt I'll be back soon - with another question, not quite as dumb as the last one. ;)

    Thanks
