December 17, 2003 at 1:36 pm
I have a DTS Package that executes other DTS Packages using the Execute Package Task. These tasks are in turn using the Copy SQL Server Object Task to copy data from one server to another. The destination tables already exist and I am just replacing existing data.
The controlling package's MaxConcurrentSteps property is set to 4.
When I execute the controlling package I get an error on some of the Execute Package Tasks saying that a file on the source server cannot be accessed because it is being used by another user or process. The files end with .TAB, .PR2 and other extensions. It's not always the same task or the same file that errors out.
When I run the tasks separately, everything is just fine.
Please help. I have a deadline and I'm stuck.
Thanks,
December 17, 2003 at 2:33 pm
What type of data sources are you trying to copy, for example, SQL Server, Access?
December 17, 2003 at 2:35 pm
SQL Server
7.0 to 2000
December 17, 2003 at 2:38 pm
And it only happens with the Copy SQL Server Object Task.
If I use the Transform Data Task everything is all right... except that my tables number in the tens to hundreds of millions of rows. I need DTS to use BCP under the covers with no transactions.
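To be clear, by "BCP under the covers" I mean the equivalent of something like this for each table (just a sketch; the server, database, and path names are made up):

' Sketch only -- MYSRC7, MYTEST2K, MyDW and the paths are made-up names.
' Native-mode bcp out of the 7.0 source, then bcp in to the 2000 test bed.
Dim oShell
Set oShell = CreateObject("WScript.Shell")
' Export in native format (-n) over a trusted connection (-T).
oShell.Run "bcp MyDW.dbo.BigFact out D:\DTSWork\BigFact.dat -n -S MYSRC7 -T", 1, True
' Import with a table lock and a batch size so it isn't one huge transaction.
oShell.Run "bcp MyDW.dbo.BigFact in D:\DTSWork\BigFact.dat -n -b 50000 -h ""TABLOCK"" -S MYTEST2K -T", 1, True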
December 17, 2003 at 3:25 pm
Because the Copy SQL Server Object tasks appear to be hitting the same "work" files, maybe you can reduce MaxConcurrentSteps to 1 or include workflow to carry out the tasks serially.
This is just a suggested workaround to help you meet the deadline. If it works ok you can then investigate ways to have concurrent tasks using separate work file locations.
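If you'd rather set it in code than in the designer, something along these lines should do it (untested sketch; MYSERVER and the package name are placeholders):

' Untested sketch: load the controlling package and force serial execution.
Dim oPkg
Set oPkg = CreateObject("DTS.Package")
' 256 = DTSSQLStgFlag_UseTrustedConnection
oPkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "ControllingPackage"
oPkg.MaxConcurrentSteps = 1   ' run the Execute Package tasks one at a time
oPkg.Execute
oPkg.UnInitialize
Set oPkg = Nothing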
HTH
Cheers,
- Mark
December 18, 2003 at 7:08 am
Thanks.
I can also set the MaxConcurrentSteps to 1 to accomplish the same thing.
If there were some way to tell DTS to use a specific source server directory for these files, everything would be fine. I know it's because I am using the Copy SQL Server Object Task. It's making files that contain information about the transfers as well as files that contain the BCP out data.
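What I'm hoping for is something along these lines, so each package could write its work files to its own directory (untested sketch; the package and task names are made up, and I'm assuming the Copy SQL Server Object Task exposes a ScriptFileDirectory property for those files):

' Untested sketch: point each package's transfer task at its own work directory.
' "CopyPackage1", "Task_1" and the path are made-up names; ScriptFileDirectory
' is assumed to be where the TransferObjectsTask puts its .TAB/.PR2 script and
' data files.
Dim oPkg, oTask
Set oPkg = CreateObject("DTS.Package")
oPkg.LoadFromSQLServer "MYSERVER", "", "", 256, "", "", "", "CopyPackage1"
Set oTask = oPkg.Tasks("Task_1")
oTask.CustomTask.ScriptFileDirectory = "D:\DTSWork\Pkg1"
oPkg.SaveToSQLServer "MYSERVER", "", "", 256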
If you know anyone at Microsoft's DTS division, I'd appreciate the introduction.
The real issue is that I need multiple threads all executing at the same time to move all the data I need moved within the time limit (one night). You see, I am setting up a test bed for our production data warehouse. I am only moving part of the database, but some of the tables have to be moved in total. The largest is almost 200 million rows, with six others containing 30+ million. Some are fact tables while others are dimension tables. The test bed server is not equipped to be a full-fledged warehouse and does not have the gigabit transfer speeds or the high-speed disks I need, but I can use what resources I have to the max. I am the only person on the test bed while the transfer is processing.
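For what it's worth, if I can get each package writing its work files to its own directory, my fallback for the parallelism is just to launch the per-table packages myself (sketch; server and package names are made up):

' Sketch: launch several copy packages at once with DTSRun and don't wait
' (bWaitOnReturn = False) so they run in parallel. /E = trusted connection.
Dim oShell
Set oShell = CreateObject("WScript.Shell")
oShell.Run "DTSRun /S MYSERVER /E /N CopyFactSales", 1, False
oShell.Run "DTSRun /S MYSERVER /E /N CopyFactOrders", 1, False
oShell.Run "DTSRun /S MYSERVER /E /N CopyDimCustomer", 1, False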
Any help I can get would be appreciated.