DTS Package Problems

  • Dear Colleagues,

    First of all, I'm extremely sorry for throwing so many questions at you in such a short span of time. I'm new to SQL Server.

    I have a DTS package which copies three tables in sequence from an Ingres database on a Unix server to SQL Server 2000. The source table name is identified using an ActiveX script and assigned using a "Dynamic Properties Task".

    Also, for each task step I have set the workflow properties "Execute on main package thread" and "Close connection on completion". This was done to avoid the "Catastrophic failure" error.

    For the first table the data pump task works fine and successfully transfers the rows (fewer than 1 million).

    For the next two tables I face two different problems.

    1. The 2nd table is supposed to transfer 10,858,658 rows. It transfers 10,858,000 rows in just 30-40 minutes, but the remaining 658 rows take around 1-1.5 hours. Then the step completes successfully.

    2. The 3rd table is also supposed to transfer 10+ million rows. In this case it copies just a few hundred thousand rows, then hangs and never completes. In the end I have to kill the process.

    Any help please.

    Thanks in advance.

    Asif

     

     

  • 1. What's going on with the 2nd table is that SQL Server is bulk inserting the records, then waiting to commit all of them. If you watch the log (DBCC SQLPERF(LOGSPACE)) you'll see the transaction log filling up after the DTS step looks like it's done. I'm not sure exactly how it works, but I know that's typical with DTS.

    2. For the 3rd table, I'd check the transaction log and make sure it's not full. Also check sp_who2 and sp_lock to see if the process is being blocked (a quick sketch of these checks follows below).
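
    As a quick reference, those checks can be run from Query Analyzer while the package is loading; a minimal sketch, nothing here is specific to this particular package:

        -- Watch log usage while the DTS step looks finished but is still committing.
        DBCC SQLPERF(LOGSPACE)    -- log size and percent used for every database

        -- Find the SPID doing the load and check whether it is blocked.
        EXEC sp_who2              -- the BlkBy column shows the blocking SPID, if any
        EXEC sp_lock              -- lock detail; match it against the SPID from sp_who2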

     

    Dylan Peters
    SQL Server DBA

  • You can adjust the size of the commit by playing around with the 'Fetch Buffer Size' and 'Insert Batch Size' figures on the Options tab of the Datapump properties dialog.

    The settings that will give you the best performance are really dependent on how your environment is set up, so make sure you test a range of different values. For example, on one of our lower-spec servers, setting these two options to 1000 provided better throughput than setting them to 2000.
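
    One way to compare runs with different settings is to poll the approximate row count in the destination table and the log usage while the package is loading. A rough sketch using the SQL Server 2000 system tables ('YourDestTable' is just a placeholder for your destination table name):

        -- Approximate row count without scanning the table, safe to poll mid-load.
        SELECT o.name     AS table_name,
               i.rows     AS approx_rows,
               GETDATE()  AS sampled_at
        FROM   sysobjects o
               JOIN sysindexes i ON i.id = o.id
        WHERE  o.name = 'YourDestTable'    -- placeholder destination table
          AND  i.indid < 2                 -- heap (0) or clustered index (1) row

        -- Log usage at the same point in time.
        DBCC SQLPERF(LOGSPACE)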

     

    --------------------
    Colt 45 - the original point and click interface

  • Hi Colleagues,

    Thanks very much for the reply.

    I will try to adjust the settings as suggested by Phill and run the package again.

    Actually, the second table, which takes a long time to commit, is not worrying me much, as it at least completes.

    The third table, which never completes, is the one putting me in trouble.

    I checked whether the process is being blocked, as suggested by Dylan, and found no locking. There is also no log space issue.

    I have also set "Commit transaction on successful completion of this step" on the Options tab of the workflow properties for the second table, so that the transaction is committed and the log and buffers get cleared, but with no success.

    Any help please.

    Asif

  • I have set 'Insert Batch Size' to 2000 as suggested by Phill, and it is going fine now without hanging.

    I also executed the package with it set to 1000 and realized that with 2000 it is much faster.

    Thanks.

  • As mentioned in my previous post, don't just set it to 2000; experiment with a range of values and see what gives you the best performance. On your hardware, a much higher value could be better.

     

    --------------------
    Colt 45 - the original point and click interface
