DTS optimization: table to table or use bcp or text files

  • What is the fastest way to pull several tables from a source database (a Progress database) and populate empty SQL Server tables that already have indexes?

    1. DTS: Table to table using a transform task (with one COM object in the field mapping)

    2. DTS: Table to text file; then from text file into SQL Server?

    3. DTS: Table to text file; then use BCP to pull into SQL Server? Doesn't this allow you to ignore indexes? (A sample bcp command is sketched after this list.)

    4. Look more into bcp only?
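
    If options 3 or 4 are pursued, a bcp command-line import from a delimited text file might look roughly like the sketch below. The server, database, table, file path, delimiter, and batch size are placeholders for illustration, not values taken from this thread:

        REM Hypothetical names throughout (MYSERVER, MyDb, dbo.Customer, customer.txt)
        REM -c = character data, -t = field terminator, -S = server, -T = trusted connection,
        REM -b = batch size, -h "TABLOCK" = take a table lock during the load
        bcp MyDb.dbo.Customer in C:\extracts\customer.txt -c -t "|" -S MYSERVER -T -b 50000 -h "TABLOCK"

    Note that bcp itself does not skip indexes on the target table; existing indexes are maintained during the load, which is why large bulk loads are often done with the indexes dropped and then recreated afterwards.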

     


    smv929

  • The age-old answer is "... it depends ..."

    What sort of network bandwidth do you have between the databases?

    How much data are you transferring, thousands or millions?

    Are you transferring all the data, or only a subset of the source data?

    Are you maintaining data within the SQL Database?

    We copy approximately 30 tables from a Progress database on a nightly basis using a DataPump task connecting via ODBC. At the moment we average about 10 million rows copied in total. The destination is a lightly indexed SQL database (just a single primary key in most cases), which is then used as the source for updating our production SQL database. The average time taken is about 3 hours.

    For us, extracting to text files isn't an option as the company that manages our database would charge us for the initial creation of the files and then a set fee every time we wanted updated data.

    Also, I wouldn't go down the BCP path. If you're importing text files, use BULK INSERT, either the T-SQL or the DTS variation. Both give comparable or better performance than BCP.
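
    As a rough illustration of the T-SQL variation, a BULK INSERT for a delimited extract might look like the sketch below. The table name, file path, delimiter, and batch size are placeholders, not values taken from this thread:

        -- Hypothetical target table and extract file; terminators must match the actual extract
        BULK INSERT dbo.Customer
        FROM 'C:\extracts\customer.txt'
        WITH (
            FIELDTERMINATOR = '|',   -- column delimiter in the text file
            ROWTERMINATOR = '\n',    -- end-of-row marker
            BATCHSIZE = 50000,       -- commit in batches rather than one large transaction
            TABLOCK                  -- table-level lock; one of the conditions for minimal logging
        );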

    --------------------
    Colt 45 - the original point and click interface
