SSIS packages slow BCP processing

  • We are using SSIS packages to do a simple data transform and map from one table to another.
    In SQL Profiler this shows up as a bulk insert.
    Checking Performance Monitor, the Bulk Copy Throughput/sec and Bulk Copy Rows/sec counters are running at 20.

    This appears to be very low; it only handles about 4 million rows an hour.

    Any advice?

    This is SQL Server 2016, and the software doesn't use any of the newer features such as In-Memory OLTP or Resource Governor.

    Thank you.
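
    A minimal sketch of how those same counters can be read from T-SQL, assuming nothing beyond the standard DMV (these are cumulative "per second" counters, so sample twice and divide by the interval):

        -- raw values behind PerfMon's Bulk Copy counters;
        -- counter_name is space-padded, hence LIKE instead of =
        SELECT object_name, counter_name, instance_name, cntr_value
        FROM sys.dm_os_performance_counters
        WHERE counter_name LIKE N'Bulk Copy%';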

  • Is it just a straight BULK INSERT command?  Format file?  Local or over the network?  How big is the file?  What about the table it is inserting into?  Indexes, triggers, etc.?  4 million an hour is really slow.
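
    A minimal sketch of that table check (dbo.TargetTable is a hypothetical name; swap in the real destination):

        -- any indexes on the destination table?
        SELECT name, type_desc
        FROM sys.indexes
        WHERE object_id = OBJECT_ID(N'dbo.TargetTable');

        -- any triggers on the destination table?
        SELECT name, is_disabled
        FROM sys.triggers
        WHERE parent_id = OBJECT_ID(N'dbo.TargetTable');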

    For better, quicker answers, click on the following...
    http://www.sqlservercentral.com/articles/Best+Practices/61537/

    For better answers on performance questions, click on the following...
    http://www.sqlservercentral.com/articles/SQLServerCentral/66909/

  • It is taking data from a source table on one SQL Server and, within the SSIS package, doing a mapping only and loading it into another table.
    Nothing complex within the mapping.

    Glad you agree the speed is slow; I am not sure how to increase it within the package.

  • Package -> OLE DB Destination Editor has a one-to-one mapping; in the properties, FastLoadMaxInsertCommitSize is 2147483647 and the data access mode is OpenRowset Using FastLoad.
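
    Two things worth checking there: a commit size of 2147483647 commits the entire load as one batch, and without TABLOCK a bulk load is generally fully logged; in the destination, the FastLoadOptions property is where TABLOCK and ROWS_PER_BATCH can be added. As a hedged baseline (table and column names are assumptions), running the equivalent insert directly in T-SQL shows what the engine itself can do:

        -- dbo.SourceTable / dbo.TargetTable are hypothetical names; with the
        -- target empty and the database in SIMPLE or BULK_LOGGED recovery,
        -- TABLOCK allows minimal logging on a heap
        INSERT INTO dbo.TargetTable WITH (TABLOCK)
        SELECT CAST(SourceCol AS varchar(50)) AS SourceCol
        FROM dbo.SourceTable;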

  • How about the table it is loading into?  Triggers, indexes???  Also, what kind of data are you moving?  Is it large binary objects, all varchar(max) columns?  How long does a simple bcp export to a flat file take, and how big is the file?
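
    A minimal sketch of that bcp baseline (server, database, and path names are hypothetical):

        rem native-format export over a trusted connection; time it and check the file size
        bcp SourceDb.dbo.SourceTable out C:\temp\source.dat -S SourceServer -T -n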

  • Is this a pure bulk insert, or a merge component or something?
    Give us a bit more info on the package design, please.
    Also, are you using lookups to join to the data? If you are, you might be slowing things down looking up keys, so I would ask if that might be the case... what are the settings for the lookup transform? Full cache? Partial? None?

    And are you sure it is using SQLBulkLoad to insert the data?
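
    A minimal sketch of one way to verify that while the package is running (the LIKE filter is an assumption about how the bulk statement surfaces in the request text):

        -- an active fast load typically appears as an "insert bulk ..." request
        SELECT r.session_id, r.command, t.text
        FROM sys.dm_exec_requests AS r
        CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
        WHERE t.text LIKE N'%insert bulk%';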

    Lowell


    --help us help you! If you post a question, make sure you include a CREATE TABLE... statement and INSERT INTO... statement into that table to give the volunteers here representative data. With your description of the problem, we can provide a tested, verifiable solution to your question! Asking the question the right way gets you a tested answer the fastest way possible!

  • Within the source there is a connection manager with a variable value holding the SELECT statement, which includes SELECT CAST(... AS varchar) etc.
    The SELECT statement runs on SQL Server within a few seconds.
    Within the package there is a derived column, then a destination with the available input columns mapped to the available destination columns.
    There is nothing going on with lookups or any other coding.
    The table it goes into has an index and is empty.

    It takes data from the source into an empty table.

    What average values are you getting in PerfMon for Bulk Copy Throughput/sec and Bulk Copy Rows/sec?
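
    Since the target is empty but indexed, one hedged thing to try (index and table names are assumptions; this applies to a nonclustered index, as a disabled clustered index makes the table unreadable, so drop and recreate that instead):

        ALTER INDEX IX_Target ON dbo.TargetTable DISABLE;
        -- ... run the SSIS load, then rebuild once at the end ...
        ALTER INDEX IX_Target ON dbo.TargetTable REBUILD;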

  • Could you just create a staging table on your data warehouse with the same structure as the table on the other server, import the rows you need into the staging table, and then do the transform on the data warehouse server?
    This might improve performance.
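
    A minimal sketch of that approach (all object names and types are assumptions):

        -- staging table mirrors the source structure; no transforms yet
        CREATE TABLE dbo.Staging_Source (SourceCol int NOT NULL);

        -- after the bulk load lands the raw rows in staging,
        -- do the cast set-based on the warehouse server
        INSERT INTO dbo.TargetTable WITH (TABLOCK)
        SELECT CAST(SourceCol AS varchar(50))
        FROM dbo.Staging_Source;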
