SSIS - SQL to Oracle... more rows moved to Oracle than the source query's record count?

  • I have a data flow task that moves data from a SQL Server view to an Oracle database. The view has 88k rows (verified by running a count(*) on it), but during execution the package shows hundreds of thousands of rows coming from the SQL data source and moving to the Oracle database. How is this possible? Why is it not moving 88k rows and stopping? I'm not an SSIS newbie, but I'm no pro either; I'm familiar with the SSIS components. Any help is appreciated. Thanks!
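    A quick way to narrow this down (dbo.SourceView and the key column Id below are placeholders for the real names, since the thread doesn't give them):

        -- If total_rows exceeds distinct_keys, the view itself emits
        -- duplicates; if both stay near 88k while the package reports far
        -- more, the inflation is happening inside the data flow or the
        -- source data is changing while the package runs.
        SELECT COUNT(*)           AS total_rows,
               COUNT(DISTINCT Id) AS distinct_keys
        FROM   dbo.SourceView;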

  • chugghin - Thursday, April 13, 2017 1:24 PM

    Are you able to dump the rows into a dummy table in Oracle and take a look at what they contain after running the package? Perhaps that would give you some clues as to what is happening (see the duplicate-check sketch below).

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin
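    For illustration, once the rows have landed in a scratch table, a check along these lines would show whether the extras are duplicates (STAGE_DUMP and ID are assumed names, not taken from the package):

        -- Oracle-side check: any key appearing more than once means the
        -- same source row arrived multiple times.
        SELECT id, COUNT(*) AS copies
        FROM   stage_dump
        GROUP  BY id
        HAVING COUNT(*) > 1
        ORDER  BY copies DESC;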

  • Ordinarily, that's what I'd do, but it bombs out at around 400k records with an error stating it could not add a row to the buffer (using VS 2012 and SQL 2008 R2). I've maxed out the memory buffer and pushed the row limit to over a million rows, and it still crashes. The view pulls close to 40 columns, and most of them are wide varchar(255) columns (whether that's necessary or not is beyond me; as usual, I inherited this package).
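    As a rough worst case, 40 columns of varchar(255) is about 40 × 255 ≈ 10 KB per row, so the default 10 MB data flow buffer holds only around 1,000 such rows. Measuring the widest row the view actually produces shows how pessimistic that estimate is (col1..col3 are placeholder column names; extend the sum to cover all ~40 columns):

        -- Widest actual row, in bytes, coming out of the view.
        SELECT MAX( COALESCE(DATALENGTH(col1), 0)
                  + COALESCE(DATALENGTH(col2), 0)
                  + COALESCE(DATALENGTH(col3), 0) ) AS max_row_bytes
        FROM   dbo.SourceView;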

  • chugghin - Thursday, April 13, 2017 1:42 PM

    Have you played around with the Rows per Batch and Maximum Insert Commit Size settings? That might get you over the line in terms of memory (some illustrative starting values follow below).

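    Assuming the package writes through an OLE DB Destination (those two properties live on its editor), the values below are illustrative starting points to tune from, not figures from this thread:

        Rows per Batch (OLE DB Destination):             10000
        Maximum insert commit size (OLE DB Destination): 10000
        DefaultBufferMaxRows (Data Flow Task):           10000
        DefaultBufferSize (Data Flow Task):              10485760  -- 10 MB, the default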

