October 3, 2013 at 7:22 am
I have an SSIS package I developed using the Import/Export Wizard as a base. It truncates certain tables in our Dev environment, then repopulates them with Production data. The package runs fine from my desktop, and the job on the Production server does not report failures, but the job run doesn't always fully populate the tables in Dev.
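For reference, the truncate step is just a plain T-SQL statement in an Execute SQL Task, along these lines (the table name here is illustrative):

    TRUNCATE TABLE dbo.Staging_X1;  -- one statement per staging table being refreshed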
For instance, this is what I got from the log file:
DataFlow: 2013-10-02 09:31:05.10
Source: Load Destinations 0-4
Component "Destination 4 - Staging_X1" (1258) will receive 3744 rows on input "Destination Input" (1271)
End DataFlow
Staging_X1 in Production has 16,802,593 rows. Dev ends up with less than a third of that, but more than the 3,744 rows the message above states (so I'm thinking the log entry is just reporting the rows per batch/buffer, not the total).
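For anyone wanting to sanity-check the counts themselves, something like this works (PRODSRV is a placeholder for a linked server name; adjust the database and table names to your environment):

    -- Compare row counts between Production and Dev for the affected table
    SELECT
        (SELECT COUNT_BIG(*) FROM PRODSRV.OurDb.dbo.Staging_X1) AS ProdRows,
        (SELECT COUNT_BIG(*) FROM dbo.Staging_X1)               AS DevRows;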
Has anyone seen this behavior before?
These are OLE DB Source and Destination components with nothing between them. The destination uses Fast Load (AccessMode: OpenRowset Using FastLoad) with Keep identity, Table lock, and Check constraints enabled. Rows per batch is not set, and Maximum insert commit size is 2147483647.
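As a rough T-SQL analogue of those destination settings (purely illustrative; the package itself uses the OLE DB Destination, and the file path here is made up):

    BULK INSERT dbo.Staging_X1
    FROM '\\fileserver\extract\staging_x1.dat'  -- hypothetical data file
    WITH (
        TABLOCK,            -- Table lock
        CHECK_CONSTRAINTS,  -- Check constraints
        KEEPIDENTITY        -- Keep identity
        -- no BATCHSIZE specified: the whole load commits as one batch,
        -- analogous to a Maximum insert commit size of 2147483647
    );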