3 million and counting!

  • Hi All!

    I have a problem with a one-time DTS package. I have an Access DB with one table in it, containing 3.5 million rows, that I need to DTS into my production SQL box. Each time the DTS runs, it hangs at 3,322,000 rows and I have to kill it, then wait an eternity for the rollback.

    I have increased tempdb's log and data files so they have a clear 300 MB free on both (I noticed they were rapidly running out of space while the DTS was running - is this where the initial data goes??), and the target DB has over 3 GB of data and log file space free.

    The DTS just hangs each time I run it (like it is right now!). Please help!
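
    The space use during the load can be watched with the standard commands below while the package runs (a quick sketch; the target database name is a placeholder).

        -- Log size and percent used for every database, tempdb included.
        DBCC SQLPERF(LOGSPACE)

        -- Allocated vs. free space in the current database; run it in tempdb and
        -- in the target database (placeholder name) to see which one is filling up.
        USE tempdb
        EXEC sp_spaceused

        USE MyTargetDB
        EXEC sp_spaceused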

    I'd split this, or perhaps change the transaction size. You likely have a data issue somewhere that is not being handled well. A rough sketch of the batching idea in T-SQL is below.

    Steve Jones

    steve@dkranch.net
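
    The sketch assumes the rows have first been landed in a staging table (dbo.ImportStage, a made-up name) on the SQL Server, keyed by an integer column RowID, and are then copied to the real table one key range at a time; the production table and column names are placeholders too.

        -- Copy the staged rows to the production table in 50,000-row batches.
        DECLARE @BatchSize int, @LastID int, @MaxID int
        SET @BatchSize = 50000
        SET @LastID = 0
        SELECT @MaxID = MAX(RowID) FROM dbo.ImportStage

        WHILE @LastID < @MaxID
        BEGIN
            -- Each INSERT commits on its own, so a failure around row 3.3 million
            -- only rolls back one small batch instead of the whole 3.5 million rows.
            INSERT INTO dbo.ProductionTable (RowID, Col1, Col2)
            SELECT RowID, Col1, Col2
            FROM dbo.ImportStage
            WHERE RowID > @LastID AND RowID <= @LastID + @BatchSize

            SET @LastID = @LastID + @BatchSize
        END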

    I agree with Steve. Put a batch size on this. Also, for this load I would make sure the table does not have a PK or any indexes on it; you can build those later.

    For the batch size: in DTS, on the last tab, you can specify the fetch size, the commit size, and whether it should run the checks on the data. A rough T-SQL equivalent of both points is sketched below.

    Good luck

    Tom Goltl
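
    The sketch below does the same thing outside the DTS designer, assuming the Access table has first been exported to a flat file; the file path, constraint, index, and column names are made up for illustration.

        -- 1. Drop the PK and indexes so the load is not maintaining them row by row.
        ALTER TABLE dbo.ProductionTable DROP CONSTRAINT PK_ProductionTable
        DROP INDEX ProductionTable.IX_ProductionTable_Col1

        -- 2. Load in committed batches; BATCHSIZE plays the role of the DTS commit
        --    size, and leaving out CHECK_CONSTRAINTS skips the per-row checks.
        BULK INSERT dbo.ProductionTable
        FROM 'C:\import\access_export.txt'
        WITH (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n',
            BATCHSIZE = 50000,
            TABLOCK
        )

        -- 3. Rebuild the key and indexes once all the data is in.
        ALTER TABLE dbo.ProductionTable ADD CONSTRAINT PK_ProductionTable PRIMARY KEY CLUSTERED (RowID)
        CREATE INDEX IX_ProductionTable_Col1 ON dbo.ProductionTable (Col1)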
