Optimization Assistance for 270 million records

  • I have implemented a package that cycles through about 2,700 folders and imports data from roughly 200 .csv files within each folder, for a total of about 540K files. Each file contains 300-800 records, so the entire directory holds about 270 million records.

    The details of the package are located here.

    I am finding very quickly that the processing time is becoming exorbitant. The first 3% took about 4 hours, so the full run projects to roughly 5.5 days.

    I'm concerned both about the SSIS process, since I have many intermediate steps, and about the SQL admin side, since I'm proposing to import 270 million records into a single table.

    Any assistance in optimizing my process from the thread above will be greatly appreciated!!!

    Thank you so much for your time!
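
    Without seeing the package itself, one common lever for this kind of load is to cut per-file overhead by streaming rows from all files into large batches and bulk-loading each batch, rather than committing per file. A minimal Python sketch of that batching idea (the directory layout is synthetic and the batch size is a hypothetical tuning knob; in SSIS the equivalent would be a Foreach Loop container feeding a data flow with a large commit/batch size):

    ```python
    import csv
    import tempfile
    from pathlib import Path

    def iter_batches(root, batch_size=50_000):
        """Walk every .csv under `root` and yield lists of rows.

        Accumulating rows across files means one bulk load per
        `batch_size` rows instead of one commit per small file.
        """
        batch = []
        for path in sorted(Path(root).rglob("*.csv")):
            with path.open(newline="") as f:
                for row in csv.reader(f):
                    batch.append(row)
                    if len(batch) >= batch_size:
                        yield batch
                        batch = []
        if batch:
            yield batch  # final partial batch

    # Demo on synthetic data: 3 folders x 2 files x 4 rows = 24 rows.
    with tempfile.TemporaryDirectory() as tmp:
        for d in range(3):
            folder = Path(tmp) / f"folder{d}"
            folder.mkdir()
            for fnum in range(2):
                with (folder / f"data{fnum}.csv").open("w", newline="") as f:
                    csv.writer(f).writerows([[d, fnum, r] for r in range(4)])
        batches = list(iter_batches(tmp, batch_size=10))
        total_rows = sum(len(b) for b in batches)
    ```

    Each yielded batch would then go to a single bulk insert (e.g. SQL Server's BULK INSERT or an SSIS OLE DB destination in fast-load mode) so the 540K tiny files don't each pay connection and transaction overhead.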

  • I don't think Phil realized you'd need optimization help right away (or had that many files to process) when he said "feel free to post another thread". Lol.

    No reason this can't be in the same thread. I'll post a response there.

Viewing 2 posts - 1 through 2 (of 2 total)
