DTS to load huge data into table

  • Hi,

    I am using a DTS package to load data from a text file into a table; it loads a huge number of records.

    I will be running four DTS packages in succession, and the transaction log is getting full.

    What are the steps I should take to handle this scenario?

    Please let me know.

    --NS

  • Are you doing any transformations on the data, or is it a straight load? If all you're doing is importing the data and not doing any validation etc., I'd think about switching to BCP or some other sort of bulk load, as it should give you a performance increase (a rough sketch is below).
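
    One T-SQL way to do a straight bulk load, as a rough sketch only: it assumes a comma-delimited file and a staging table, and the table name and file path here are just placeholders.

    -- Minimal BULK INSERT sketch; adjust the terminators to match your text files
    BULK INSERT dbo.StagingTable
    FROM 'D:\Loads\File1.txt'
    WITH (
        FIELDTERMINATOR = ',',    -- column delimiter in the source file
        ROWTERMINATOR   = '\n',   -- row delimiter
        BATCHSIZE       = 100000  -- commit every 100,000 rows to keep transactions small
    );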

    Also, you could look at switching from Full recovery to Bulk-Logged, but probably the easiest thing for you to implement would be a transaction log backup at the end of each of your DTS jobs (roughly like the sketch below).
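
    For example (the database name and backup path are placeholders, and switching recovery models has point-in-time recovery implications you should be comfortable with first):

    -- Option 1: switch to Bulk-Logged for the load window, then back to Full
    ALTER DATABASE LoadDb SET RECOVERY BULK_LOGGED;
    -- ... run the DTS loads here ...
    ALTER DATABASE LoadDb SET RECOVERY FULL;

    -- Option 2: simply back up the log after each DTS job finishes
    BACKUP LOG LoadDb TO DISK = 'D:\Backups\LoadDb_log.trn';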

    -Luke.

  • Thanks for the reply, Luke. I think a transaction log backup after each DTS run is a good suggestion. Should I shrink the log file as well?

  • Mostly that would depend on how often this operation will run and just how tight you are on drive space. Basically, if you back up your tlog you shouldn't have to worry about it growing out of control. If you continually shrink the log and then it autogrows back out, you'll end up with file fragmentation that can be detrimental to your db.

    Also, depending on what you are doing with your DTS packages, I'd perhaps think about creating one package with your four transforms and a few Execute SQL tasks to do your log backups (something like the sketch below); you just have to make sure you put everything in the appropriate order. To me it's easier to deal with one package than with four that may or may not depend on others running at the same time.
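
    A sketch of what the Execute SQL task body might contain (the database name and path are placeholders; it builds a unique backup file name per run):

    -- Hypothetical Execute SQL task: back up the log to a uniquely named file
    DECLARE @file VARCHAR(260);
    SET @file = 'D:\Backups\LoadDb_log_'
              + CONVERT(VARCHAR(8), GETDATE(), 112)                    -- yyyymmdd
              + '_'
              + REPLACE(CONVERT(VARCHAR(8), GETDATE(), 108), ':', '')  -- hhmmss
              + '.trn';
    BACKUP LOG LoadDb TO DISK = @file;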

    -Luke.

  • A single DTS package has around 15 input files, and the other DTS packages are the same.

    I will try backing up the log.

    Thanks!

  • You should set the batch size on the DTS data pumps to a reasonable size, like 100,000 rows. This will create smaller transactions and help keep your transaction log from growing. You will also need to run frequent transaction log backups, like every 15 minutes, to keep log sizes under control (a scheduling sketch is below).

    There should be no need to shrink the log files if you set reasonable batch sizes, and run frequent transaction log backups.
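
    If you schedule those frequent log backups through SQL Server Agent, a 15-minute schedule could be attached to an existing log-backup job roughly like this (the job name is a placeholder and the job is assumed to already exist):

    -- Hypothetical: add a 15-minute recurring schedule to an existing log-backup job
    EXEC msdb.dbo.sp_add_jobschedule
        @job_name             = N'LoadDb - log backup',  -- placeholder job name
        @name                 = N'Every 15 minutes',
        @freq_type            = 4,   -- daily
        @freq_interval        = 1,   -- every day
        @freq_subday_type     = 4,   -- subday units = minutes
        @freq_subday_interval = 15;  -- every 15 minutes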

  • How do I set the number of rows per batch in the data pump? Please let me know.

  • Set Rows per transaction and Maximum insert commit size in the OLE DB destination task.

    If your database recovery model is Simple or Bulk-Logged, you can also set the Table lock option and the data will be minimally logged (you can check the recovery model with the snippet below).

    But only do this if you're okay with the table being locked for the duration of the load. If the load doesn't take that long, this could be a good option.
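
    A quick way to confirm the recovery model before relying on minimal logging (the database name is a placeholder):

    -- Returns FULL, BULK_LOGGED, or SIMPLE
    SELECT DATABASEPROPERTYEX(N'LoadDb', 'Recovery') AS RecoveryModel;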
