March 20, 2009 at 12:33 pm
Hi,
I am using a DTS package to load data from a text file into a table, and it loads a huge number of records.
I will be running four of these DTS packages in succession, and the transaction log is filling up.
What are the steps I should take to handle this scenario?
Please let me know.
--NS
March 20, 2009 at 1:27 pm
Are you doing any transformations on the data, or is it a straight load? If all you're doing is importing the data without any validation, I might think about switching to BCP or some other sort of bulk load, as it should give you a performance increase.
Also, you could look at switching from FULL recovery to BULK_LOGGED, but probably the easiest thing for you to implement would be a transaction log backup at the end of each of your DTS jobs.
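If it really is a straight load, a plain bcp call is about as simple as it gets. Something along these lines (just a sketch, the server, database, table and file names here are made up, and you'd adjust the delimiter to match your files):

bcp StagingDB.dbo.ImportTable in C:\Loads\file1.txt -c -t"," -T -S MyServer -b 100000

The -b switch commits every 100,000 rows as its own batch, so the log never has to hold the whole file in one transaction.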
-Luke.
March 20, 2009 at 1:34 pm
Thanks for the reply, Luke. I think a transaction log backup after each DTS run is a good suggestion. Should I shrink the log file as well?
March 20, 2009 at 1:40 pm
Mostly that would depend on how often this operation will run and how tight you are on drive space. Basically, if you back up your tlog you shouldn't have to worry about it growing out of control. If you continually shrink the log and it autogrows back out, you'll end up with file fragmentation that can be detrimental to your db.
Also, depending on what you are doing with your DTS packages, I'd think about creating one package with your four transforms and a few Execute SQL tasks to do your log backups. Just make sure you put everything in the appropriate order. To me it's easier to deal with one package than with four that may or may not depend on each other and have to run at the same time.
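The Execute SQL task itself only needs a plain log backup statement, something like this (the database name and backup path are just placeholders):

BACKUP LOG StagingDB TO DISK = N'D:\Backups\StagingDB_log.trn';
-- run one of these after each transform so the log space can be reused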
-Luke.
March 20, 2009 at 1:52 pm
Each DTS package has around 15 input files, and the other DTS packages are similar.
I will try backing up the log.
Thanks!
March 20, 2009 at 2:17 pm
You should set the batch size on the DTS data pumps to a reasonable value, such as 100,000 rows. That will produce smaller transactions and help keep your transaction log from growing. You will also need to run frequent transaction log backups, say every 15 minutes, to keep the log size under control.
There should be no need to shrink the log files if you set reasonable batch sizes and run frequent transaction log backups.
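For comparison, this is the same idea in plain T-SQL if you ever move the load out of the datapump, with BATCHSIZE committing every 100,000 rows instead of one giant transaction (the table, file and delimiters are assumptions):

BULK INSERT dbo.ImportTable
FROM 'C:\Loads\file1.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', BATCHSIZE = 100000);

-- and to watch how full the log is while the load runs:
DBCC SQLPERF(LOGSPACE);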
March 20, 2009 at 4:00 pm
How do I set the rows per batch on the data pump? Please let me know.
March 20, 2009 at 5:04 pm
Set the "Rows per transaction" and "Maximum insert commit size" options on the OLE DB destination task.
If your database recovery model is Simple or Bulk-Logged, you can also set the TABLOCK option and the data will be minimally logged.
But only do this if you're okay with the table being locked for the duration of the load. If the load doesn't take that long, this could be a good option.
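Rough outline of the recovery model switch, if you go that route (the database name and backup path are placeholders, and check with whoever owns the backups before changing the model):

-- check the current recovery model
SELECT DATABASEPROPERTYEX('StagingDB', 'Recovery');

-- switch to bulk-logged for the load window
ALTER DATABASE StagingDB SET RECOVERY BULK_LOGGED;

-- ... run the load with the TABLOCK option here ...

-- switch back and take a log backup so the log chain stays intact
ALTER DATABASE StagingDB SET RECOVERY FULL;
BACKUP LOG StagingDB TO DISK = N'D:\Backups\StagingDB_log.trn';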