Log file grows huge after each run of SSIS packages

  • I wasn't sure whether to post this in the SSIS forum or here, but have plumped for here...

I have a DB which is only 7.5GB, but after running several SSIS packages which perform a daily import of data from 6 different databases, and do some processing of this data, the log file grows from 0.5GB to about 17GB in the half hour it takes the packages to run.

    The database is backed up nightly and the log is truncated, but there are space issues with this huge and sudden growth of the log file, as the server is used for many different databases and backups throughout the day.

    Is there anything I can do to prevent the log growing at this rate while the packages are running? Do I need to look at the packages rather than the database?

    TIA

  • robin-892784 (10/20/2009)



If the growth of the log file varies on a daily basis, I would size it to the 17GB. You haven't mentioned whether the log file is shrunk nightly; if the log file is sized to the maximum that the daily imports grow it to, there shouldn't be any issues regarding log growth, as long as you have the storage capacity. Alternatively, you could change the package so that it batches the transactions, and back up the log between batches to minimise the log file growth.

--------------------------------------------------------------------------------------
Recommended articles on how to help us help you, and on commonly asked questions:

Forum Etiquette: How to post data/code on a forum to get the best help by Jeff Moden
Managing Transaction Logs by Gail Shaw
How to post Performance problems by Gail Shaw
Help, my database is corrupt. Now what? by Gail Shaw
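The batching-plus-log-backup idea above can be sketched in T-SQL. This is a minimal illustration, not the poster's actual package: the table names, columns, database name, and backup path are all placeholders.

```sql
-- Hypothetical sketch: load rows in batches and back up the log between
-- batches, so log space is reused instead of the file growing.
-- All object names and the backup path are placeholders.
DECLARE @BatchSize int = 50000;

WHILE 1 = 1
BEGIN
    INSERT INTO dbo.DestTable (Col1, Col2)
    SELECT TOP (@BatchSize) s.Col1, s.Col2
    FROM dbo.StagingTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.DestTable AS d
                      WHERE d.Col1 = s.Col1);

    IF @@ROWCOUNT = 0 BREAK;

    -- Only relevant under FULL or BULK_LOGGED recovery; under SIMPLE the
    -- log clears itself at checkpoint and this step is unnecessary.
    BACKUP LOG MyImportDB TO DISK = N'X:\Backups\MyImportDB_log.trn';
END;
```

The key point is that each committed batch releases its log records for reuse, so the peak log size is bounded by one batch rather than the whole import.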

What other processes are occurring? Reindexing?

Yup, I had said that the log gets truncated nightly... I will probably size the log to 20GB, get some more disk space, and truncate nightly, although it still feels like something is wrong in the SSIS packages for the log to grow so rapidly.

  • Is it possible to change the recovery model on the database to simple? In other words, is it possible to re-run the imports and processing?

    Also, you could put the log file on a different drive than the database.
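Checking and switching the recovery model, as suggested above, is a one-liner each; the database name here is a placeholder.

```sql
-- Check the current recovery model (database name is a placeholder).
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'MyImportDB';

-- Switch to SIMPLE so the log truncates at checkpoint instead of waiting
-- for a log backup. Note this breaks the log backup chain, so point-in-time
-- restore is lost until a new full backup is taken.
ALTER DATABASE MyImportDB SET RECOVERY SIMPLE;
```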

  • What are the packages using? Data Flow with ADO or OLEDB? Loading from what source?

If you are taking regular log backups, then I wouldn't truncate the log each night, as that would just create fragmentation, wouldn't it? There's no need to truncate if the log has a set size and it's sticking to it.
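To see how full the log actually gets during the run, and what (if anything) is preventing it from being reused, the standard checks are:

```sql
-- Reports log file size and percent used for every database on the instance.
DBCC SQLPERF (LOGSPACE);

-- Shows, per database, what is currently preventing log truncation
-- (e.g. LOG_BACKUP, ACTIVE_TRANSACTION, NOTHING).
SELECT name, log_reuse_wait_desc
FROM sys.databases;
```

Running the first query at intervals during the import would show whether the 17GB is one long transaction or steady accumulation.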

  • Shark Energy (10/20/2009)



    The packages are using data flow with OLEDB connections and all of the sources are SQL Server 2000 or 2005, mainly the former, and also on the same server as the destination database.

    Thanks for the advice on the log

  • jbuttery (10/20/2009)



    The recovery model is simple...I'm not sure what you mean about re-running the imports and processing.

It was a simplistic question, to see whether the source databases remain static after you import the data, so that you could do the ETL thing again if needed.

Review the SSIS packages: on the OLEDB destination, validate that you have set a batch size and a commit size. Setting these should reduce actual transaction log usage, since they define how many rows are processed before being committed to the database.

    Jeffrey Williams
    “We are all faced with a series of great opportunities brilliantly disguised as impossible situations.”

    ― Charles R. Swindoll

    How to post questions to get better answers faster
    Managing Transaction Logs
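The commit-size effect described above can also be seen outside SSIS with plain T-SQL BULK INSERT, where BATCHSIZE plays the same role: the engine commits every N rows, so the log can be cleared between commits rather than held by one giant transaction. The file path and table name below are placeholders, not from the thread.

```sql
-- Hypothetical illustration of a bounded commit size: BATCHSIZE makes the
-- engine commit every 10000 rows, so under SIMPLE recovery the log records
-- for each batch are released at the next checkpoint.
BULK INSERT dbo.DestTable
FROM N'X:\Imports\daily_feed.dat'
WITH (BATCHSIZE = 10000, TABLOCK);
```

In the OLE DB Destination editor the equivalent knobs are "Rows per batch" and "Maximum insert commit size" under the fast-load data access mode; leaving the commit size at 0 means one commit for the entire load, which matches the log growth pattern described in the original post.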
