SSIS Package in Production Takes Way Too Long to Execute

  • I suspect I am having a memory issue on my production box, but I am not clear as to why this might be happening, so I am hoping someone can shed some light on this issue for me!

    I am running SS2K8 on a server with 4GB of RAM. I have an SSIS package that pulls in approximately 5,000,000 rows and does some very complex calculations and aggregations on the data. The end result of the package is an insert of about 2000 rows into the DB.

    My problem is this... When I run the SSIS package from my personal computer (development), I am able to pull all 5,000,000 rows, perform all of the necessary aggregations, and complete the insert in about 3 minutes. During this process about 1.5 GB of memory is consumed. Now, when I run the exact same SSIS package in the production environment with a job, it takes about 75 minutes to complete.

    I suspect that my issue is memory-related, but I am unclear as to why this could be happening. Is it possible that not enough memory is being allocated to the SQL Server Agent and/or SSIS to complete all of the complex aggregations in memory? Is there some sort of memory setting that I need to adjust to ensure that jobs are allocated enough memory?

    Any help would be greatly appreciated!

    Thanks,

    Aaron

  • Aaron G (4/20/2011)


    I am running SS2K8 on a server with 4GB of RAM. I have an SSIS package that pulls in approximately 5,000,000 rows and does some very complex calculations and aggregations on the data. The end result of the package is an insert of about 2000 rows into the DB.

    :blink: 4GB on a server? That's what we use to run the OS these days. You're memory-starved. Your OS is going to need a gig, absolute-bare-minimum-hope-nothing-goes-wrong-don't-use-it-much, to run properly. Now, assuming your SQL Server chews up and spits out the other 3 gigs, which is very easy to do, you're now in the swapfile. You've traded RAM for disk I/O, and rarely good disk I/O at that.

    5 million rows, with only an INT, is 20 million bytes. That's 20 MB. Not so bad offhand, but my guess is there are a LOT more bytes than just an INT. E.g., a VARCHAR(20) will average 11-12 bytes (10 characters plus 1-2 bytes of overhead), so 55-60 MB. And that's just one column. So, how much of this is in memory at any point...
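
    If you'd rather measure than guess, something along these lines will total up what the source query actually produces. A minimal sketch; dbo.SourceData, Col1, and Col2 are placeholders for your real table and columns:

        -- Estimate how much data the package pulls into memory.
        -- All object names here are illustrative placeholders.
        SELECT
            COUNT(*)                                  AS row_count,
            AVG(DATALENGTH(Col1) + DATALENGTH(Col2))  AS avg_row_bytes,
            SUM(CAST(DATALENGTH(Col1) + DATALENGTH(Col2) AS BIGINT))
                / 1048576.0                           AS total_mb
        FROM dbo.SourceData;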

    My problem is this... When I run the SSIS package from my personal computer (development), I am able to pull all 5,000,000 rows, perform all of the necessary aggregations, and complete the insert in about 3 minutes.

    ... is all of it. Multiple times. Aggregations are blocking transformations: they stop the stream and collect all of the incoming data before continuing. They're known to be massive memory eaters.
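
    If your calculations can be expressed in T-SQL, one common workaround is to push the GROUP BY into the source query so only the aggregated rows ever enter the data flow. A sketch, with placeholder table and column names (dbo.SourceData, CustomerID, Amount):

        -- Aggregate at the source instead of in an SSIS Aggregate
        -- transform; object names are illustrative placeholders.
        SELECT
            CustomerID,
            SUM(Amount) AS TotalAmount,
            COUNT(*)    AS RowCnt
        FROM dbo.SourceData
        GROUP BY CustomerID;  -- ~2,000 rows reach the pipeline, not 5,000,000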

    During this process about 1.5 GB of memory is consumed. Now, when I run the exact same SSIS package in the production environment with a job, it takes about 75 minutes to complete.

    Open up perfmon on the SQL Server when you run this, and trace the Memory counter Page Faults/sec. It's going to skyrocket. A hard page fault is when memory has to be written out to or read back from the page file on disk because physical RAM is out of room.
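
    If perfmon isn't handy, the same pressure is visible from inside SQL Server 2008 via the sys.dm_os_sys_memory DMV (requires VIEW SERVER STATE):

        -- How much physical memory does the OS have left?
        -- system_memory_state_desc reports when available memory is low.
        SELECT
            total_physical_memory_kb / 1024      AS total_physical_mb,
            available_physical_memory_kb / 1024  AS available_physical_mb,
            system_memory_state_desc
        FROM sys.dm_os_sys_memory;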

    I suspect that my issue is memory-related, but I am unclear as to why this could be happening. Is it possible that not enough memory is being allocated to the SQL Server Agent and/or SSIS to complete all of the complex aggregations in memory? Is there some sort of memory setting that I need to adjust to ensure that jobs are allocated enough memory?

    I would double-check the settings on the SQL Server itself and find out what memory usage settings it currently has. I would also look into upgrading that server; it sounds like your desktop has more firepower than the server does. Also, when you're running on your desktop, you're getting two machines' worth of effort: the SQL Server is doing what it does best, finding, transporting, and writing data, and not just for you, while your desktop is handling all of the memory and CPU work. When you offload both sides to the same box, they're going to compete for resources in a box that is that light on memory.
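
    Keep in mind that the SSIS runtime does its work outside the SQL Server buffer pool, so when both run on the same box it can help to cap the engine's memory and leave headroom for the package. A sketch; the 2560 MB figure is purely an illustration for a 4 GB box, not a recommendation:

        -- Check, then cap, SQL Server's memory ceiling.
        EXEC sp_configure 'show advanced options', 1;
        RECONFIGURE;
        EXEC sp_configure 'max server memory (MB)';        -- current value
        EXEC sp_configure 'max server memory (MB)', 2560;  -- example cap only
        RECONFIGURE;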

    As a side note, I'm well aware of many SQL Servers that run in smaller memory spaces and serve perfectly well, even rather large DBs (couple of hundred gigs). They just don't try to process that kind of volume at once.


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

    For better assistance in answering your questions | Forum Netiquette
    For index/tuning help, follow these directions. | Tally Tables

    Twitter: @AnyWayDBA

  • What kind of connection managers are you using? I find the ADO.NET ones to be pretty much unusably slow; the old OLE DB ones work well with large data sets.

  • Thank you for the replies. As it turns out, my problem was related to the server itself. The server is a VMware machine, and there were some issues with the VMware configuration. I had the issue corrected and also had the RAM expanded from 4GB to 6GB, and now things are running much more smoothly.

    Thanks again for the responses!

    Aaron
