May 2, 2017 at 2:54 pm
Hi,
Normally when I set up a new SQL Server, I set the instance's max server memory to 75% of the available memory on the virtual machine. So, if a VM has 8 GB of memory, I'll initially set max server memory to 6 GB.
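For reference, here's roughly what I run to apply that setting (just a sketch; the 6144 MB value matches the 8 GB VM example above):

-- Sketch: cap the instance at 6 GB (6144 MB) on an 8 GB VM.
-- 'max server memory (MB)' is an advanced option, so expose those first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 6144;
RECONFIGURE;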
My coworker is working on an ETL process in SSIS and noticed it was taking several hours to process a few hundred thousand records. In the development environment, he lowered max server memory to 2 GB based on a few articles he'd read about optimizing ETL processes. This change brought the ETL process down to 9 minutes!
Would anyone be able to shed some light on why this might have helped? It seems counter-intuitive to decrease max memory to speed things up.
Thanks in advance,
Mike
Mike Scalise, PMP
https://www.michaelscalise.com
May 2, 2017 at 3:04 pm
It is possible that SSIS was allowed too little memory. SSIS package executions allocate their own memory outside of what is provisioned for the database engine, and the max server memory setting does not limit how much memory SSIS can use. Some processes in SSIS require a lot of memory, especially those with data flows that move large amounts of data. If I had to guess at the situation you describe, SSIS needed more memory than what remained after giving 6 of the 8 GB to the database engine, and freeing up some of that memory for use by SSIS allowed the package to use more physical RAM, resulting in less paging to disk.
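If you want to confirm that, here's a quick sketch using standard memory DMVs (run it while the package is executing to see how much physical RAM is left over for SSIS versus what the engine itself is holding):

-- Sketch: physical memory the OS still has available (what SSIS competes for).
SELECT total_physical_memory_kb / 1024     AS total_mb,
       available_physical_memory_kb / 1024 AS available_mb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;

-- Sketch: what the database engine process itself is using.
SELECT physical_memory_in_use_kb / 1024 AS engine_mb_in_use,
       page_fault_count
FROM sys.dm_os_process_memory;

If available_mb bottoms out and the package starts spilling buffers to disk, lowering max server memory as your coworker did is exactly the kind of change that helps.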
Tim Mitchell, Microsoft Data Platform MVP
Data Warehouse and ETL Consultant
TimMitchell.net | @Tim_Mitchell | Tyleris.com
ETL Best Practices
May 4, 2017 at 4:57 am
Tim,
That makes a lot of sense. I never considered that SSIS uses its own memory allocation, though since it's a separate application altogether, it makes sense that it would.
Thanks for your input!
Mike
Mike Scalise, PMP
https://www.michaelscalise.com