May 12, 2014 at 2:04 pm
What is the best practice for determining the amount of memory needed on an SSIS server for big data?
May 12, 2014 at 2:18 pm
A lot, and then some; seriously, the more the merrier.
😎
May 12, 2014 at 2:24 pm
What a vague question 🙂
What do you want to do in SSIS that is linked to big data?
It also depends a bit on your design of the SSIS packages: if you have a lot of packages running at the same time, you obviously need more memory.
If you use a lot of semi-blocking and fully blocking components (such as the Sort transformation), you need even more memory.
If you do lookups with full cache against large tables, you also need a lot of memory.
It also depends on the amount of data that you transfer.
But as Eirikur mentioned, there is no such thing as too much when it comes to memory 😀
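For example, on the full-cache lookup point: a minimal sketch (dbo.DimCustomer and the column names are just placeholders) of trimming the lookup query so only the key and the returned column get cached, instead of the whole table:

-- A full-cache lookup caches the entire result set in memory,
-- so select only what the lookup actually needs (names are placeholders):
SELECT CustomerKey,          -- join key
       CustomerAlternateKey  -- the single column the lookup returns
FROM dbo.DimCustomer;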
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
May 12, 2014 at 2:28 pm
Well, I just asked our DBA how much memory the SSIS server had and he said 8 GB, with 4 GB allotted to SQL Server. I have improved a package from a runtime of 2.5 hours to 45 minutes, but I noticed the memory was pegged out. 45 minutes is fine because the package runs at night, but I guess I just have my inner geek screaming for more memory...lol
May 12, 2014 at 2:35 pm
dndaughtery (5/12/2014)
Well, I just asked our DBA how much memory the SSIS server had and he said 8 GB, with 4 GB allotted to SQL Server. I have improved a package from a runtime of 2.5 hours to 45 minutes, but I noticed the memory was pegged out. 45 minutes is fine because the package runs at night, but I guess I just have my inner geek screaming for more memory...lol
If SQL Server already has 4 GB and the OS needs some memory as well, SSIS doesn't have a lot to work with.
45 minutes is long in my opinion, and you can probably speed it up by adding memory, but it also depends a lot on the design of the package itself.
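As a rough sketch (3072 is just an illustrative number, not a recommendation for your workload), you could check and cap SQL Server's memory so SSIS and the OS have some headroom:

-- Check the current cap on SQL Server memory:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';

-- Example only: cap SQL Server at 3 GB to leave room for SSIS buffers and the OS.
EXEC sp_configure 'max server memory (MB)', 3072;
RECONFIGURE;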
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
May 12, 2014 at 2:42 pm
I agree. I enabled as many transformations as possible to run in parallel, removed a lot of Merge Joins and Sorts, and just did the join in the OLE DB source instead. I'm trying to get jumbo frames enabled as well, but with only 4 GB I'm not sure there's anything else I can do to speed it up.
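For example, something like this in the OLE DB source (table and column names made up for illustration) replaced a Merge Join plus two Sort transformations:

-- One source query does the join and the sort on the server:
SELECT o.OrderID,
       o.OrderDate,
       c.CustomerName
FROM   dbo.Orders    AS o
JOIN   dbo.Customers AS c ON c.CustomerID = o.CustomerID
ORDER BY o.OrderID;  -- then mark the source output as sorted
                     -- (IsSorted / SortKeyPosition) in the advanced editor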
May 12, 2014 at 3:13 pm
dndaughtery (5/12/2014)
I agree. I enabled as many transformations as possible to run in parallel, removed a lot of Merge Joins and Sorts, and just did the join in the OLE DB source instead. I'm trying to get jumbo frames enabled as well, but with only 4 GB I'm not sure there's anything else I can do to speed it up.
With this little memory, it is important to limit the complexity of the packages and have them act more as a pipeline. As you mention, sort at the source, but also do all data conversions at either the source or the destination, and use staging tables with SQL joins and the like. Jumbo frames are hardly going to help if the content is spilled to disk during the transformation.
Quick question: is the SQL Server on that box also the main target for the SSIS packages? If so, try to get shared memory working; it saves a lot of load and time.
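A quick sketch of how to check and force it (MYSERVER and MyDB are placeholders):

-- See which transport current connections use;
-- local ones should show 'Shared memory':
SELECT session_id, net_transport
FROM sys.dm_exec_connections;

-- In the package's connection manager, the lpc: prefix forces shared memory
-- for a same-box connection (connection string shown as a comment):
-- Data Source=lpc:MYSERVER;Initial Catalog=MyDB;Integrated Security=SSPI;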
😎