November 14, 2017 at 6:37 am
I have a server running SQL Server 2014 with the database engine, SSIS, SSRS, and SSAS all installed. Does anyone have experience, or a thread with best practices, for memory allocation for this type of setup? I know it depends on the workload, meaning when the SSIS packages run versus Reporting/Analysis Services. Most of our heavy database engine work happens at night with backups and jobs. Then a little later in the morning (roughly 2:00 am through 6:00 am) we load the Data Warehouse using SSIS packages. During the day is when SSRS gets hit the heaviest, so I'm not sure a single static setting would work; memory might need to be adjusted across those windows. I just wanted some feedback on this.
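If it comes down to adjusting the engine's ceiling around those windows, what I had in mind is something like the sketch below, run from SQL Agent jobs at the boundary times. The MB values are made up purely for illustration, not recommendations.

-- Sketch only: lower max server memory ahead of the 2:00 am SSIS load so the
-- packages have headroom, then raise it again for the daytime reporting window.
EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;

EXEC sys.sp_configure N'max server memory (MB)', 16384;  -- overnight ETL window (placeholder value)
RECONFIGURE;

-- A second Agent job around 6:00 am would set it back up, e.g.:
-- EXEC sys.sp_configure N'max server memory (MB)', 24576; -- daytime SSRS window (placeholder value)
-- RECONFIGURE;

Since max server memory is a dynamic setting, RECONFIGURE applies it without a restart.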
thanks in advance.
November 14, 2017 at 1:19 pm
We actually have a similar setup where I work, and it took a lot of trial and error to get things working optimally.
Even after we thought we had things set properly, we had an SSIS job go crazy. It had run about 100 times, and every run completed within a few minutes, which was inside our acceptable window. Then one day a user complained that there was no data in the table he was looking at. I checked the SSIS job and it had been running for 8 hours.
SSIS and SSRS operate in their own memory space outside of the database engine. I am not sure how SSAS handles its memory.
Our "window" was roughly 18 GB of memory for our SSIS, SSRS, OS and application overhead.
If you have a similar test environment, I would recommend doing a simulated load on that environment and watching perfmon to see how your memory handles things.
If you don't have a test environment, I would try to get a window in which you can do some testing. I expect your highest-memory SSIS packages will be in your ETL load, so I'd run that while perfmon is capturing data so you can see the memory hit it causes. Then it probably wouldn't hurt to run your other SSIS packages as well, just in case the ETL load isn't actually the biggest memory hog.
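Alongside perfmon, you can also snapshot the engine's side of things from the DMVs while the load runs. This is just a sketch using the standard memory DMVs (nothing custom assumed); run it a few times during the ETL window:

-- Engine memory vs. what's left on the box, from the standard DMVs.
SELECT
    opm.physical_memory_in_use_kb / 1024    AS engine_physical_mb,
    opm.memory_utilization_percentage       AS engine_memory_pct,
    osm.available_physical_memory_kb / 1024 AS box_available_mb,
    osm.system_memory_state_desc            AS box_memory_state
FROM sys.dm_os_process_memory AS opm
CROSS JOIN sys.dm_os_sys_memory AS osm;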
As for SSRS, we don't have any huge reports; most don't even have any visualizations - just table outputs. So it depends on what your reports look like. I would pick a few reports and watch what perfmon says about memory while they run.
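If you want help picking candidates, the report server's execution log can tell you which reports move the most data or spend the longest processing and rendering. This is only a sketch and assumes the default ReportServer catalog database name:

-- Heaviest reports by average bytes returned (also look at processing/rendering time).
SELECT TOP (10)
    ItemPath,
    COUNT(*)            AS executions,
    AVG(TimeProcessing) AS avg_processing_ms,
    AVG(TimeRendering)  AS avg_rendering_ms,
    AVG(ByteCount)      AS avg_bytes,
    AVG([RowCount])     AS avg_rows
FROM ReportServer.dbo.ExecutionLog3
GROUP BY ItemPath
ORDER BY avg_bytes DESC;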
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.