March 14, 2014 at 12:34 pm
We have run into an issue on a dedicated SSAS 2012 SP1 server where the allocated memory is not being utilized, causing slowness in connections and queries.
Total memory on the server is 512GB, and after startup, utilized memory climbs to a maximum of about 60GB and stops there. Checking Resource Monitor, msmdsrv.exe is only using around 39GB overall. Based on the current properties, I expected that to be closer to 330GB. Am I missing something in the settings or configuration that should be changed?
Version: SQL Server 2012 SP1 Enterprise (11.0.3000)
OS: Windows Server 2012 Datacenter - Fully patched and up to date
Databases: 2 Tabular models
Server: 512GB RAM
Current memory configuration:
Hard Memory Limit - 0 (Default)
LowMemoryLimit - 65% (Default)
TotalMemoryLimit - 95% (Default is 80)
VertiPaqMemoryLimit - 60% (Default)
VertiPaqPagingPolicy - 1 (Default)
MemoryHeapType - 2 (Default)
Thank you!
March 14, 2014 at 1:51 pm
Are you sure SSAS actually needs the memory?
SSAS does not grab and hold memory the way SQL Server does; it only takes the memory necessary to keep the model in memory.
LowMemoryLimit is not a minimum amount of memory to allocate (512GB * 65% ≈ 330GB). It specifies the threshold at which Analysis Services begins cleaning caches to lower memory usage.
You can find more details here: http://msdn.microsoft.com/en-us/library/dn393915.aspx
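To make the threshold arithmetic concrete, here is a small sketch of how SSAS memory-limit properties resolve to bytes. Per the SSAS documentation, a property value between 0 and 100 is interpreted as a percentage of total physical memory, while a value above 100 is treated as an absolute byte count. The script below is illustrative only (the property names and percentages are taken from the original post, applied to a 512GB server):

```python
# Illustrative sketch: how SSAS memory-limit properties resolve to bytes.
# Values between 0 and 100 are percentages of total physical memory;
# values above 100 are interpreted as absolute bytes.

TOTAL_RAM_BYTES = 512 * 1024**3  # the 512GB server from the post

def effective_limit_bytes(value: float, total_bytes: int = TOTAL_RAM_BYTES) -> int:
    """Resolve an SSAS memory property value to an effective byte count."""
    if 0 <= value <= 100:
        return int(total_bytes * value / 100)
    return int(value)  # above 100, the value is already in bytes

# Settings from the original post
limits = {
    "LowMemoryLimit": 65,       # cleaner starts trimming caches here
    "TotalMemoryLimit": 95,     # cleaner works aggressively above this
    "VertiPaqMemoryLimit": 60,  # VertiPaq threshold (with VertiPaqPagingPolicy=1)
}

for name, value in limits.items():
    gb = effective_limit_bytes(value) / 1024**3
    print(f"{name}: ~{gb:.0f} GB")
```

Note that these are ceilings and cleaner thresholds, not reservations: if the models only occupy ~50GB, SSAS will sit well below LowMemoryLimit and never approach 330GB.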
March 15, 2014 at 3:52 pm
Thanks for pointing me in the right direction. That whitepaper has helped quite a bit. It does look like the tabular models, currently, are only about 50GB in size total.
After further investigation it looks like the Excel workbooks that are querying the models are hitting a tipping point of performance after selecting 6-7 slicers, at which point the data refresh takes 2-3 minutes. We will continue to look into this.
Thank you!