June 11, 2020 at 7:35 pm
So babysitting SSAS is only a part-time gig; we have over 1000 SQL Servers, so pardon me being a rookie. I'm working with a server that has the SQL Server instance set at 32 GB of memory (the box has 128). Currently I have the SSAS side set up with the recommendations in this link: http://byobi.com/2014/04/ssas-memory-configurations-for-common-architectures/. I'm still getting "the JSON DDL request failed with the following error: The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.. at Microsoft.AnalysisServices.Xmla.XmlaClient.CheckForSoapFault(XmlReader". Has anyone hit this wall before? Does going to 2017 help any with these issues?
June 12, 2020 at 5:20 pm
A few things I would check before looking at upgrading from 2016 to 2017. My expectation is that the upgrade won't help, but it might if it is a 32-bit vs 64-bit problem and you install the 64-bit version.
Something to note about the article you linked: it is giving suggested values, not recommended values. The author does not know what your workload looks like. Plus, the article seems to contradict itself. It says you should set the HardMemoryLimit on the SSAS instance to (Total Physical Memory) – (Memory for OS) – (SQL database instance Min). Then in the example it shows a system with 128 GB of memory, 6 GB for the OS, and 10 GB for the database MIN memory, and says you should set the hard memory limit to 90 GB, even though 128 - 6 - 10 is 112. The math they actually used in their example was 128 (total memory) - 6 (OS memory) - 32 (SQL MAX memory, not MIN memory) = 90 GB.

It also says "You can leave the SQL database instance Max memory setting at the default of 2 TB since SQL server is good about sharing", which I completely disagree with. SQL is NOT good about sharing; once it has the memory, it is very reluctant to give it back to the OS. For that particular system, they are also assuming 6 GB is enough for the OS and anything else running on the box (backup, antivirus, World of Warcraft, etc.). Your environment may be different.
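To make the article's contradiction concrete, here is a minimal Python sketch of the two calculations side by side. The numbers are the linked article's example values, not recommendations for any real box:

```python
# All values in GB, taken from the article's worked example.
total_physical = 128
os_reserved = 6
sql_min_memory = 10   # SQL instance MIN server memory
sql_max_memory = 32   # SQL instance MAX server memory

# Formula as stated in the article's text: total - OS - SQL *min*
limit_as_stated = total_physical - os_reserved - sql_min_memory
print(limit_as_stated)   # 112 GB

# Math the article actually used in its example: total - OS - SQL *max*
limit_as_computed = total_physical - os_reserved - sql_max_memory
print(limit_as_computed) # 90 GB - the value the article recommends
```

Same formula, two different answers, because the text says MIN memory while the example plugs in MAX memory.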
First thing I'd check is how much memory is actually being USED on that system when the error is thrown. If it is using ALL of the 90 GB you have given it and that isn't enough, you may just need more memory in the server, or to look at how you are loading your SSAS data, or you may need to drop the max memory on the SQL instance and bump the hard memory limit to something higher than 90.
Now if you still have most of that free, I'd check whether you installed the 32-bit version of things or the 64-bit. 32-bit has a max of 2 GB of memory per process, which is not that hard to hit when working with SSAS or SSIS. Offhand, I don't remember when SQL Server stopped offering a 32-bit version, but if you put a 32-bit SSAS install on there, you will likely hit that error without actually running out of memory on the system - you just ran out of memory for the process.
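As a quick illustration of the 32-bit vs 64-bit distinction: a process's pointer size tells you its bitness. This hedged Python sketch checks the Python process it runs in, not SSAS (for SSAS you would check the install or Task Manager), but it shows the idea:

```python
import struct

# Pointer size in bytes: 4 for a 32-bit process, 8 for a 64-bit process.
bits = struct.calcsize("P") * 8
print(f"This process is {bits}-bit")

# A 32-bit user-mode process on Windows is normally capped at 2 GB of
# address space (up to 4 GB if large-address-aware), no matter how much
# RAM the machine has - which matches the error text quoted in the post.
if bits == 32:
    print("Per-process address space is limited to ~2 GB")
```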
The last thing I'd be curious about is does this process ever succeed or is it always failing? If it does ever succeed, I'd be curious to know how much memory it uses on successful runs. It could be it needs more than 90 GB of memory in some cases and uses most of that 90 GB in others.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
June 12, 2020 at 6:41 pm
So this is the 64-bit Enterprise Edition. The SQL instance, which is doing nothing at this point, has a max of 28 GB, but it never hits this anyhow. Feel free to correct me, but when the tabular models are defined in memory (they have both a dev and a UAT on the same box), the defined ones are created when the tabular instance comes up. After the instance comes up and you click on the databases on the tabular instance, it can take up to 35 minutes for the Databases folder to expand in SSMS. So then when the first job step tries to run, there is no memory available and this error comes up. These databases are not that large, but they may not be done right.
June 12, 2020 at 7:24 pm
Since you are on 64-bit and Enterprise Edition, it is likely that the system is running out of memory to process the data.
I would check things out on the server and see if the memory is actually being fully used up or if it stops at a lower number.
When I say check the memory, I mean the memory used on the physical server; also make sure the max memory limit for SSAS is under that value as well as high enough to process the data.
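One SSAS quirk worth double-checking while you're in there: in the server properties (msmdsrv.ini), memory limit settings such as HardMemoryLimit, TotalMemoryLimit, and LowMemoryLimit are interpreted as a percentage of total physical memory when the value is 100 or below, and as an absolute byte count when it is above 100. A hedged Python sketch of that conversion (the property names are real SSAS settings; the numbers here are made up):

```python
def ssas_limit_to_bytes(configured_value: float, total_physical_bytes: int) -> int:
    """Interpret an SSAS memory limit setting (e.g. HardMemoryLimit).

    Values <= 100 are a percentage of total physical memory;
    values > 100 are an absolute number of bytes.
    """
    if configured_value <= 100:
        return int(total_physical_bytes * configured_value / 100)
    return int(configured_value)

# Example: a 128 GB box with HardMemoryLimit set to 70 (i.e. 70%)
total = 128 * 1024**3
print(ssas_limit_to_bytes(70, total) / 1024**3)  # 89.6 (GB)

# Same box, but the value 90 entered as "90 GB" would have to be in bytes:
print(ssas_limit_to_bytes(90 * 1024**3, total) / 1024**3)  # 90.0 (GB)
```

Entering "90" intending 90 GB would actually mean 90 percent, so it's worth confirming which unit the instance is really configured with.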
If you cannot process the first job step due to running out of memory, are you able to simplify the job step or possibly break it into multiple steps in a way that allows you to use less memory?
Another thing to try, which likely won't help the memory issue but may help the 35 minutes to expand the folder in SSMS: are you using SSMS 2016 to connect to SSAS 2016? I know with SSIS 2012 (for example), if you connect from SSMS 2012 it works in reasonable time, but if you connect with SSMS 18.5 it is incredibly slow and has given me odd errors before, especially if you try loading an ISPAC file onto it. It will claim it is successfully loaded, but when you go to run it you get errors due to it being in the wrong version format. If you are using SSMS 2016, that is likely not the issue, but if you aren't, it COULD be that SSMS is doing something wrong...
I may be wrong on all of this too. My experience with SSAS is limited (we don't use it much) and I've not run across that error before.