SSAS cube full process memory usage

  • Hi

    I want to find out whether it is normal for a cube full processing task to use more than 10 times its size in memory.

    Some details:

    Cube type - Tabular

    Cube size - 1.5GB

    Source database - 11GB

    When I do a full process of the cube, memory consumption bombs out the server. At the point it fails, SSAS has shot up from 4GB (idle) to 24GB.

    According to all the information I could find, this should not go above 3 times the cube size. Should I troubleshoot this or just request extra RAM?
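
    To put numbers on it, the 3-times guideline would predict roughly:

        3 x 1.5GB (cube size) = ~4.5GB needed for processing

    whereas what I actually see is:

        24GB (peak) - 4GB (idle) = 20GB consumed by the full process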

    Thanks in advance


  • As ever with this sort of thing, the answer is very much "it depends".

    You could look at the advanced memory properties of the AS instance. More information on them here: https://msdn.microsoft.com/en-us/library/ms174514%28v=sql.110%29.aspx

    Be wary of changing these around because here be dragons.

    Have a look here too: https://www.sqlbi.com/articles/heap-memory-settings-for-analysis-services-tabular-2012-2014/
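
    For reference, these are the sort of settings I mean. A rough sketch of the relevant part of the <Memory> section of msmdsrv.ini, with what I believe are the usual defaults (values below 100 are percentages of total physical RAM, values above 100 are bytes - see the first link for the exact behaviour per version):

        <Memory>
          <LowMemoryLimit>65</LowMemoryLimit>
          <TotalMemoryLimit>80</TotalMemoryLimit>
          <HardMemoryLimit>0</HardMemoryLimit>
          <VertiPaqMemoryLimit>60</VertiPaqMemoryLimit>
          <VertiPaqPagingPolicy>1</VertiPaqPagingPolicy>
        </Memory>

    The second link also covers the heap settings (HeapTypeForObjects and MemoryHeapType), which are separate from these limits.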



  • Thanks for the reply. I will read that blog shortly.

    Do you know how much calculated measures influence memory usage? The table in question has 15 million records, and 60 of the fields are calculated measures.

    According to my limited knowledge, these fields should be calculated before the data is imported into the cube. What is the best practice?

    Regarding the settings: I have tried the different suggestions from blogs and Microsoft, and the memory gets maxed out every time.


  • In SSAS Tabular, calculated columns are persisted (unlike in SSAS Multidimensional, which runs calculations on the fly at query time), and if they are complex they can put a heavy load on processing resources. The following describes some techniques you can use to mitigate this behaviour: http://www.sqlbi.com/articles/calculated-columns-and-measures-in-dax/
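
    As a quick sketch of the difference (the table and column names here are made up for illustration), a calculated column such as

        Sales[Margin] = Sales[SalesAmount] - Sales[TotalCost]

    is evaluated for every row of the table during processing and the results are stored in the model, so it adds to both memory and processing time. The equivalent measure

        Total Margin := SUM ( Sales[SalesAmount] ) - SUM ( Sales[TotalCost] )

    costs nothing at processing time; it is only evaluated at query time, in the current filter context.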



  • Thanks again. If I understand it correctly, the guru is saying that measures will not consume any memory at process time. This brings me back to my original question: why would a 9 million record table consume more than 17GB of memory to do a full process? Is there some way of checking the internals to determine the cause? Is this normal?

    I have tested the table without any measures and it processes fine. In this test it consumes only 2GB of memory to process.

    Sorry for all the questions; I do not have much experience in the SSAS admin field.


  • OomBoom (3/23/2016)


    Thanks again. If I understand it correctly, the guru is saying that measures will not consume any memory at process time. This brings me back to my original question: why would a 9 million record table consume more than 17GB of memory to do a full process? Is there some way of checking the internals to determine the cause? Is this normal?

    I have tested the table without any measures and it processes fine. In this test it consumes only 2GB of memory to process.

    Sorry for all the questions; I do not have much experience in the SSAS admin field.

    Well, no. What he is saying is that if you have a calculated column, it gets computed during processing and then stored, but a measure does not, because it is evaluated in context at query time. He is also saying that the two can look similar and it's easy to get them confused, both in terms of the functionality they are supposed to perform and how they are defined in your development environment. Look carefully at what you have. If the only problem is during processing, then it is probably down to calculated columns. However, how would you know if you can't process it? Think back to the last successful process: what has changed?
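
    One way to look at what you have is to query the memory DMVs from an MDX query window in SSMS, connected to the Tabular instance. Something along these lines (sort the output by the non-shrinkable column) shows how much memory each object in the database is holding, which should point you at the expensive columns:

        SELECT
            OBJECT_PARENT_PATH,
            OBJECT_ID,
            OBJECT_MEMORY_SHRINKABLE,
            OBJECT_MEMORY_NONSHRINKABLE
        FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE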



  • Hi

    Only the size of the data has changed since the last successful full process. I am sure that if I increased the RAM to 64GB it would go through, but what I really want is to understand how the RAM is consumed during processing. As I mentioned, the rule of thumb is that 3 times the cube size is required; in my case it is bombing out at 12 times the size.

    Thanks for all the effort. I think I will do the hard yards and test every single component, adding them one at a time and checking the memory usage after each one.


  • Happy to help. Although if you are hitting these problems now (with the increase in data), perhaps it's time to consider whether SSAS Tabular is still appropriate. Or can you reduce the data size in another way (do users have to see the full history, for example)? Could you break the tabular model up into smaller individual cubes?

    I would say that if the answer to the last question is yes, it may be time to think about a multidimensional cube instead. Just my opinion, but I'd much rather administer one large cube than 10 small ones 😀


