March 12, 2002 at 9:13 am
When building a cube and setting up aggregations, there is a point in the process where you design the aggregations based on either a storage size limit (in megabytes), a target performance gain, or until you choose to stop. The wizard shows a graph of performance gain vs. size. That size figure can get rather large, depending on the number and size of your dimensions. For example, I had a model in which I wanted a 70% increase in performance, and the size figure was 3 gigabytes. What does this number mean? After processing, my cube was only 280 megabytes!
March 13, 2002 at 7:18 am
Hi Joe -
The size is Analysis Services' estimated cube size. Although the results in your case are extreme, it's not all that unusual for the estimate to be substantially too high.
When Analysis Services generates this number, it considers dimension depth (number of levels, number of members at each level) but has no way of knowing to what degree facts will align along common intersections.
So when the cube is actually processed, the number of actual intersections (those with data) is much smaller than the number of potential intersections. The result is cube storage that is much smaller than the estimate.
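A rough back-of-the-envelope sketch of that sparsity effect (the member counts and fact-row count below are made up for illustration, not taken from Joe's cube):

```python
# Why a size estimate based on potential intersections can dwarf the processed size.
# All numbers here are hypothetical, purely to illustrate the arithmetic.

# Leaf-level members per dimension (hypothetical)
dimension_members = {
    "Customer": 50_000,
    "Product": 2_000,
    "Time (days)": 1_095,   # roughly three years of days
}

# Potential intersections: every combination of leaf members
potential_cells = 1
for members in dimension_members.values():
    potential_cells *= members

# Actual intersections: only the combinations that appear in the fact table
actual_fact_rows = 5_000_000  # hypothetical

density = actual_fact_rows / potential_cells
print(f"Potential leaf cells : {potential_cells:,}")
print(f"Actual fact rows     : {actual_fact_rows:,}")
print(f"Density              : {density:.4%}")
# With density well under 1%, the storage written at processing time is a small
# fraction of what an estimate driven by potential cells would suggest.
```

With those made-up numbers the potential cell count is about 109.5 billion, while only 5 million intersections actually hold data, which is the same kind of gap Joe is seeing between the 3 GB estimate and the 280 MB processed cube.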
Scot J Reagin
Consulting Manager, Rocky Mountain Region
Aspirity, LLC
425.519.3777 x312 main
720.244.9328 mobile
866.751.6333 fax
sreagin@hitachiconsulting.com