June 20, 2011 at 2:50 pm
I have a dimension called Clients
I have a dimensional property called [Clients].[User Count]
I also have a measure [Measures].[Errors Count]
I would like to devise a formula that does a roll-up of the [User Count] (using SUM) and divides the SUM of Errors Count by the sum of User Count.
If users were a measure, this wouldn't be tricky, but the best I have is the following calculation:
CREATE MEMBER CURRENTCUBE.[Measures].[Calculated Member]
AS [Measures].[Errors Count]/[Clients].[User Count].children.membervalue,
FORMAT_STRING = "#.00",
NON_EMPTY_BEHAVIOR = { [Errors Count] },
VISIBLE = 1 , ASSOCIATED_MEASURE_GROUP = 'Measures Errors' ;
This works great at the lowest level of data, but when I try to change it around like this:
CREATE MEMBER CURRENTCUBE.[Measures].[Calculated Member]
AS [Measures].[Errors Count]/SUM([Clients].[User Count].children.membervalue),
FORMAT_STRING = "#.00",
NON_EMPTY_BEHAVIOR = { [Errors Count] },
VISIBLE = 1 , ASSOCIATED_MEASURE_GROUP = 'Measures Errors' ;
My results change to #ERROR
I do wish that SSAS would tell me why I have an error; I keep feeling like it's hiding too much from me.
Any help on making the user count roll up to the various levels would be greatly appreciated.
June 20, 2011 at 3:15 pm
Hey Mark,
Have you tried creating the calc measure first? And then using it (with its default SUM rollup) in the subsequent calc member?
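Something along these lines is what I had in mind - completely untested, and the [Clients].[Client] hierarchy name and the member names are just guesses based on your post:
// calc measure that reads the User Count member property (hierarchy/property names assumed)
CREATE MEMBER CURRENTCUBE.[Measures].[User Count Calc]
AS [Clients].[Client].CurrentMember.Properties("User Count", TYPED),
VISIBLE = 1 ;
// then the ratio just uses the two measures
CREATE MEMBER CURRENTCUBE.[Measures].[Errors Per User]
AS [Measures].[Errors Count] / [Measures].[User Count Calc],
FORMAT_STRING = "#.00",
VISIBLE = 1 ;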
Steve.
June 21, 2011 at 7:01 am
stevefromOZ (6/20/2011)
Hey Mark, have you tried creating the calc measure first? And then using it (with its default SUM rollup) in the subsequent calc member?
Steve,
I'm not sure I follow. The User count is a dimension attribute. Are you saying I should create a new measure group off of the dimension table and use the attribute as the measure value?
Does this complicate things when I say that the client dimension is an SCD Type 2 dimension and the user count is one of the most commonly changing attributes?
June 21, 2011 at 8:49 am
Hey Mark,
No, I was more hoping you could create a new calc measure derived from the attribute values (which you can), but it appears to do the same thing (i.e. not aggregate). The only alternatives, it seems, are to try to drag the value from the dim to the fact (using a view or in the DSV), or perhaps add an IIF statement to the calc measure that checks the level of the current dim member and, if it's not leaf level, gets the leaves to aggregate up to the current level.
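For the IIF idea, a rough sketch would be something like this (again untested, and assuming a [Clients].[Client] attribute hierarchy with 'User Count' as a member property on it):
CREATE MEMBER CURRENTCUBE.[Measures].[User Count Rollup]
AS IIF(
     IsLeaf([Clients].[Client].CurrentMember),
     // leaf level: just read the member property
     [Clients].[Client].CurrentMember.Properties("User Count", TYPED),
     // above leaf level: sum the property across the leaf descendants
     Sum(
       Descendants([Clients].[Client].CurrentMember, , LEAVES),
       [Clients].[Client].CurrentMember.Properties("User Count", TYPED))),
VISIBLE = 1 ;
Your error ratio would then just divide [Measures].[Errors Count] by that.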
Steve.
June 21, 2011 at 9:05 am
stevefromOZ (6/21/2011)
Hey Mark, no, I was more hoping you could create a new calc measure derived from the attribute values (which you can), but it appears to do the same thing (i.e. not aggregate). The only alternatives, it seems, are to try to drag the value from the dim to the fact (using a view or in the DSV), or perhaps add an IIF statement to the calc measure that checks the level of the current dim member and, if it's not leaf level, gets the leaves to aggregate up to the current level.
Steve,
I can honestly say I don't understand how to do the part you listed last. It's something I've tried to figure out ever since I started trying to work with SSAS 2005 (now 2008 R2). I keep thinking that I could do something like this easily enough in SQL via a stored procedure or the like, and then I get stumped over and over by MDX.
I've made some progress by creating a new measure group on my clients. It does math, and actually seems to roll up and do division correctly... still testing it. I need to eventually accommodate unknown clients (we get error logs back that are for client IDs that we don't have a history for, so they get put into unknown), but it actually seems to work.
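With [User Count] coming through as a real measure from the new measure group, the calc becomes just the straightforward division - something like this (member names here are just illustrative):
CREATE MEMBER CURRENTCUBE.[Measures].[Errors Per User]
AS [Measures].[Errors Count] / [Measures].[User Count],
FORMAT_STRING = "#.00",
NON_EMPTY_BEHAVIOR = { [Measures].[Errors Count] },
VISIBLE = 1 ;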
June 21, 2011 at 11:23 am
mtassin (6/21/2011)
I've made some progress by creating a new measure group on my clients. It does math, and actually seems to roll up and do division correctly... still testing it. I need to eventually accommodate unknown clients (we get error logs back that are for client IDs that we don't have a history for, so they get put into unknown), but it actually seems to work.
Ah well, it doesn't really work... turning the dimension into a measure group does do division, but it doesn't have time dimension awareness, and if I were to try to adapt it to work properly, I'd likely have to configure my ETL process to create a new measure table with each day's user count. That would then start totalling the user count across dates (even though it shouldn't), which leads to semi-additive measures (total users across clients, but not across dates), which I also don't quite get.
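From what I've read so far, the MDX way to fake that "sum across clients, but last value across dates" behaviour looks something like the following - totally untested on my cube, and [Date].[Calendar] / [User Count Snapshot] are just placeholder names:
// take the last date (under the current Date member) that has snapshot data,
// then sum the snapshot on just that date - clients still sum normally
CREATE MEMBER CURRENTCUBE.[Measures].[User Count]
AS Sum(
     Tail(
       NonEmpty(
         Descendants([Date].[Calendar].CurrentMember, , LEAVES),
         [Measures].[User Count Snapshot]),
       1),
     [Measures].[User Count Snapshot]),
VISIBLE = 1 ;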
Is there a good book on this? So far I have
Wrox's SQL Server Analysis Services 2005 with MDX (which is close enough for 2008 R2)
and
Microsoft Press' SQL Server 2005 Analysis Services Step By Step
But even the "Step by Step" book seems to assume that I understand MDX and/or how cubes work deep down.
June 21, 2011 at 12:13 pm
On the MDX problem, I was able to get the following to work in a sandbox, so your mileage may vary...
So, I added a new field to my DSV in the fact and added this to the measure group. The field value was literally 'null' (without the quotes). Its name in my test was 'testy'.
Below is my MDX script from the cube itself. I have a dim called Maindim where the key is Otherid. I added a dimension attribute called attr2 that contains integer values. So, this script basically creates an interim measure (called interim) and sets its value to be the property value. Then, for the 'testy' measure at the leaf level, we're setting the value explicitly to be interim * Amount (where Amount is some random bigger integer measure, a 'true' measure from the fact table). This seemed to work OK in my tests. Like I said, YMMV.
CALCULATE;
// interim calc measure that just exposes the attr2 member property as a value
CREATE MEMBER CURRENTCUBE.[Measures].[interim]
AS [Maindim].[Otherid].CurrentMember.Properties("attr2"),
VISIBLE = 1 ;
// at the leaf level, explicitly set the placeholder 'testy' measure to interim * Amount
({[Measures].[testy]}, Leaves()) = [Measures].[interim] * [Measures].[Amount];
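If you want to sanity-check it, a query along these lines should show the three measures side by side (the cube name here is just made up for my sandbox):
SELECT { [Measures].[interim], [Measures].[Amount], [Measures].[testy] } ON COLUMNS,
       [Maindim].[Otherid].Members ON ROWS
FROM [SandboxCube];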
Re: the books, I have always treated MDX Solutions (I think the 2nd edition, with Chris Webb as a co-author) as my MDX bible.
I guess it would help to know - do you *want* to know how cubes work 'deep down'? If so, then the various whitepapers would help (like the recently released operations guide and the perf guide [the 2005 one is still relevant enough for 2008, but I think they released a 2008 version too]). Also, they just wrapped it up, but the Maestros course (currently run only in Seattle and also now in the UK/EU) would be great for that. It's super deep (level 500+) but would be worth it.
Is one of your books by Teo Lachev? I remember an early edition of his 2005 book being good too (assuming it's gotten better in later editions).
Steve.
June 21, 2011 at 12:41 pm
I'll have to look for that book now.
The two I have (titles corrected from above):
Professional SQL Server Analysis Services 2005 with MDX is by Sivakumar Harinath & Stephen R. Quinn
Microsoft SQL Server 2005 Analysis Services Step by Step by Reed Jacobson, Stacia Misner, Hitachi Consulting
I'm trying to work out what you did into an MDX query first so I can better understand it... I didn't even know there was a Leaves(); I've been trying to work with just CurrentMember, Children, etc. What does Leaves() do exactly? There's not a lot of explanation in either BOL or my two books. There's mention in BOL of it being used with SCOPE, but I haven't figured that out either.
The understanding I'd like to get is what you'd typically get from a 200-level Database Theory course. I was lucky enough to get the joy of teaching said class several years ago, and was handed a textbook suitable for teaching it. The end result for me was that I got a pretty good grasp on DB theory, but I've never found a book or a class like that for MDX/SSAS/OLAP, and I keep trying to think of OLAP as some kind of super-powered relational system.
But it's not... and that's what's killing me. 🙂