August 20, 2021 at 2:31 pm
Here's an example of what I mean...
A data set comprises Customers, Orders and order details. These are based in stores, which are analysed by Area, County and Country.
An order is placed in one area, so orders can be totalled up through Area, County and Country - a simple "count of orders" for the sake of argument.
However, a customer is at liberty to shop in any area, county or even country.
So a top-level subtotal should show one customer placing a total of 20 orders, but if those orders are spread across different areas or countries, summing the lowest-level customer counts (each correct in isolation) alongside the order totals would overstate the number of customers at the higher levels.
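Roughly, in toy form (a pandas sketch with made-up data, not the real set), the issue looks like this:

import pandas as pd

# Made-up order-level rows: one row per order, with the customer and the
# area/county/country the order was placed in.
orders = pd.DataFrame({
    "customer": ["C1", "C1", "C1", "C2", "C2"],
    "area":     ["A1", "A2", "A3", "A1", "A1"],
    "county":   ["K1", "K1", "K2", "K1", "K1"],
    "country":  ["UK", "UK", "UK", "UK", "UK"],
})

# Order counts roll up cleanly: per-area counts sum to the country total.
per_area_orders = orders.groupby("area").size()
print(per_area_orders.sum())           # 5 == total orders

# Distinct customers do NOT roll up: C1 shops in three areas, so summing
# the per-area distinct counts overstates the true figure.
per_area_customers = orders.groupby("area")["customer"].nunique()
print(per_area_customers.sum())        # 4, but only 2 distinct customers exist
print(orders["customer"].nunique())    # 2 - has to come from the raw grain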
I've got round this so far by having a separate set of data for each level, but this seems to be making things complicated for the eventual user.
Are there any alternatives to this approach?
August 20, 2021 at 7:58 pm
As a thought - instead of trying to build the data set up to contain every possible combination of summed-up data, what about doing it in the tool? Let SSAS combine the data and make it easy to consume, but use Power BI, Excel, SSRS, Tableau, etc. for the "analytics" work of producing the various counts. Excel's Power Pivot can handle that pretty nicely, for example, and in SSRS you could handle it with filters. I've not used Power BI or Tableau, but I expect both have some sort of functionality for aggregating the results.
Now, I may be misinterpreting what you are asking. I am just going by the section you posted in (SSAS), and from what I am reading it sounds like you are trying to perform all of the aggregation inside SSAS so end users can just pull the final numbers. If I am interpreting that wrong, my suggestion above is probably wrong too.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
August 20, 2021 at 9:00 pm
Hmm, interesting. I was using 'bog standard' Excel against the database; I'll have to try/investigate Power Pivot. I have users trying to reconcile figures using Management Studio, but its grouping facilities seem rudimentary.
August 20, 2021 at 10:12 pm
I think the first step is finding the right tool for the job. For doing analytics, I would recommend an analytics reporting tool (SAS Visual Analytics, PowerBI, Excel, Tableau, etc). I wouldn't use SSMS as an analytics tool myself except for some small one-off analytic things.
If you don't have any analytical tools except Excel, I would pull the data from the cube into a worksheet as the "raw" data. Once you have that, you can do all sorts of fun stuff with it: formulas (COUNT, COUNTIF, COUNTIFS), pivot tables, Power Pivot, etc. Something to note, though - if you start doing a LOT of calculations on the data, it can get slow, and COUNTIFS are notorious for slowing things down. But even without Power Pivot, a pivot table on your data will give you exactly what you are looking for.
Pull the data into a worksheet (I usually name it something like "RAW DATA" so it can be validated if end users think the pivot tables are wrong), then make a new worksheet called something like "Analytics" and build a pivot table on the raw data, adding things as you see fit. You can filter the data, group it, aggregate it, graph it... it's a powerful tool. The downside is that it all runs on your local machine; servers generally have beefier cores and more memory than a workstation, which is where tools like Power BI, Tableau, SAS VA, etc. come into play.
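If it helps to see the idea outside of Excel, here is a rough pandas sketch (made-up rows, not your actual cube output) of what that raw-data-plus-pivot approach is doing: the order-level rows stay at their original grain, and the distinct customer count is computed from that grain at whatever level you pivot on.

import pandas as pd

# Made-up "raw data" rows, standing in for whatever you pull off the cube.
raw = pd.DataFrame({
    "customer": ["C1", "C1", "C1", "C2", "C2"],
    "area":     ["A1", "A2", "A3", "A1", "A1"],
    "county":   ["K1", "K1", "K2", "K1", "K1"],
    "country":  ["UK", "UK", "UK", "UK", "UK"],
})

# Distinct customers and order counts per area, computed from the raw grain,
# so the higher-level figures stay honest (2 customers here, not 4).
summary = raw.pivot_table(
    index=["country", "county", "area"],
    values="customer",
    aggfunc=[pd.Series.nunique, "count"],
)
print(summary)
print(raw["customer"].nunique())    # top-level distinct customers

In Excel itself, if you load the raw data into the Data Model (Power Pivot), the pivot table's value field settings include a Distinct Count aggregation that behaves the same way.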
The other nice thing with Excel is that end users can refresh the data to show current figures, or they can prepare the data ahead of time (for example, before a meeting) and keep a snapshot. Some of the other tools are not very good at "snapshot" data.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.