Viewing 15 posts - 1 through 15 (of 52 total)
Sounds like you're building a pretty large cube and have a pretty powerful server to run it on.
A couple of things worry me.
1. Why do you have a...
May 31, 2013 at 10:01 am
Once the data is in SSAS, the most granular level you can get to is the dimension keys. As mentioned by Steve, to be able to analyze the two records...
May 30, 2013 at 3:08 pm
Any way to design the underlying tables so you don't need 3 distinct count measures?
1. MG_B and MG_C can be partitioned on column B and C even if the...
May 30, 2013 at 2:48 pm
Are you going to be capturing any further location detail in the Geography Dimension like street? Is that level of detail needed? If no, then I would not...
April 19, 2013 at 3:11 pm
It looks like the standard edition only supports up to 4 processors, so going to a 32-proc machine probably won't make a difference.
Were there upgrades in the...
April 19, 2013 at 2:41 pm
I think what L' Eomot Inversé said is right on. You don't really need a hierarchical data structure since there really is only one level of data. A simple...
April 18, 2013 at 9:08 am
Depending on the number of changes, how often change happens, and how wide your tables are, you'd have to implement different methods.
The only change tracking I've had to implement in...
April 18, 2013 at 8:50 am
If you need to access granular data that is most likely at the same level as your fact, there is no simple way to do it with SSAS.
I've actually had...
April 17, 2013 at 1:46 pm
I don't believe you can perform multiple DMLs from the MERGE.
You can however perform an insert from the output of a merge.
From the MSDN OUTPUT clause documentation, example K:
"INSERT INTO Production.ZeroInventory...
April 17, 2013 at 7:56 am
Q1. This should give you the mdf and ldf you need for the SQL database.
http://adventureworksdw2008.codeplex.com/
Q2. Yes, the datasource should be connecting to the SQL database that...
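If it helps, a minimal attach sketch once you have the files, assuming they were extracted to C:\Data (adjust the paths to wherever you saved the download):

CREATE DATABASE AdventureWorksDW2008
ON (FILENAME = 'C:\Data\AdventureWorksDW2008_Data.mdf'),  -- assumed extract location
   (FILENAME = 'C:\Data\AdventureWorksDW2008_Log.ldf')
FOR ATTACH;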
March 2, 2013 at 12:22 pm
Hey Anthony.
There are a couple of questions I need answered before I can give you a good answer.
If you are trying to deploy and test the Adventure Works analysis services database, you need...
March 1, 2013 at 7:03 pm
Check the ExecutionLog view on the ReportServer database. The view will give you information on duration spent on Data Retrieval, Processing, and Rendering.
Compare the logs from Prod vs Dev...
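Something along these lines should work as a starting point (run against the ReportServer database; the timing columns are in milliseconds):

SELECT TOP (50)
       ReportID,
       TimeStart,
       TimeDataRetrieval,   -- time spent getting data from the source
       TimeProcessing,      -- time spent processing the report
       TimeRendering        -- time spent rendering the output
FROM   dbo.ExecutionLog
ORDER BY TimeStart DESC;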
January 31, 2013 at 3:11 pm
The most straightforward way would be to create 12 schedules.
Schedule 1
Daily: Monday,Tuesday
Start 6:00 AM
Schedule 2
Daily: Monday,Tuesday
Start 7:00 AM
...
Or you could try to manipulate the schedule for the...
January 31, 2013 at 3:02 pm
I forgot to mention that each load is done to a new table. At the start of each load process, a new table is created.
The process happens ~1:00 am every...
December 12, 2012 at 1:23 pm
A similar topic was discussed in the following.
The solution I use is to run a query to check for data; if there is data, run dbo.AddEvent on the...
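Roughly like this, as a sketch only. dbo.StagingTable and the SubscriptionID value below are placeholders, not from the actual setup:

IF EXISTS (SELECT 1 FROM dbo.StagingTable)                -- placeholder table to check for data
BEGIN
    EXEC ReportServer.dbo.AddEvent
         @EventType = N'TimedSubscription',
         @EventData = N'00000000-0000-0000-0000-000000000000';  -- SubscriptionID from ReportServer.dbo.Subscriptions
END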
November 28, 2012 at 12:08 pm