May 11, 2012 at 7:27 am
james.ciesla (5/11/2012)
It looks like you are attempting to return a large number of records. I would recommend you first create a new MDX query that returns a small amount of data across a single dimension, just to make sure you have set up the permissions properly. Once that is working you can go back to your original statement, but include a condition that limits it to a small subset of the data.
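As an illustration of that first step, a minimal smoke-test query might look something like this (the cube, dimension, and measure names here are placeholders for illustration, not from the original posts):

```mdx
-- Minimal test: one measure across a single small dimension,
-- just to verify connectivity and permissions.
-- [Sales], [Date] and [Measures].[Row Count] are illustrative names.
SELECT
    { [Measures].[Row Count] } ON COLUMNS,
    { [Date].[Calendar Year].MEMBERS } ON ROWS
FROM [Sales]
```

If this small query succeeds, permissions are fine and the original query can be reintroduced gradually.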
OK I will try that way.
As for your second question, if I understand correctly, I think you will need to create separate partitions in your cube to address those issues. If you create separate partitions, you can limit the data size without affecting the other cubes.
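To sketch the partitioning idea: each SSAS partition is typically bound to a query that slices the fact table, for example by year, so a large fact table can be processed and managed in pieces. The table and column names below are made up for illustration:

```sql
-- Source query binding for a hypothetical partition "FactSales 2011".
SELECT * FROM dbo.FactSales WHERE OrderYear = 2011;

-- Source query binding for "FactSales 2012". Together, the partition
-- slices should cover the fact table exactly once (no gaps, no overlaps).
SELECT * FROM dbo.FactSales WHERE OrderYear = 2012;
```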
Will check that as well. New to this topic. I get lost even for small issues.
Thanks for your help.
Smith.
May 11, 2012 at 8:43 am
Dear,
The simple answer to this is to deploy the cube and then import it into Excel.
Regards,
Rafayz
May 11, 2012 at 12:58 pm
rumsmaster - How exactly do you 'import a cube into Excel'?
May 12, 2012 at 1:17 am
rumsmaster (5/11/2012)
Dear, The simple answer to this is to deploy the cube and then import it into Excel.
Rafayz
Thanks. However, the requirement is to automate the export of a large volume of data from the cube into a CSV file.
Excel is not in the picture at all. Can you import 7 million records into Excel from a cube?
May 14, 2012 at 12:22 am
Hello James,
Once you are done deploying your project in SSAS, open Excel, select Data on the main ribbon, then Get External Data, then From Other Sources, and finally click From Analysis Services. A wizard will then run that helps you select specific cubes (or all of them) from your already deployed Analysis Services solution. You can then use a PivotTable or PivotChart to play around with your data in Excel.
Regards,
Rafayz, IM Practice,
May 14, 2012 at 12:28 am
Hello Joy,
I haven't ever tried 7 million records, but it should work, although it will take some time.
Regards,
Rafayz
May 14, 2012 at 1:10 am
I haven't tried 7 million records ever but it should work although it will take some time.
..But the maximum number of rows an Excel 2010 worksheet can hold is about 1 million (1,048,576).... right?
However, that's not the requirement... It should be a CSV file, and the export should be automated...
It's a somewhat complex MDX query with quite a few filters in it.
Thanks,
Smith.
May 14, 2012 at 1:24 am
Dear Smith,
I think you are right on the money. Yeah, one million rows is the limit. I will look into the MDX query today and will let you know something.
Best,
Rafayz
May 14, 2012 at 2:52 am
Thank you so much.
May 14, 2012 at 6:33 am
Dear Joy,
Not much help but I can come up with a couple of ideas.
1)- The first is that you can export data from SSAS into separate Excel files, one per dimension (as each worksheet allows a maximum of roughly 1 million rows). This is a very dirty method, but it will work.
Check out the below link for more reference.
http://technet.microsoft.com/en-us/library/gg399165.aspx
2)- The other method is to feed an Access database directly from the SSAS cube. Converting an Access database to CSV is a pain, and I cannot say with certainty that this method works.
Do let me know if you find any appropriate solution to this.
Regards,
Rafayz
May 14, 2012 at 6:34 am
rumsmaster (5/14/2012)
Hello James,Once you are done deploying your project in SSAS, open Excel, select Data from the main ribbon, then Get external data, then from other sources and finally click analysis server project. After clicking it, a wizard will run which will help you in selecting the specific/all cubes from your already deployed analysis server solution. You can then use pivot table or pivot chart to play around with your data in MS Excel.
Regards,
Rafayz, IM Practice,
What you are describing is using an SSAS data source for a PivotTable, which is not importing a cube (I just wanted to clarify that so no one reads this and thinks they can bring an entire OLAP cube into Excel). There are very few circumstances where bringing data into Excel this way will work for large recordsets, because of how Excel interacts with OLAP cubes. If you have ever tried to bring back a large data set, you will have seen that it is typically very slow and cannot easily be automated to perform the type of extract requested here. Even if you used PowerPivot, you would need to refresh the data source before each extract.
I would advise against using Excel for ongoing data extracts from an OLAP cube in a production environment.
James
May 14, 2012 at 6:38 am
James, I would agree with you that it would certainly degrade performance..... Is there any other way around this problem, in your opinion?
Regards,
May 14, 2012 at 6:48 am
Joy Smith San (5/10/2012)
Exporting from a cube does not 'give' you a cube, only data at a summary level.
We want data only, in CSV format. We don't want cube.
We hide the "measure values", and display only the other "details".
Thanks.
This might be a key to a better way to solve this.
To me, without measures, a cube is just the possible relationships.
You may already have the bulk of this in a view in your data model.
Why anyone would want a cube with no measures is beyond me.
May 14, 2012 at 6:54 am
Joy Smith San (5/11/2012)
There are many dimensions in one of the data source.
And for a particular cube only the necessary dimensions have been included.
But there will be too many records in that cube when we process.
we do not want all the records to be available in cube.
For example, that cube should have data for specific applications only.
I don't understand where I can filter for those applications.
I CANNOT filter the records in the DSV, as it's used in other cubes as well.
Also, I don't want to filter on the report side, as there are some 100 application names.
A view used to build this perspective might be an option.
The filter might be an attribute of the application that signifies whether to include it, rather than a list of all the individual names.
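A rough sketch of that suggestion: bind the restricted cube's measure group to a named view instead of filtering the shared DSV. This assumes a hypothetical IncludeInCube flag on the application dimension; none of these object names come from the thread.

```sql
-- A view the restricted cube could be bound to, leaving the shared
-- DSV untouched. dbo.FactEvents, dbo.DimApplication and the
-- IncludeInCube column are illustrative names only.
CREATE VIEW dbo.vFactEvents_IncludedApps AS
SELECT f.*
FROM dbo.FactEvents AS f
JOIN dbo.DimApplication AS a
  ON a.ApplicationKey = f.ApplicationKey
WHERE a.IncludeInCube = 1;  -- one attribute flag instead of ~100 hard-coded names
```

Toggling the flag in the dimension table then controls which applications reach the cube, without touching the other cubes that share the DSV.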
May 14, 2012 at 6:56 am
I would continue trying to get the SSIS data flow task to work. I was able to extract over 2 million rows from an OLAP cube into a CSV file in under 20 seconds.
Did you have any success getting your package to run?
You will most likely get a data type warning in the data flow task because of the MDX query, but it will still function properly.
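For reference, the MDX placed in the OLE DB source of such a data flow task might have this general shape (all object names below are placeholders, not from the thread). The warning James mentions typically arises because the Analysis Services OLE DB provider returns cell values as wide text, so a Data Conversion transform is often added before the flat file destination:

```mdx
-- Illustrative extract query for the SSIS OLE DB source;
-- [Sales], [Application] and [Measures].[Event Count] are made-up names.
SELECT
    { [Measures].[Event Count] } ON COLUMNS,
    NON EMPTY
        [Application].[Application Name].MEMBERS
      * [Date].[Date].MEMBERS ON ROWS
FROM [Sales]
```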
James
Viewing 15 posts - 16 through 30 (of 37 total)