March 5, 2007 at 5:39 am
Hi.
I have a very old AS 2000 database with a number of cubes. Some of these cubes hold up to five years of data, and all of it is required. The cubes are refreshed with new data nightly, so each day the users have to refresh their reports, which can take over 45 minutes. Is there anything I can do to speed this up? Note: I am very new to Analysis Services.
Thanks all
March 5, 2007 at 8:30 am
I'd propose updating statistics after the loads.
March 5, 2007 at 8:33 am
If you have the correct edition of AS, ensure that the data is partitioned, then you could load the partitions in parallel.
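To illustrate the parallel-loading idea, here is a rough sketch. AS 2000 itself is driven through DSO/COM; `process_partition` below is a hypothetical stand-in for that per-partition processing call, not a real API, and the sketch just shows the fan-out pattern.

```python
# Sketch only: process_partition is a placeholder for the real
# per-partition processing call (e.g. via DSO in AS 2000).
from concurrent.futures import ThreadPoolExecutor


def process_partition(name: str) -> str:
    # In a real setup this would kick off processing of one cube partition.
    return f"processed {name}"


def process_all(partitions, workers=4):
    # Fan the partitions out across a small pool so several load at once;
    # map() still returns results in the order the partitions were given.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_partition, partitions))
```

In practice you would cap `workers` at what the server's CPU and I/O can sustain, since each partition process competes for the same relational source.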
Depending on your data sizes, 5 years in 45 minutes could be considered fast - what row counts are you talking about here? Also, have you optimized your setup as per the AS2000 performance & administration guides?
Steve.
March 5, 2007 at 9:08 am
I've done no optimization as of yet. Where do you find the AS 2000 guides? They are not in my version of BOL. I was looking on the web, and I see I can create indexes on all of the columns in my warehouse. Is this on the actual cubes, or just on the tables in the data source that feed the cubes? If I could create indexes it might speed the cubes up, but they can't be put in read-only mode as they are updated each night.
March 5, 2007 at 9:23 am
The guides aren't in BOL, but they are viewable on the Microsoft site under Performance and Operations.
I'm not sure why you'd want (or need) to create indexes on the columns in your DW tables (other than on foreign keys), as in most cases cube dimension and fact builds involve full scans of those tables (it's a different story if users are querying the DW tables directly).
Look up Partitions in BOL and the perf guide. Partitioning your data allows for faster query response times (less data for the query engine to process) and more flexibility in processing (e.g. load just a single partition if you want). Each year could go to a single partition; you could then load the most recent year/partition first and work backwards.
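A small sketch of the year-per-partition idea: generate one filter clause per calendar year, newest first, so the data users query most often is loaded soonest. The table and column names (`FactSales`, `OrderYear`) are made up for illustration.

```python
# Sketch: one partition filter per year, newest first.
# "FactSales"."OrderYear" is a hypothetical fact-table column.
def yearly_partitions(first_year: int, last_year: int):
    return [
        (year, f'"FactSales"."OrderYear" = {year}')
        for year in range(last_year, first_year - 1, -1)
    ]
```

In AS 2000 each filter would become a partition's source filter, and a nightly job would typically need to fully process only the current year's partition.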
Again, what is driving the requirement? Did the refresh used to take 30 minutes and now it's jumped to 45? What are the data volumes? If you're dealing with several hundred million rows, then 45 minutes could be quite respectable.
Steve.
March 5, 2007 at 9:27 am
Is it the *build* that's taking 45min+, or the *query* times?
If it's the query time(s), then see if you can log the MDX (in most cases with AS2K this needs to be done on the client - add Log File="somepath\somefile.log" to the connection string).
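For reference, a client connection string with the logging property might look something like the fragment below. The server, catalog, and log path are placeholders; only the Log File property itself is the point.

```text
Provider=MSOLAP;Data Source=MyOlapServer;Initial Catalog=MyOlapDb;Log File=C:\logs\mdx.log
```

Once set on the client connection, the PivotTable Service client writes the MDX statements it issues to that file, which you can then replay to reproduce the slow queries.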
Worst case, talk to your users and determine what it is they're doing that causes the response to take so long, and try to replicate it. Also review the 'reports' they're trying to create - remember that just because they want to filter to State = 'WA' doesn't mean this needs to be on the rows or columns - minimize the row/column intersections that i) are calculated and ii) need to be returned. If they're trying to bring back 10,000 cells, ask why, as most people can't interpret that volume of data easily.
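To make the State = 'WA' point concrete, a filter like that belongs in the WHERE slicer rather than on an axis, so the query only computes the intersections it actually returns. The cube, measure, and dimension names below are invented for illustration.

```mdx
-- Hypothetical names: [Sales] cube, [Sales Amount] measure, etc.
-- The state filter sits in the WHERE slicer, not on ROWS/COLUMNS.
SELECT
  { [Measures].[Sales Amount] } ON COLUMNS,
  NON EMPTY [Product].[Category].Members ON ROWS
FROM [Sales]
WHERE ( [Store].[State].[WA] )
```

Putting [Store].[State].[WA] on an axis instead would add a whole extra dimension of intersections for the engine to resolve, even though only one member is wanted.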
Steve.