January 7, 2008 at 5:44 pm
Server Config:
-Xeon 3.60GHz
-8 GB RAM
-Windows Server 2003 R2 Enterprise SP2
-SQL Server 2005 Standard Edition SP2 Version 9.00.3050.00
-3GB BOOT.ini switch enabled
Issue:
-The processing of our Sales cube in our Production database (SSAS) fails with a memory error.
Normal process:
- Successfully process all dimensions in Production DB (SSAS)
- Restart SSAS service to clear memory
- Process Sales Cube in Production DB (SSAS)
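For reference, the cube-processing step above can be expressed as a plain XMLA command. This is a minimal sketch; the DatabaseID and CubeID values (ProductionDB, Sales) are placeholders for your actual object IDs, which you can read from the object properties in Management Studio:

```xml
<!-- Full process of the Sales cube; object IDs below are assumed placeholders -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>ProductionDB</DatabaseID>
      <CubeID>Sales</CubeID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```

Running the step as an explicit XMLA script also makes it easy to confirm exactly which objects are touched, which is relevant to the question below about whether the Development DB is being processed as a side effect.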
Events related to the beginning of this failure:
For 2 weeks, this failed every time our ETL ran. Then, 2 weeks ago, we deleted our Development DB (SSAS), which had been restored from a backup of the Production DB.
Once that database was deleted, processing of the Sales Cube in the Production DB succeeded every time for 2 weeks.
Last week, we restored the Development DB (SSAS) from a backup of the Production DB again. Now, every attempt to process the Sales Cube in the Production DB fails.
As you can see above, we restart the service before processing the Sales Cube in the Production DB. Also, this is the only object we try to process. We never even reference or affect the Development DB.
Somehow, the presence of this 2nd SSAS database causes our processing to fail even though we never do anything with it. Its mere presence on the SSAS instance is causing the processing of the Sales Cube in the Production DB to fail.
My questions are:
- Since it was restored from a backup of the Production DB, could the Development DB actually be getting processed as well when we specify to process only the Production DB? I.e., do they share an object link, so that when we process one, both are processed?
- Does any memory allocation happen just because an SSAS database exists? For example, if no one is querying it and it is not being processed, is it still eating up lots of memory?
If you have any other thoughts which have a solution besides "put the 2nd SSAS database on a separate machine", please let me know. Thanks so much in advance. -John
January 17, 2008 at 2:40 pm
Hi,
there are several possibilities which could help:
x Maybe you could "tune" the processing of the cube by using processing options other than a full process. You could use incremental processing.
x You could use ascmd.exe with XMLA scripts instead of SSIS, which carries memory overhead of its own. I could also imagine bugs when using SSIS for processing.
x You could divide the processing of the cube into separate processing of each measure group (even of each partition, if you are using Enterprise Edition). This would not consume as many resources on the server.
x I had similar problems with endless transactions when doing a full process on large measure groups on a machine like yours. I split the processing into a ProcessData step and a separate ProcessIndexes step running afterwards. The first loads the data and the second rebuilds the indexes and aggregations.
x You could reduce the number of aggregations in the cube, if building them is what causes the processing to fail.
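The ProcessData/ProcessIndexes split mentioned above could look roughly like the following two XMLA batches, run one after the other. This is only a sketch: the object IDs (ProductionDB, Sales, FactSales) are assumed placeholders, and the exact ascmd.exe switches should be checked against your build of the sample tool.

```xml
<!-- Step 1: load data only; no indexes or aggregations are built yet -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>ProductionDB</DatabaseID>
      <CubeID>Sales</CubeID>
      <MeasureGroupID>FactSales</MeasureGroupID>
    </Object>
    <Type>ProcessData</Type>
  </Process>
</Batch>

<!-- Step 2 (run afterwards): build indexes and aggregations for the loaded data -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>ProductionDB</DatabaseID>
      <CubeID>Sales</CubeID>
      <MeasureGroupID>FactSales</MeasureGroupID>
    </Object>
    <Type>ProcessIndexes</Type>
  </Process>
</Batch>
```

Each script can then be submitted from the command line with something like `ascmd.exe -S <server> -i process_data.xmla` (file names here are hypothetical), keeping SSIS out of the memory picture entirely.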
Maybe this helps.
SK
January 18, 2008 at 2:29 am
I think SSAS needs enough memory to hold all the dimensions of all the cubes. Therefore, the more cubes you have, the more memory you need.
Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017 2016, 2014, 2012, 2008 R2, 2008 and 2005.
When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara