Long Running Process Cube Task

  • I just posted this in the DTS area, but thought I would post it here as well since it overlaps DTS and AS. I have a DTS package that sets an incremental filter for a cube processing task after importing and scrubbing data. The entire package usually takes about 90 minutes to execute. For the past few weeks, on the rare occasions it completes without error, it has been taking about 12 hours. The errors are as follows:

    Step Error Source: Microsoft Data Transformation Services (DTS) Package

    Step Error Description: Invalid clipboard format

    (Switch Management (80004005): Partition 'Switch Management' must be processed before merging)

    Step Error code: 8004006A

    Step Error Help File: sqldts80.hlp

    Step Error Help Context ID: 700

    - and -

    Step Error Source: DSO

    Step Error Description: Cannot open connection to Analysis server 'MORPHEUS'.

    Timeout occurred.

    Step Error code: 80040050

    Step Error Help File:

    Step Error Help Context ID: 1000440

    - and -

    Step Error Source: DSO

    Step Error Description: Cannot lock object 'Terminating Card'.

    Error: Connection to the server is lost

    Verify that the Analysis server service (MSSQLServerOLAPService) is running.

    Please restart the application.

    Step Error code: 80040075

    Step Error Help File:

    Step Error Help Context ID: 1000440

    These all seem to be related to the database being tied up with processing. The locking error suggests that the process cube task had already processed another dimension and was holding a lock on the 'Terminating Card' dimension.

    Now, this package completed in a timely manner without error for over a year. However, around the time these problems developed, the network admin installed a series of updates on the Win2k Server box that AS and SQL Server run on. Has anyone else experienced similar problems after installing updates to Win2k Server, or should I be looking in other directions for the sudden cause of these errors?

    Thank you,

    Michael

    Michael Weiss

  • As a follow-up, I noticed that after stopping and restarting the AS service, total memory usage on the server drops by approximately 2 GB. Also, two instances of the msmdvldm.exe process start, each using approximately 350 MB of RAM. I assume this is 'leftover' processing of the large dimensions in the cube [from the long-running package that stopped when the AS service was shut down]. Lastly, many of the Knowledge Base articles I came across that address problems similar to (but not exactly) the errors I am encountering suggest reinstalling Analysis Services or the latest service pack (I am running SP3 for AS). Anyway, I thought this might shed some light on the issues.

    Thank you,

    Michael Weiss

  • Hi Michael,

    We've experienced similar 'issues' with build times/speed. There are a few registry memory settings you can modify; I'll have to talk with our guys and re-read the docs to send through what they are. Have you read the registry settings doc yet? There's also another document on the Microsoft site that details memory usage for builds as well as querying; sorry, I can't find the link, but I'll send it through when I get it from one of our guys. Just FYI, the following is a great jump page within the Microsoft site for most things BI/AS: http://www.microsoft.com/sql/evaluation/bi/default.asp

    Going from 90 minutes to 12 hours definitely sounds suss. Have you looked at memory usage (through Windows) on the machine? Are all the disks OK? Is the network running well? Is the EDW on a different server from the AS service? Who manages the EDW? Has there been any removal of indexes that could now be killing your query performance?

    Steve.

  • A couple of additional items to look at re: memory et al. (from the *new* Distinct Count doc on the Microsoft web site):

    Other Analysis Services settings: in the Analysis Server Properties dialog box, there are various settings that can be modified:

    Process buffer size and read-ahead buffer size: increase these default sizes depending upon the size of your cubes and dimensions. In our case, we got the best performance by increasing the process buffer size to 400 MB and keeping the read-ahead buffer size at 10 MB. The more important setting to focus on is the process buffer size; the rule of thumb is to increase it until no temporary folder is used during processing.

    Keep the data folder and temporary folder on different drives if possible.

    It is also good practice to log the processing of cubes and queries to log files.

    Correct dimension and partition counts for aggregation levels: one very important observation was that it is important to have correct counts for the various dimension levels and fact tables before specifying aggregation levels. The correct counts (the number of members in the relational database tables) should be verified by looking at the Dimension Editor, and so on. There is a tool called Partition Manager (included with the Accelerator for BI) that can help in managing these counts; aggregation levels will not be right if these counts are incorrect. A sketch of how to check these counts follows.
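    As a rough illustration, those member counts can be checked directly against the relational source before setting aggregation levels. This is only a sketch; the star-schema names below (dim_customer, city, region, fact_calls) are hypothetical stand-ins for your own tables:

        -- Compare these counts to the member counts shown in Dimension Editor
        -- (names here are hypothetical; substitute your own dimension tables).
        SELECT COUNT(DISTINCT city) AS city_members FROM dim_customer
        SELECT COUNT(DISTINCT region) AS region_members FROM dim_customer

        -- Fact table row count, useful when sizing partitions and aggregations.
        SELECT COUNT(*) AS fact_rows FROM fact_calls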

    Steve.

  • Thank you, Steve. I will have a look at the documents. I really think this has something to do with all of the Win2k updates that were installed on the box a few weeks ago. There have been no changes to the DW, and I am the one who oversees its administration. I have not dropped and re-created the indexes on the fact table in a while, as I have SQL Server set to auto-update statistics. Would I be better off dropping and re-creating the indexes on a weekly basis or so, even though auto-update of statistics is in effect?
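    For reference, here is a minimal T-SQL sketch of the maintenance in question, using the standard SQL Server 2000 commands; fact_calls is a hypothetical fact table name:

        -- Rebuild all indexes on the (hypothetical) fact table with a 90% fill factor.
        DBCC DBREINDEX ('fact_calls', '', 90)

        -- Refresh its statistics with a full scan instead of the sampled auto-update.
        UPDATE STATISTICS fact_calls WITH FULLSCAN

        -- Or refresh sampled statistics for every table in the database.
        EXEC sp_updatestats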

    Also, I am going to try reinstalling SP3 for AS (and probably for SQL Server as well) and see if that helps.

    I will look for the Partition Manager tool and try to incorporate that as well. I did some reading yesterday on the process buffer and read-ahead buffer settings and adjusted those; they are both set at 8 MB now, but I think I will follow your advice and increase them until no temp file is created during processing. As things stand now, the data folder and temp folder are on separate disks.

    Did you see my posting on the DTS list about the cube processing task filter not being set correctly? Somehow, I think all of these things are related...
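    For context, the incremental filter on a cube processing task is just a WHERE-clause fragment applied against the fact table, and the DTS package rewrites it before each run. Below is a hypothetical sketch of a query to sanity-check what a given filter would pick up; fact_calls, load_date, and the date literal are stand-in names, not the actual filter from my package:

        -- Preview how many fact rows an incremental filter such as
        -- load_date > '2003-06-01' would process (names are hypothetical).
        SELECT COUNT(*) AS rows_to_process
        FROM fact_calls
        WHERE load_date > '2003-06-01'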

    I will keep you posted on my progress and any speed bumps I run into.

    Thank you,

    Michael

    Michael Weiss
