Processing the cube for Incremental Update

  • I'm sorry if this sounds silly, as I am new to Analysis Services.

    I am confused about the data in the Fact table. I read in Books Online that we should specify a filter when processing the cube for an Incremental Update; otherwise the cube will contain duplicate data.

    What if we schedule the Incremental Update process and use DTS to do it? Should I go and change the filter in DTS each time? Should the Fact table contain only the new data?

    I am really confused.

  • Hello.

    The incremental update needs a filter to work properly; otherwise you will end up with duplicated data in your cube. Analysis Services processes the cube by reading the fact table, sending a SQL statement against the data source. If you process the cube incrementally, it is recommended to define a filter, or the process will read the same data again and again every time it runs.

    I usually update my cubes incrementally using a date key. In the DTS task I use a filter like this (not exactly):

    date_key = getdate()-1

    Using this filter we populate our cubes incrementally with information from the previous day.
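    If the filter is built dynamically in a DTS script step rather than hard-coded, the "previous day" logic might be sketched like this (a Python stand-in for illustration only; the `date_key` column name and the string date format are assumptions, and one caveat is that an exact-equality comparison against `getdate()-1` only works cleanly if the key stores whole dates, not datetimes):

    ```python
    from datetime import date, timedelta

    def previous_day_filter(today=None, column="date_key"):
        """Build a WHERE-clause fragment restricting the fact table to
        yesterday's rows (hypothetical helper; names are assumptions)."""
        today = today or date.today()
        yesterday = today - timedelta(days=1)
        # Comparing against a whole date avoids the time-of-day component,
        # which would make an exact equality test match almost nothing.
        return f"{column} = '{yesterday.isoformat()}'"

    print(previous_day_filter(date(2024, 5, 2)))
    ```

    The generated fragment would then be spliced into the partition's filter before the scheduled process runs, so nothing has to be edited by hand each day.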

    You may find more information in BOL, just search for "Partition Filters and Incremental Update Filters".

    Hope this helps.

    Fernando Ponte
    factdata.com.br

  • You may also want to look at Refreshing the data vs. Incremental Update.

    I don't know your specific situation, but I've found that refreshing the data is sometimes a lot less cumbersome (data-integrity-wise) than an Incremental Update. I just offer it as an option.

    From BOL:

    The Refresh data option causes a cube's data to be cleared and reloaded and its aggregations recalculated. This option is appropriate when the underlying data in the data warehouse has changed but the cube's structure remains the same.

    The Refresh data option can be performed while users continue to query the cube; after the refresh has completed, users have access to the updated data without having to disconnect and reconnect.

  • Thanks list for all your help and info.
