Urgent help needed on MDX or SSAS cube design

  • Hi, I recently found that our fact tables allow duplicate records, meaning the combination of all dimension keys does not uniquely determine a row in the fact table. Consider the following data:

    Fact Test

    DateKey  TestKey  CustomerKey  Amount
    5        1        1            16
    5        1        2            10
    5        1        2            4

    When I tried to get the average Amount by DateKey and TestKey, I used the following MDX query:

    WITH MEMBER Measures.AvgAmountCal AS
        Avg(
            {[Dim Test].[Test Key].CURRENTMEMBER
             * [Dim Customer].[Customer Key].CHILDREN
             * [Dim Date].[Date Key].CURRENTMEMBER},
            Measures.[Amount]
        )
    SELECT
        {Measures.[Amount], [Measures].[Fact Test Count], Measures.AvgAmountCal} ON 0,
        NONEMPTYCROSSJOIN([Dim Date].[Date Key].&[5], [Dim Test].[Test Key].CHILDREN) ON 1
    FROM [Test DB]

    However, the result is:

    Date Key  Test Key  Amount  Fact Test Count  AvgAmountCal
    5         1         30      3                15

    It looks as though SSAS aggregates the duplicate-key rows into a single cell and treats them as one for the Avg. Of course, the average itself can be fixed by using Amount / [Fact Test Count] (a sketch follows below), but how can I calculate standard deviation or a top percentile? Those give me the wrong values. I have tried several approaches, including changing the cube's aggregation method from Sum to None, but that does not solve the problem either. This is very urgent; please share your inputs. Thanks in advance!
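
    Here is a minimal sketch of that division workaround, reusing only the names already shown above (AvgAmountFixed is just an illustrative name):

    // Corrected average: total Amount divided by the physical row count
    WITH MEMBER Measures.AvgAmountFixed AS
        Measures.[Amount] / Measures.[Fact Test Count]
    SELECT
        {Measures.[Amount], [Measures].[Fact Test Count], Measures.AvgAmountFixed} ON 0,
        NONEMPTYCROSSJOIN([Dim Date].[Date Key].&[5], [Dim Test].[Test Key].CHILDREN) ON 1
    FROM [Test DB]

    For the data above this returns 30 / 3 = 10, the true row-level average, because [Fact Test Count] counts all three source rows even though the two duplicates collapse into one cell.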

  • You could google to be sure, but I'm reasonably sure the engine does exactly that: it aggregates records that would otherwise be duplicates. Without another key value to differentiate them, it makes good sense to aggregate them to save space, etc.

    Steve.

  • Thank you, Steve, for your quick input. So there's no way to work around it, right? I think this situation is quite common, since most fact table designs allow a value (like -1) to represent unknown keys. If such rows are aggregated into one (when the other keys are the same), rollups will produce incorrect calculations without being detected, as in this case. Am I right?

    Thanks again for your quick reply, Steve.

  • That sounds right, yes. If there isn't anything to uniquely identify the rows (a different invoice #, transaction ID, time of day, etc.), then for all intents and purposes they're the "same thing" and *should* be aggregated.

    It sounds like there *should* be something that identifies these as separate events, but your DW is not capturing it.

    Steve.

  • Once the data is in SSAS, the most granular level you can reach is the dimension keys. As Steve mentioned, to analyze the two records separately you'll need another dimension that distinguishes them. SSAS is meant to be a tool for aggregate data analysis, so when it stores data it loses all record-level detail.

    Creating a SalesID/TransactionID that uniquely identifies each record and building a dimension on it will let you perform the calculations you want (see the sketch at the end of this post), but you'll pay a large storage and performance cost, since you'll no longer be storing only aggregate data in SSAS.

    If you need to perform standard deviation and percentile calculations across individual records, you'll probably just have to do it in SQL.

    I've had implementations where the user wanted to slice and dice in SSAS from Excel, then drill into details that were queried from SQL. I used ASSP to tie an action to a stored procedure that returned a data set based on parameters passed from the Excel slice.
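
    To make that concrete, here is a rough sketch. It assumes a hypothetical degenerate dimension [Dim Transaction].[Transaction Id], keyed on such a TransactionID so each fact row gets its own leaf member; the measure and cube names come from the original post. Note that MDX has no built-in Percentile function, so Median stands in for the percentile here:

    // Assumption: [Dim Transaction].[Transaction Id] is a hypothetical degenerate
    // dimension on the new TransactionID, one leaf member per fact row.
    WITH MEMBER Measures.AmountStDev AS
        StDev(
            NONEMPTY([Dim Transaction].[Transaction Id].[Transaction Id].MEMBERS,
                     Measures.[Amount]),
            Measures.[Amount]
        )
    MEMBER Measures.AmountMedian AS
        Median(
            NONEMPTY([Dim Transaction].[Transaction Id].[Transaction Id].MEMBERS,
                     Measures.[Amount]),
            Measures.[Amount]
        )
    SELECT
        {Measures.AmountStDev, Measures.AmountMedian} ON 0,
        NONEMPTYCROSSJOIN([Dim Date].[Date Key].&[5], [Dim Test].[Test Key].CHILDREN) ON 1
    FROM [Test DB]

    Because each duplicate row now sits under its own Transaction Id leaf, StDev and Median see true row-level values. The set is as large as the fact table at each slice, though, which is exactly the storage and performance cost mentioned above.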

  • Short answer: your fact table violates proper granularity. By definition, a single fact must be unique at the granularity of its dimensions. If it isn't, you are either missing a dimension or have not decomposed a dimension far enough. A cube aggregates measures across dimensions; there is no such thing as a row in a cube, so writing MDX based on SQL concepts will get you in trouble every time.

    There are no facts, only interpretations.
    Friedrich Nietzsche
