Integration Services - Issues when using the Analysis Services Processing task. It doesn't seem to be processing the dimensions

  • I have an Integration Services package, and in it I have Analysis Services Processing Tasks.

    I have a separate task for each dimension, rather than updating all the dimensions in one task.

    Then there are two processing tasks for the cube updates.

    The dimensions run successfully, but the two cubes fail with "attribute key cannot be found" errors. It is as if the cubes have been processed without the dimensions being processed first, which shouldn't be possible because I have run the dimensions.

    I then run it in Analysis Services, dimensions first, then the cubes, and it is successful.

    I then try the IS package again and it's also successful, presumably because everything has already been processed in Analysis Services.

    So clearly, even though Integration Services says that the dimensions have run successfully, that isn't true.

    Can anyone give me any tips on this? Unless I manually process the cubes and dimensions I can't be sure it's working, so I can't do overnight processing, which is… well, rubbish.

    By the way, everything is set to Processing Option = Process Update. I took this to mean it processes only the updates, to save time. I am really at a loss at the moment.
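
    For reference, the Analysis Services Processing Task works by sending an XMLA Process command to the server. A minimal sketch of what a Process Update on a single dimension looks like (the dimension ID "Dim School" is a hypothetical placeholder, and object IDs may differ from display names; the database name is taken from the error log later in this thread):

        <!-- Sketch only: a ProcessUpdate command for one dimension.
             "Dim School" is a made-up dimension ID. -->
        <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Object>
            <DatabaseID>CAYA_DataWarehouse_Dev</DatabaseID>
            <DimensionID>Dim School</DimensionID>
          </Object>
          <Type>ProcessUpdate</Type>
        </Process>

    Process Update re-reads the dimension table and applies inserts, updates, and deletes in place, which is why the dependent cubes stay processed.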

    Debbie

  • Change the dimensions to Process Full; that way you'll be sure they have the latest data.
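
    In XMLA terms that is just a change to the Type element. A minimal sketch, again with a hypothetical dimension ID:

        <!-- Sketch only: the same command with Type switched to
             ProcessFull, which drops and rebuilds the object.
             "Dim School" is a made-up dimension ID. -->
        <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Object>
            <DatabaseID>CAYA_DataWarehouse_Dev</DatabaseID>
            <DimensionID>Dim School</DimensionID>
          </Object>
          <Type>ProcessFull</Type>
        </Process>

    One thing to be aware of: a Process Full on a dimension leaves the cubes that use it unprocessed, so the cubes must be processed afterwards.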

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • That's what I was thinking.

    The only thing that worries me is the extra time it will take to process everything.

    Annoying that Process Update is so flaky.

    Thanks for the advice 🙂

    Debbie

  • Debbie Edwards (6/12/2012)


    That's what I was thinking.

    The only thing that worries me is the extra time it will take to process everything.

    Annoying that Process Update is so flaky.

    Thanks for the advice 🙂

    Debbie

    I recently had issues myself with Process Update, and I changed everything to Process Full. I'm not really sure why Process Update behaves so erratically.

    If your dimensions aren't too large, it shouldn't really be a problem.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • At least it's not just me, then.

    I will get that updated and pop a note in my guidance about Process Update in Integration Services.

    Thank you

    Debbie

  • This just gets worse....

    I changed everything to Process Full and ran it, and one dimension and both cubes failed again. (Like an idiot, I didn't check the reasons for the failure.)

    I processed it in Analysis Services and it ran successfully.

    I am really beginning to dislike Integration Services. It's very hit and miss. :blink:

  • That's odd.

    When something happens like that, I'd check if everything is configured correctly in SSIS.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Might have found something.

    I normally run Process Update in Analysis Services, but I changed the problem dimension to Process Full and it failed, and it looks like there are some real issues. E.g.:

    Attribute relationship [School Name] -> [School DFEE] is not valid because it results in a many-to-many relationship.

    So at least I have something to actually sort out now.

    Does this mean that it's best to always use Process Full because Process Update is hit and miss?

  • Normally Process Update shouldn't be hit & miss.

    I stumbled across this blog post on MSDN:

    Different Kinds of SSAS Processing in simple words…

    This particular line is interesting:

    ProcessUpdate is inherently slower than ProcessFull since it is doing additional work to apply the changes.

    This means there is no real reason to do Process Update instead of Process Full, so I would just stick with the latter.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • And here I was thinking that just processing the updates would speed things up.

    One other question.

    I have broken my Analysis Services update into separate tasks, so each dimension sits in its own task, then one cube, and then the last cube.

    Is this the best way to do it?

    Or should you have everything in one Analysis Services Processing Task? When I had it that way I kept running out of space.

    Is there a recommended practice for this?

    Debbie

  • I usually do all the dimensions in one task and the cube/partitions in another.

    It's OK to process dimensions in parallel. If that gives you issues, you can try grouping the dimensions into smaller batches.
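
    As a sketch, the XMLA batch for that layout would look roughly like the following: one Parallel group for all the dimensions, then a second group for the cube. Parallel groups inside a single Batch run one after another, so the dimensions are guaranteed to finish before the cube starts. The dimension IDs here are hypothetical placeholders; the cube ID comes from the error log later in the thread.

        <!-- Sketch only: dimensions processed in parallel, then the cube.
             Parallel groups inside one Batch execute serially. -->
        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Parallel>
            <Process>
              <Object>
                <DatabaseID>CAYA_DataWarehouse_Dev</DatabaseID>
                <DimensionID>Dim School</DimensionID> <!-- hypothetical -->
              </Object>
              <Type>ProcessFull</Type>
            </Process>
            <Process>
              <Object>
                <DatabaseID>CAYA_DataWarehouse_Dev</DatabaseID>
                <DimensionID>Dim Pupil</DimensionID> <!-- hypothetical -->
              </Object>
              <Type>ProcessFull</Type>
            </Process>
          </Parallel>
          <Parallel>
            <Process>
              <Object>
                <DatabaseID>CAYA_DataWarehouse_Dev</DatabaseID>
                <CubeID>SEN</CubeID>
              </Object>
              <Type>ProcessFull</Type>
            </Process>
          </Parallel>
        </Batch>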

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Brilliant,

    Thanks for that. Changing everything to Process Full threw up a couple of duplication issues in the data that weren't showing up at all with Process Update.

    Fingers crossed, things should be a little better now.

    Thanks for all your help.

    Debbie

  • No problem, glad to help.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Well, as expected, it failed again last night.

    Source: Analysis Services Processing Task SEN Cube Analysis Services Execute DDL Task Description: Internal error: The operation terminated unsuccessfully

    Analysis Services Processing Task SEN Cube Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: An error occurred while processing the 'SEN Fact' partition of the 'COP Stage Pupil Business Measures' measure group for the 'SEN' cube from the CAYA_DataWarehouse_Dev database. End Error

    Error: 2012-06-14 21:48:17.20 Code: 0xC11F0006 Source: Analysis Services Processing Task SEN Cube Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation. End Error

    Error: 2012-06-14 21:48:17.20 Code: 0xC11C0002 Source: Analysis Services Processing Task SEN Cube Analysis Services Execute DDL Task Description: Server: The operation has been cancelled. End Error

    Error: 2012-06-14 21:48:44.49 Code: 0xC1000007 Source: Analysis Services Processing Task SEN Workflow Steps Analysis Services Execute DDL Task Description: Internal error: The operation terminated unsuccessfully. End Error

    Error: 2012-06-14 21:48:44.49 Code: 0xC11F0006 Source: Analysis Services Processing Task SEN Workflow Steps Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation. End Error

    Error: 2012-06-14 21:48:44.49 Code: 0xC11F000E Source: Analysis Services Processing Task SEN Workflow Steps Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: An error occurred while processing the 'SEN Workflow Steps' partition of the 'SEN Workflow Steps' measure group for the 'SEN Workflow Steps' cube from the CAYA_DataWarehouse_Dev database. End Error

    Error: 2012-06-14 21:48:44.49 Code: 0xC11F0006 Source: Analysis Services Processing Task SEN Workflow Steps Analysis Services Execute DDL Task Description: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation. End Error

    Error: 2012-06-14 21:48:44.49 Code: 0xC11C0002 Source: Analysis Services Processing Task SEN Workflow Steps Analysis Services Execute DDL Task Description: Server: The operation has been cancelled. End Error

    DTExec: The package execution returned DTSER_FAILURE (1). Started: 9:00:01 PM Finished: 9:48:44 PM Elapsed: 2923.79 seconds. The package execution failed. The step failed.
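
    One line in that log is worth noting: "the number of errors encountered during processing reached the defined limit of allowable errors". The error limit and the key-error behaviour can be set explicitly on the processing command, which surfaces the first failing key instead of letting errors accumulate. A hedged sketch of an XMLA ErrorConfiguration (the values shown are illustrative, not a recommendation):

        <!-- Sketch only: stop and report on the first key error
             instead of silently discarding rows. -->
        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <ErrorConfiguration>
            <KeyErrorLimit>0</KeyErrorLimit>
            <KeyErrorAction>DiscardRecord</KeyErrorAction>
            <KeyNotFound>ReportAndStop</KeyNotFound>
          </ErrorConfiguration>
          <Parallel>
            <Process>
              <Object>
                <DatabaseID>CAYA_DataWarehouse_Dev</DatabaseID>
                <CubeID>SEN</CubeID>
              </Object>
              <Type>ProcessFull</Type>
            </Process>
          </Parallel>
        </Batch>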

    So if I now go into Analysis Services, deploy, and run everything myself, it will be fine.

    I think this is the issue.

    In Analysis Services, when you try to process something you get a message:

    The server content appears to be out of date. Would you like to build and deploy the project first?

    You click Yes and it goes from there to completion. This is the bit missing from the Integration Services package.

    There must be an easy way of adding the deploy step to the Integration Services task, so that the cube is deployed first and then the dimensions and cubes are processed?

  • I'm answering your question further over on the MSDN forum 🙂

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP
