Error while processing CUBE - SSAS 2005

  • The situation is like this.

    We have a SQL Server 2005 Agent job that processes the cube by calling an XMLA script.

    It's a cube that pulls data from a SQL Server 2005 warehouse server.

    The error thrown is:

    Please see the attachment Error.JPG.

    We tried changing the following configurations, but it didn't work:

    sp_configure 'show advanced options', 1
    GO
    RECONFIGURE WITH OVERRIDE
    GO

    sp_configure 'max degree of parallelism', 4
    GO
    RECONFIGURE WITH OVERRIDE
    GO

    -- BEGIN
    sp_configure 'remote query timeout', 0
    GO
    RECONFIGURE WITH OVERRIDE
    GO
    -- END

    Can anyone please give your suggestions on this?

    Regards | Enbee

  • Does the agent have the correct login credentials to access the source database?

  • Yes.

    We also tried the following configuration setting, but it still throws the same error:

    -- SCRIPT START
    sp_configure 'remote login timeout', 0
    GO
    RECONFIGURE WITH OVERRIDE
    GO
    -- SCRIPT END

    Regards | Enbee

  • I would definitely verify that the account running the job has the necessary permissions on the data source and to process the cube. Set up Profiler and run a trace against the database where the data resides to see which credentials are being used to connect (or whether a connection is even being made). Also verify the owner of the job and make sure it is set up correctly. Running a Profiler trace from SQL Server Management Studio should definitely narrow down the connectivity issues.
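    If setting up a full Profiler trace is inconvenient, a quicker sketch is to query the DMVs on the relational source while processing is running. The `program_name` filter below is an assumption about how SSAS identifies itself and may need adjusting on your server:

    ```sql
    -- Run on the SQL Server 2005 source while the cube is processing.
    -- Shows which login SSAS is using to connect to the relational database.
    SELECT session_id, login_name, host_name, program_name
    FROM sys.dm_exec_sessions
    WHERE program_name LIKE '%Analysis Services%';
    ```

    If no rows come back during processing, the connection from SSAS may not be reaching the source at all, which points at credentials or connectivity rather than the query itself.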

    ----------------------------------------------------------------------------------------
    Dan English - http://denglishbi.wordpress.com

  • It seems to be an access problem with the OLAP database.

    Can you process the cube from Management Studio?

    Does the agent have the correct security to access the OLAP database?

    Have you redesigned the partitions and not deployed them?

    Does the agent have file security permissions on the OLAP database folder?

  • Your processing query for your facts is timing out. From the partition name vFact, I assume it is probably built from a view, but there are other factors that can affect the query SSAS runs to process facts. Do you have any materialized reference dimensions? If you do, they add a join to the fact processing query that can slow things down.

    Try processing the partition through the GUI. While the processing is running, right-click in the dialog and copy the messages to the clipboard. The XML will include the query it is running - this is the query you need to optimize. It is probably taking too long to return any records.

    In addition to this, the server properties on your Analysis Services Server (right-click on the SSAS server in management studio and bring up the properties) includes an "ExternalCommandTimeout" property - you can increase this to give SSAS more time before it times out the query.
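    As a sketch, the same property can also be set by sending an XMLA Alter to the server rather than through the GUI. The server ID/name below is a placeholder for your instance, and the value is in seconds (7200 = two hours):

    ```xml
    <Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
           xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object />
      <ObjectDefinition>
        <Server>
          <ID>YourSSASServer</ID>
          <Name>YourSSASServer</Name>
          <ServerProperties>
            <ServerProperty>
              <Name>ExternalCommandTimeout</Name>
              <Value>7200</Value>
            </ServerProperty>
          </ServerProperties>
        </Server>
      </ObjectDefinition>
    </Alter>
    ```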

  • Thank you, every one for the responses.

    I will try these options, and get back.

    Regards | Enbee

  • Hi everyone,

    I tried the above options, but the problem still exists.

    When I run process cube from Management Studio, it finishes in about 20 minutes.

    When the same thing is executed from a job through the command line using an XMLA command, it runs for about 2 hours and then throws the query timeout error.

    ///
    OLE DB error: OLE DB or ODBC error: Query timeout expired; HYT00." Source="Microsoft SQL Server 2005 Analysis Services" HelpFile="" /><Error ErrorCode="3240034318" Description="Errors in the OLAP storage engine: An error occurred while processing the 'vFact' partition of the 'vFact' measure group for the 'vAMGeneric' cube from the AC_OLAP database."
    ///

    The Profiler run shows that it is retrieving fact data from the data mart through views.

    Is there any particular configuration or setting that would solve this?

    Or, at least, can anybody tell what the root cause could be?

    Thanks a lot.

    Regards | Enbee

  • You shouldn't be seeing a different performance profile running it via the designer vs. running it in batch, so there's something fundamentally different between the two scenarios:

    - How many threads are you running in each case?

    - How many connections are you running against SQL Server? Try cutting back to one connection per CPU (the default is 10 connections). This gives you better control.

    - Are you using the same credentials to connect in both cases? Is there anything in the views that's credential-sensitive?

    - Are you running everything on one machine or across machines? When you run it in batch, do you see the same performance profile (i.e. via Perf Mon) as running it in the designer?

    - How are your partitions designed? (Consider clustering your DW by partition criteria. Dramatic impact on cube build times.)

  • Queries from SSAS usually time out after 1 hour (the default value). Since your queries are timing out beyond that, maybe the timeout has been set manually. In Management Studio, connect to the SSAS server, right-click it, open its properties, check the "show advanced properties" box, and set the appropriate timeout values. Note: although this is an "easy" way out, it is not advisable to tamper with these values.

    Probably you have issues with partitions. Look for those used in distinct count measures and process them individually (by this I mean creating an app or DTSX package that goes into each measure group and processes each partition sequentially, rather than in parallel). One common cause of timeouts is concurrency between queries. Hope this helps in some way.

  • No need to go the DTSX route (nor is it advisable).

    Simply change the threads on your process job to 1, and they'll process one at a time. (Then you can observe the profile of the individual partitions as they process.)

    Remember, you can process an individual partition easily, if you're testing various scenarios.
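    As a sketch of both points, an XMLA batch can force sequential processing and target a single partition. The IDs below are guesses based on the object names in the error message and may differ from the actual object IDs in your database:

    ```xml
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <!-- MaxParallel="1" forces objects to process one at a time -->
      <Parallel MaxParallel="1">
        <Process>
          <Object>
            <DatabaseID>AC_OLAP</DatabaseID>
            <CubeID>vAMGeneric</CubeID>
            <MeasureGroupID>vFact</MeasureGroupID>
            <PartitionID>vFact</PartitionID>
          </Object>
          <Type>ProcessFull</Type>
        </Process>
      </Parallel>
    </Batch>
    ```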

    I believe the base problem he's researching is why it takes 20 minutes running interactively vs. 2 hours (and then a timeout) in batch.

    Basically, interactive is the same as batch, with the exception of where the feedback goes. (In both cases, the processing is performed by the MSAS "engine".) Therefore, there's a difference in the two setups. Most likely, it's somehow related to the authentication and/or the views. That assumes you've already accounted for other external factors (i.e. other jobs running on the server).

    IMPORTANT: Running any data warehouse scenario where it takes 1 hour for the database to start returning rows is extremely problematic. Look at the query and optimize it, even if it means replicating data into the data table. Data warehouses are different from OLTP systems. Different rules apply.
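    One hedged way to act on that advice, assuming the slow query is the fact view itself, is to materialize the view on the SQL Server 2005 side so the join work is done once, ahead of processing. All object and column names below are hypothetical placeholders:

    ```sql
    -- Hypothetical sketch: turn the fact view into an indexed view so the
    -- rows SSAS reads at processing time are precomputed on disk.
    -- SQL Server 2005 requires SCHEMABINDING, two-part names, and a
    -- unique clustered index to materialize a view.
    CREATE VIEW dbo.vFact_Materialized
    WITH SCHEMABINDING
    AS
    SELECT f.FactKey, f.DateKey, d.RegionKey, f.Amount
    FROM dbo.FactTable AS f
    JOIN dbo.DimGeography AS d ON d.GeoKey = f.GeoKey;
    GO

    CREATE UNIQUE CLUSTERED INDEX IX_vFact_Materialized
    ON dbo.vFact_Materialized (FactKey);
    ```

    Pointing the partition at the materialized view trades extra storage and index maintenance for not re-running the view's joins on every processing run.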

  • We had a similar issue processing an individual dimension.

    Setting the data source to only allow one connection at a time allowed us to see the actual problem in more detail.

    (Just nulls in the dimension table, which we should have spotted.)

    Once we'd fixed the underlying problem we reset it to the default 10.
