SQL SSAS 2016 Server out of memory error

  • Some background:

    1) There are two different servers. The first runs only Analysis Services: 64 GB RAM, Windows Server 2016 Standard, 3 processors, virtual machine.
    2) The second server runs SQL Server 2016 Enterprise Edition SP1, 64-bit, on a VMware virtual machine. The job executes from this server and failed with the error below.
    3) The error logged in Event Viewer is "Analysis Services Cube processing fails with error 'OLE DB error: OLE DB or ODBC error: Operation canceled; HY008.'" Based on that error, the suggested fix was to increase ExternalCommandTimeout. I increased it to 7200 (2 hours), but the job still failed after 50 minutes with the same out-of-memory error (see the sketch after the list below).

    Task done on the server:
    4) Charge table: 173 million rows (I already cut out one year of data, down from 200 million rows). Processing still failed.
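
    For reference, a rough sketch of where the ExternalCommandTimeout property lives if edited directly in msmdsrv.ini (the usual route is SSMS > right-click the SSAS instance > Properties > Advanced; surrounding settings are trimmed, and the value shown is the 7200 from the attempt described above):

        <ConfigurationSettings>
          ...
          <!-- Seconds SSAS waits on commands sent to the relational source during processing.
               The default is 3600; raised to 7200 (2 hours) here, yet the job still failed
               with the same out-of-memory error. -->
          <ExternalCommandTimeout>7200</ExternalCommandTimeout>
          ...
        </ConfigurationSettings>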

    Memory configuration was set on the SSAS server. The job failure details follow:

    Date        6/9/2018 12:37:16 PM
    Log        Job History (- ETL Weekly 1200 Cube)

    Step ID        3
    Server        Server name 
    Job Name        - ETL Weekly 1200 Cube
    Step Name        Charge
    Duration        00:50:12
    Sql Severity    0
    Sql Message ID    0
    Operator Emailed    
    Operator Net sent    
    Operator Paged    
    Retries Attempted    0

    Message
    Executed as user: dOMAIN\lOGIN. Microsoft.AnalysisServices.Xmla.XmlaException: The JSON DDL request failed with the following error: Failed to execute XMLA. Error returned: 'The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.'.. at Microsoft.AnalysisServices.Xmla.XmlaClient.CheckForSoapFault(XmlReader reader, XmlaResult xmlaResult, Boolean throwIfError) at Microsoft.AnalysisServices.Xmla.XmlaClient.CheckForError(XmlReader reader, XmlaResult xmlaResult, Boolean throwIfError) at Microsoft.AnalysisServices.Xmla.XmlaClient.SendMessage(Boolean endReceivalIfException, Boolean readSession, Boolean readNamespaceCompatibility) at Microsoft.AnalysisServices.Xmla.XmlaClient.SendMessageAndReturnResult(String& result, Boolean skipResult) at Microsoft.AnalysisServices.Xmla.XmlaClient.ExecuteStatement(String statement, String properties, String& result, Boolean skipResult, Boolean propertiesXmlIsComplete) at Microsoft.AnalysisServices.Xmla.XmlaClient.Execute(String command, String properties, String& result, Boolean skipResult, Boolean propertiesXmlIsComplete) at Microsoft.SqlServer.Management.Smo.Olap.SoapClient.ExecuteStatement(String stmt, StatementType stmtType, Boolean withResults, String properties, String parameters, Boolean restrictionListElement, String discoverType, String catalog) at Microsoft.SqlServer.Management.Smo.Olap.SoapClient.SendCommand(String command, Boolean withResults, String properties) at OlapEvent(SCH_STEP* pStep, SUBSYSTEM* pSubSystem, SUBSYSTEMPARAMS* pSubSystemParams, Boolean fQueryFlag). The step failed.
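
    For context, the "JSON DDL request" in the message is the TMSL script the SQL Agent step sends to the 1200-compatibility-level model. A minimal sketch of such a step, assuming a full refresh of the Charge table (the database name is a placeholder; the actual script used by this job is not shown in the post):

        {
          "refresh": {
            "type": "full",
            "objects": [
              {
                "database": "WeeklyCubeDB",
                "table": "Charge"
              }
            ]
          }
        }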

  • You've described an out-of-memory problem and mention 2 servers, but I cannot understand what you are asking for help with. Can you clarify by asking one or more questions?

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Hi,

    The job fails with an out-of-memory error even though the server has enough RAM.
    The job runs from the second server, which refers to (processes the cube on) the first server.

  • Are you doing Tabular or Multidimensional processing? If Tabular, Standard Edition is limited to 16 GB, which may not be enough for your volumes. And assume you need roughly 2-3 times the processed size of the cube as available memory to process it in the first place.

    Also, have you looked at performance metrics to see which server reaches its limits first? Although it appears to be the SSAS one, that is not always the case.
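
    For example, the limits that control when SSAS starts cancelling operations can be checked in msmdsrv.ini or under the instance's Properties in SSMS. A sketch of the relevant section with the documented out-of-the-box defaults (shown as an assumption, not your actual values; settings of 100 or less are percentages of total physical RAM, larger values are treated as bytes):

        <Memory>
          <!-- Below this level SSAS does not trim memory (default 65 = 65% of RAM). -->
          <LowMemoryLimit>65</LowMemoryLimit>
          <!-- Above this level the cleaner evicts aggressively and operations can be cancelled (default 80). -->
          <TotalMemoryLimit>80</TotalMemoryLimit>
          <!-- 0 = midway between TotalMemoryLimit and total physical memory. -->
          <HardMemoryLimit>0</HardMemoryLimit>
          <!-- Tabular only: limit for the in-memory VertiPaq engine (default 60). -->
          <VertiPaqMemoryLimit>60</VertiPaqMemoryLimit>
          <!-- Tabular only: 0 = keep all data in memory, 1 = allow paging to disk. -->
          <VertiPaqPagingPolicy>1</VertiPaqPagingPolicy>
        </Memory>

    Watching the Memory Usage KB and Memory Limit High KB counters under the MSAS Memory performance object on the SSAS box, alongside available memory on the SQL Server box, is the quickest way to see which side runs out first.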
