James Serra's Blog

James is currently a Senior Business Intelligence Architect/Developer with over 20 years of IT experience. James started his career as a software developer, became a DBA 12 years ago, and for the last five years has been working extensively with Business Intelligence using the SQL Server BI stack (SSAS, SSRS, and SSIS). James has at times been a permanent employee, consultant, contractor, and owner of his own business. All of these experiences, along with continuous learning, have helped James develop many successful data warehouse and BI projects. James has earned the MCITP Business Intelligence Developer 2008, MCITP Database Administrator 2008, and MCITP Database Developer 2008 certifications, and has a Bachelor of Science degree in Computer Engineering. His blog is at www.jamesserra.com.

What happens when an SSAS Tabular model exceeds memory?

If you are using the Tabular model in SSAS, it uses the xVelocity in-memory engine to load your entire database into memory (greatly compressing it along the way).  So what happens if your database is too big to fit in memory?  You will get this error when you process the model:

“The following system error occurred: Insufficient quota to complete the requested service.

Memory error: Allocation failure. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.

The current operation was cancelled because another operation in the transaction failed.”

This happens because, by default, no paging to disk is allowed when the data is too big for the amount of memory available on the machine where the model resides.  To change this, go to the Analysis Server properties (by right-clicking the Tabular model server) and change the property VertiPaqPagingPolicy to 1 or 2:

  • 0 (the default): no paging is allowed.  All Tabular data is locked in memory, and if memory is insufficient, processing fails with an out-of-memory error.
  • 1: enables paging to disk using the operating system page file (pagefile.sys).  Only hash dictionaries are locked in memory, and Tabular data is allowed to exceed total physical memory.
  • 2: enables paging to disk using memory-mapped files.  Only hash dictionaries are locked in memory, and Tabular data is allowed to exceed total physical memory.
When VertiPaqPagingPolicy is set to 1 or 2, processing is less likely to fail due to memory constraints because the server will try to page to disk using the method you specified.  Note that the related property VertiPaqMemoryLimit specifies the level of memory consumption (as a percentage of total physical memory) at which paging starts.  The default is 60: as long as memory consumption stays below 60 percent, the server will not page to disk.
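For reference, these server properties are persisted in the instance's msmdsrv.ini configuration file, which you can inspect after changing them through the properties dialog.  The fragment below is a sketch of the relevant section; the exact surrounding elements can vary by SSAS version, so treat the structure as illustrative rather than definitive, and prefer changing the values through the Management Studio properties dialog rather than hand-editing the file:

```xml
<ConfigurationSettings>
  <Memory>
    <!-- 0 = no paging (default), 1 = page via pagefile.sys, 2 = memory-mapped files -->
    <VertiPaqPagingPolicy>1</VertiPaqPagingPolicy>
    <!-- percentage of total physical memory at which paging starts (default 60) -->
    <VertiPaqMemoryLimit>60</VertiPaqMemoryLimit>
  </Memory>
</ConfigurationSettings>
```

Note that the "Show Advanced (All) Properties" checkbox must be selected in the properties dialog for these VertiPaq settings to be visible.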

Another solution is to use DirectQuery mode, which bypasses the in-memory model entirely: client applications query the data directly at the source.  The size of the database then no longer matters, since the model data is not held in memory.  Of course, the trade-off is much slower queries.
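Before resorting to either workaround, it can help to see how much memory the model is actually consuming.  One way is to query the Analysis Services dynamic management views from an MDX query window in Management Studio; the sketch below uses the DISCOVER_OBJECT_MEMORY_USAGE DMV, assuming the standard column names (results and supported syntax can vary slightly by SSAS version):

```sql
-- Sketch: approximate non-shrinkable memory held per object on the instance,
-- largest consumers first
SELECT
    OBJECT_PARENT_PATH,
    OBJECT_ID,
    OBJECT_MEMORY_NONSHRINKABLE
FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE
ORDER BY OBJECT_MEMORY_NONSHRINKABLE DESC
```

Summing the memory columns for a database's objects gives a rough picture of how close the model is to the VertiPaqMemoryLimit threshold.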

More info:

Memory Settings in Tabular Instances of Analysis Services

Memory Properties

Undo Bad Tabular Mode Analysis Server Properties

Woops I ran out of memory while processing my tabular model

Comments

Leave a comment on the original post [www.jamesserra.com, opens in a new window]
