
What happens when an SSAS Tabular model exceeds memory?


If you are using the Tabular model in SSAS, it uses the xVelocity technology to load your entire database into memory (compressing the database heavily along the way).  So what happens if your database is too big to fit in memory?  You will get this error when you process the model:

“The following system error occurred: Insufficient quota to complete the requested service.

Memory error: Allocation failure. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.

The current operation was cancelled because another operation in the transaction failed.”

This happens because, by default, no paging to disk is allowed when the data is too big for the amount of available memory on the machine where the model resides.  To change this, go to the Analysis Services server properties (by right-clicking the Tabular mode server in SSMS) and change the property VertiPaqPagingPolicy to 1 or 2.
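If you prefer to edit the configuration file rather than use the SSMS properties dialog, the same setting lives in the instance's msmdsrv.ini.  The fragment below is a sketch of the relevant Memory section (back up the file first, and restart the service for the change to take effect):

```xml
<ConfigurationSettings>
  <Memory>
    <!-- 0 = no paging (default); 1 or 2 = allow paging to disk -->
    <VertiPaqPagingPolicy>1</VertiPaqPagingPolicy>
  </Memory>
</ConfigurationSettings>
```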

When VertiPaqPagingPolicy is set to 1 or 2, processing is less likely to fail due to memory constraints because the server will try to page to disk using the method that you specified.  Note that the related property VertiPaqMemoryLimit specifies the level of memory consumption (as a percentage of total memory) at which paging starts.  The default is 60, meaning the server will not page to disk until memory consumption exceeds 60 percent.
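To get a feel for what that 60 percent default means in practice, here is a quick back-of-the-envelope calculation (plain Python, not any SSAS API; the function name is my own):

```python
def paging_threshold_gb(total_ram_gb: float, vertipaq_memory_limit: float = 60.0) -> float:
    """Approximate memory consumption (in GB) at which an SSAS Tabular
    instance starts paging to disk, given VertiPaqMemoryLimit expressed
    as a percentage of total physical memory (default 60)."""
    return total_ram_gb * vertipaq_memory_limit / 100.0

# On a 32 GB server with the default limit of 60, paging begins
# once the instance consumes roughly 19.2 GB.
print(paging_threshold_gb(32))        # -> 19.2
print(paging_threshold_gb(32, 80.0))  # -> 25.6
```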

Another solution is to use DirectQuery mode, which bypasses the in-memory model entirely: client applications query the data directly at the source.  Then the size of the database no longer matters, since the model itself is not holding the data in memory.  Of course, the trade-off is typically much slower queries.
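For a compatibility level 1200 or later model, the switch to DirectQuery is the model-level defaultMode property in the TMSL definition.  A minimal sketch of the relevant fragment (earlier compatibility levels expose this as the DirectQueryMode database property instead):

```json
{
  "model": {
    "defaultMode": "directQuery"
  }
}
```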

More info:

Memory Settings in Tabular Instances of Analysis Services

Memory Properties

Undo Bad Tabular Mode Analysis Server Properties

Woops I ran out of memory while processing my tabular model

Copyright © 2002-2017 Redgate. All Rights Reserved.