
James Serra's Blog

James is a big data and data warehousing technology specialist at Microsoft. He is a thought leader in the use and application of Big Data technologies, including MPP solutions involving hybrid technologies of relational data, Hadoop, and private and public cloud. Previously he was an independent consultant working as a Data Warehouse/Business Intelligence architect and developer. He is a prior SQL Server MVP with over 30 years of IT experience. James is a popular blogger and speaker, having presented at dozens of PASS events including the PASS Business Analytics Conference and the PASS Summit. He is the author of the book "Reporting with Microsoft SQL Server 2012". He received a Bachelor of Science degree in Computer Engineering from the University of Nevada-Las Vegas.

What happens when an SSAS Tabular model exceeds memory?

If you are using the Tabular model in SSAS, it will use the xVelocity engine to load your entire database into memory (greatly compressing the database).  So what happens if your database is too big to fit in memory?  You will get this error when you process the model:

“The following system error occurred: Insufficient quota to complete the requested service.

Memory error: Allocation failure. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.

The current operation was cancelled because another operation in the transaction failed.”

This happens because, by default, no paging to disk is allowed if the data is too big for the amount of available memory on the machine where the model resides.  To change this, go to the Analysis Server properties (by right-clicking the Tabular instance in SSMS and choosing Properties) and change the property VertiPaqPagingPolicy to 1 or 2:

  • 0 (the default): no paging is allowed.  If memory is insufficient, processing fails with an out-of-memory error.  All Tabular data is locked in memory.
  • 1: enables paging to disk using the operating system page file (pagefile.sys).  Only hash dictionaries are locked in memory, allowing Tabular data to exceed total physical memory.
  • 2: enables paging to disk using memory-mapped files.  Only hash dictionaries are locked in memory, allowing Tabular data to exceed total physical memory.
When VertiPaqPagingPolicy is set to 1 or 2, processing is less likely to fail due to memory constraints because the server will try to page to disk using the method that you specified.  Note that the related property VertiPaqMemoryLimit specifies the level of memory consumption (as a percentage of total memory) at which paging starts.  The default is 60, so if memory consumption stays below 60 percent of total memory, the server will not page to disk.
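As a sketch, the same settings can also be changed outside of SSMS by editing the msmdsrv.ini file of the Tabular instance (the install path and MSAS11 folder name below assume a default SQL Server 2012 installation; the file is abbreviated to the relevant section):

```xml
<!-- msmdsrv.ini (abbreviated), typically found under
     C:\Program Files\Microsoft SQL Server\MSAS11.<InstanceName>\OLAP\Config -->
<ConfigurationSettings>
  <Memory>
    <!-- 0 = no paging (default), 1 = page via pagefile.sys, 2 = memory-mapped files -->
    <VertiPaqPagingPolicy>1</VertiPaqPagingPolicy>
    <!-- percentage of total memory at which paging starts (default 60) -->
    <VertiPaqMemoryLimit>60</VertiPaqMemoryLimit>
  </Memory>
</ConfigurationSettings>
```

If you edit the ini file directly rather than using SSMS, the Analysis Services service generally needs a restart to pick up the change.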

Another solution is to use DirectQuery mode, which bypasses the in-memory model entirely, so client applications query the data directly at the source.  Then the size of the database no longer matters, since none of it is held in memory.  The trade-off, of course, is much slower queries.
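Before resorting to either workaround, it can help to see how much memory the instance is actually consuming.  One way (a sketch, assuming you can connect to the Tabular instance from an MDX query window in SSMS) is to query the built-in memory usage DMV:

```sql
-- Run against the Tabular instance in an MDX query window;
-- returns a row per memory holder with its current usage
SELECT * FROM $SYSTEM.DISCOVER_MEMORYUSAGE
```

Comparing the totals against the machine's physical memory gives a rough idea of whether paging (or DirectQuery) is likely to be needed.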

More info:

Memory Settings in Tabular Instances of Analysis Services

Memory Properties

Undo Bad Tabular Mode Analysis Server Properties

Woops I ran out of memory while processing my tabular model

