If you are using the Tabular model in SSAS, it uses the xVelocity in-memory engine to load your entire database into memory (compressing it heavily along the way). So what happens if your database is too big to fit in memory? You will get this error when you process the model:
“The following system error occurred: Insufficient quota to complete the requested service.
Memory error: Allocation failure. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.
The current operation was cancelled because another operation in the transaction failed.”
This happens because, by default, no paging to disk is allowed when the data is too big for the available memory on the machine where the model resides. To change this, open the Analysis Services server properties (right-click the Tabular instance in SSMS and choose Properties) and set the VertiPaqPagingPolicy property to 1 or 2, which allows the engine to page data to disk when it runs low on memory.
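If you prefer to script the change instead of using the SSMS properties dialog, an XMLA Alter command along these lines should do it. This is a sketch, not an exact script: `YourTabularInstance` is a placeholder for your server's actual ID and Name, and the safest route is to let SSMS generate the precise script for you via the Script button on the properties dialog.

```xml
<!-- Sketch of an XMLA Alter that changes a server property.
     Replace YourTabularInstance with your server's real ID/Name. -->
<Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
       xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object />
  <ObjectDefinition>
    <Server>
      <ID>YourTabularInstance</ID>
      <Name>YourTabularInstance</Name>
      <ServerProperties>
        <ServerProperty>
          <!-- 0 = no paging (default); 1 or 2 = allow paging to disk -->
          <Name>Memory\VertiPaqPagingPolicy</Name>
          <Value>1</Value>
        </ServerProperty>
      </ServerProperties>
    </Server>
  </ObjectDefinition>
</Alter>
```

Run it in an XMLA query window in SSMS connected to the Tabular instance.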
Another solution is to use DirectQuery mode, which bypasses the in-memory model entirely: client applications query the data directly at the source. In that case the size of the database no longer matters, since the data is never loaded into memory. The trade-off, of course, is much slower queries.
Memory Settings in Tabular Instances of Analysis Services
Undo Bad Tabular Mode Analysis Server Properties
Woops I ran out of memory while processing my tabular model