Tabular Model Memory Issues

  • I have a tabular model which has recently started to run out of memory when the job that performs the full process is run:

    (Memory error: Allocation failure: The paging file is too small for this operation to complete.)

    VertiPaqPagingPolicy is set to 1 so the instance can page to the pagefile, but in this case even that is not sufficient.

    What options might I consider to resolve this? There are some calculated columns, so my understanding is that switching to DirectQuery isn't an option.

    Would partitioning help in this scenario? If so, how does it work with an in-memory model - does it load the requested partitions into memory and leave the rest on disk somewhere?

    Cheers

  • When a full process is performed on a tabular database, the previous copy of the model is kept in memory (and remains queryable) until the new one is committed, so you need roughly twice as much memory as the model uses during normal operation - for example, a model that occupies 20 GB needs about 40 GB to complete a Process Full.

    Two options:

    1) Run Process Clear followed by Process Full - this deletes all data from the model before loading it again, so only one copy is held in memory at a time. Back up the database before doing this, since the model is empty until the full process finishes; a sketch of the XMLA is shown after this list.

    2) Process incrementally at the table level instead of running Process Full on the whole database, so only part of the model is held twice in memory at any one time. This requires a script - see the Process Add example at http://www.sqlbi.com/articles/incremental-processing-in-tabular-using-process-add
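    For option 1, a minimal XMLA sketch of the Clear + Full batch (assuming the database ID is MyTabularDB - the ID can differ from the display name, so check it in the database properties). Run it from an XMLA query window in SSMS or from the processing job:

    <Batch Transaction="false" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <!-- Transaction="false" makes each command commit separately, so the
           memory freed by the clear is released before the full process starts -->
      <!-- Step 1: drop all data so the old copy of the model is not kept in memory -->
      <Process>
        <Type>ProcessClear</Type>
        <Object>
          <DatabaseID>MyTabularDB</DatabaseID>
        </Object>
      </Process>
      <!-- Step 2: reload everything from the data sources -->
      <Process>
        <Type>ProcessFull</Type>
        <Object>
          <DatabaseID>MyTabularDB</DatabaseID>
        </Object>
      </Process>
    </Batch>

    The trade-off is that the model is empty and unqueryable from the clear until the full process commits, which is why backing up the database first is recommended.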

  • Thanks, that worked perfectly. I selected Process Clear in SSMS, re-ran the Process Full, and it ran through completely this time.
