Deploying a very large Tabular Cube

  • Hi,

    We have a large data warehouse with a number of fact and dimension tables. As part of building out an SSAS cube, we built a view that stacks all of the fact tables together. Once stacked, the view contains about 1.5 billion rows (~40 columns).

    I tried to process it yesterday and got the following error:

    "Memory error: Allocation failure : The paging file is too small for this operation to complete. . If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine."
    Memory error: Allocation failure . If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.
    The current operation was cancelled because another operation in the transaction failed."

    I am guessing the data size is too large.

    Question 1: What are my options here?

    Question 2: Is there a way to estimate the size of the cube and then figure out how large the SSAS server needs to be to accommodate its current size and future growth?

    Thank you in advance!

  • If you are doing a full process of the cube, you need twice the memory (more or less) that the data will take. You can shrink this by partitioning the cube. Then you only need twice the memory for the partition being processed. That's assuming you have enough memory. If not, then you can shrink the data or add memory.
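
    Partitions in a Tabular model are typically defined by giving each one a filtered source query against the relational view, so each processing operation only reads a slice of the data. A minimal sketch of what those per-partition source queries could look like, assuming a hypothetical dbo.vFactCombined view over the stacked facts with an integer DateKey column (the names are placeholders, not from this thread):

        -- Hypothetical source query for a partition covering 2017
        SELECT * FROM dbo.vFactCombined WHERE DateKey BETWEEN 20170101 AND 20171231;

        -- Hypothetical source query for a partition covering 2018
        SELECT * FROM dbo.vFactCombined WHERE DateKey BETWEEN 20180101 AND 20181231;

    Processing one such partition at a time keeps the working set close to that partition's data rather than the full 1.5 billion rows.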

  • For question #2, it's not an exact science, but if you figure the uncompressed size of the data and it follows a typical pattern then it's approximately 10% of the uncompressed size.
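
    As a rough worked example of that rule of thumb, the uncompressed size of the source tables can be pulled from the catalog views and multiplied by 10%. A minimal T-SQL sketch, assuming hypothetical fact table names (swap in your own):

        -- Uncompressed size of the source fact tables, and a ~10% rule-of-thumb cube estimate
        SELECT  SUM(ps.used_page_count) * 8 / 1024.0 / 1024.0        AS uncompressed_gb,
                SUM(ps.used_page_count) * 8 / 1024.0 / 1024.0 * 0.10 AS estimated_cube_gb
        FROM    sys.dm_db_partition_stats AS ps
        JOIN    sys.objects AS o ON o.object_id = ps.object_id
        WHERE   o.name IN (N'FactSales', N'FactReturns')   -- hypothetical table names
        AND     ps.index_id IN (0, 1);                      -- count only heap/clustered data pages

    This is only an estimate; actual compression depends heavily on column cardinality and data types.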

  • squareoff - Friday, July 13, 2018 8:50 AM

    A few questions.
    First, what is the hardware you are trying to run this on: CPU/RAM (total and available for SSAS), paging file max size, and free space on the paging file drive?
    OS version (32-bit or 64-bit)?
    SQL Server version?

    With regard to processing, are you doing a Process Full or something else?

    And with regard to the cube itself:
    Are you telling us that you have multiple fact tables in a database and you combined them all into a single table/view? This may not be the best option, even just from a usage point of view, unless all of the source fact tables have the same granularity.

    On the processing: at what stage did it fail?
    Read Data
    Process Data
    Process Measures
    Process hierarchies
    or other?

    And how many measures/calculated fields do you have on top of the columns you are processing?

    And finally, what are the data types of the columns and their cardinality?

  • Hi frederico_fonseca,

    I am having a similar issue. I have 23 tables in the cube, of which 18 are dimensions and 5 are facts. The issue is at the initial stage, i.e. importing the tables. All of the tables except 2 facts are loaded. The facts that haven't loaded have 25Cr records. When I try to load those tables individually, they fail after importing 5Cr records. Can you please help me out here? Below are the details you asked squareoff for in the earlier post.

    A few questions

    First, what is the hardware you are trying to run this on - CPU/RAM (total and available for SSAS)? - Total 64GB, SSAS 8GB (calculated based on total memory and max server memory for SQL); maximum memory for SQL is 50GB

    Paging file max size -

    free space on paging file drive -

    OS version (and 32 or 64 bit) - 64 bit

    SQL Server version - 2016

    With regard to processing, are you doing a Process Full or something else? - Haven't reached the processing stage yet

    And with regard to the cube itself:

    Are you telling us that you have multiple fact tables in a database and you combined them all into a single table/view? - Yes

    It is possible this is not the best option, even just from a usage point of view, unless all of the source fact tables have the same granularity.

    On the processing: at what stage did it fail? - Read Data

    And how many measures/calculated fields do you have on top of the columns you are processing? - None

    And finally, what are the data types of the columns and their cardinality? - int, bigint, float, numeric, varchar

    We have tried the below:

    • Setting VertiPaqPagingPolicy to 2.
    • Importing only the fact table in a new project.
    • Changing the default values of LowMemoryLimit and TotalMemoryLimit in the SSAS server properties.

    Thank you,

    Anusha

    What is 25Cr records? Please give the full number, e.g. 25 000 000 or 2 500 000 000.

    If it is what I think it is, then you had better get another 400 GB of RAM on that server, as it is well under-specced.

    But 8GB is too low in any case - for your workload you most likely need a minimum of 16GB allocated to Tabular.

  • Hi frederico_fonseca,

    total record count = 253 405 059

    I hope the memory figure I mentioned for SSAS is correct; I calculated it as below:

    Total memory - 64GB;  DB engine (max server memory) - 50GB;  OS - 4GB

    So for SSAS = 64 - 50 - 4 = 10GB approx.

    I am attaching the DB server properties.

    Can we try reducing the max server memory limit on the DB server and setting it to 30GB? Will there be any implications if we change this? Kindly suggest.
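
    For reference, a minimal sketch of how the Database Engine's memory cap could be lowered, using the 30GB figure from the question, so that more RAM is left for SSAS; the main implication is a smaller buffer pool for the DB engine, so its queries may read from disk more often:

        -- Lower the Database Engine's max server memory to 30 GB (value is in MB)
        EXEC sys.sp_configure N'show advanced options', 1;
        RECONFIGURE;
        EXEC sys.sp_configure N'max server memory (MB)', 30720;
        RECONFIGURE;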

     

  • Hi frederico_fonseca,

    How can we know how much memory is allocated to the Tabular instance? Can you please help us out here?

    Also, we have lowered the memory allocation for the SQL DB instance and we are trying to load the data. It has been running since yesterday evening and the import has reached 55,400,001 records. Is there anything that we are missing? Kindly suggest.

    Thank you,

    Anusha

    What you are missing is a server with a spec capable of holding your volume of data.

    If you wish to know the bare minimum memory required for your Tabular databases, do as follows (a T-SQL sketch of these steps is included at the end of this post):

    Create a normal SQL Server database.

    Create a columnstore table for each table you have in the cube.

    Insert the same data that goes into the cube into these tables.

    Your required memory will be the size of the tables in this database plus 2.5 times the size of your biggest table.

    This is your minimum size - depending on your other measures and cube usage, you may need a lot more memory.

    Note that this minimum also takes into consideration that you do the cube processing in stages instead of all in one go,

    e.g. Process Clear, followed by Process Data, followed by Process Recalc.
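
    A minimal T-SQL sketch of the estimation steps above, assuming a single hypothetical fact table copied from the stacked view; repeat the pattern for each cube table (all names and columns are placeholders):

        -- 1. A scratch database to hold the columnstore copies
        CREATE DATABASE CubeSizeEstimate;
        GO
        USE CubeSizeEstimate;
        GO

        -- 2. A columnstore copy of one cube table (hypothetical column list)
        CREATE TABLE dbo.FactCombined_CS
        (
            DateKey     int,
            CustomerKey int,
            Amount      decimal(18, 2)
            -- ... remaining ~40 columns from the source view
        );
        CREATE CLUSTERED COLUMNSTORE INDEX cci_FactCombined_CS ON dbo.FactCombined_CS;
        GO

        -- 3. Load the same data the cube would load (hypothetical source view)
        INSERT INTO dbo.FactCombined_CS (DateKey, CustomerKey, Amount)
        SELECT DateKey, CustomerKey, Amount
        FROM   SourceDW.dbo.vFactCombined;
        GO

        -- 4. Measure the compressed size; minimum memory = total size of all copies
        --    plus 2.5 x the size of the biggest table
        EXEC sp_spaceused N'dbo.FactCombined_CS';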
