Memory error: The operation cannot be completed because the memory quota estimate (4095MB).....

  • Hello

    I'm having the identical problem. In fact, the memory quota values are quite similar..... the estimate (3413MB) exceeds the available system memory (1331MB).

    The cube processing was not an issue up until last week. The ETL process that loads the back-end database has not changed and we are not introducing large quantities of data. The dimensions process fine but the cube chokes! We just upped the server memory by an additional 2GB and still get the same error. We are on SP2 for SQL Server. Any help would be appreciated.

  • To help you with the diagnosis, I suggest you post the following information:

    The following counts:

    - records per dimension

    - attributes per dimension

    - records in the fact table (or records per partition)

    As well, it's helpful to understand (from a scale perspective):

    - size of the source database

    - time to process

    - size of the cube (or last available size)

    - physical machine (64-bit vs 32-bit, memory, # CPUs, etc.)
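    If it's quicker, those record counts can come straight from the relational source with queries along the lines of the ones below (the table and column names are just placeholders; substitute your own dimension and fact tables):

    -- row counts for a dimension table and a fact table (names are examples only)
    SELECT COUNT(*) AS DimCustomerRowCnt FROM dbo.DimCustomer;
    SELECT COUNT(*) AS FactSalesRowCnt FROM dbo.FactSales;

    -- approximate rows per partition, assuming the measure group is partitioned by month
    SELECT YEAR(OrderDate) AS Yr, MONTH(OrderDate) AS Mth, COUNT(*) AS RowCnt
    FROM dbo.FactSales
    GROUP BY YEAR(OrderDate), MONTH(OrderDate);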

    I have to reinforce that a typical BI project isn't about taking transaction-level data and simply pushing it into a cube; it involves some degree of design and compromise around the core business needs. In a pinch (when faced with endless requirements), consider designing one approach to solve 80% of the needs and another approach to solve the remaining 20%.

  • I have the same situation. In SSAS, processing fails at the very end, saying the quota estimate (3233MB) exceeds the maximum of 1.3GB. I haven't changed anything since last month, apart from adding one month of data.

    It is failing on adding one 320MB dimension into the cube.

  • If you'd like some help, I'll need the counts described in my previous post. That'll generally point directly to the disconnect.

  • OK, I have the same problem. Hooray! It would have been nice if this was the first thing Microsoft told me, not now, 90% into the project with deadlines looming. So I need 1619MB and apparently only have 1331MB available on the 16GB 32-bit server. This is whilst processing as little as possible (only the cube, with all dimensions preprocessed, in separate sequential transactions). I've had IT add the /3GB switch to Boot.ini, and indeed when I check the file the switch is there. Is there any way to check whether the 3GB switch is making a difference? SQL Server is running on the same box. Is there some way I can restrict it from hogging the resources whilst I'm trying to process the cube, or am I pretty much just screwed here? A 64-bit server is a no-go.
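    The only lead I've found so far is capping SQL Server's buffer pool with sp_configure, something like the sketch below (the 12288MB figure is just a guess at leaving headroom for SSAS on this box), but I'd appreciate confirmation that this is the right approach:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    -- cap the SQL Server buffer pool so SSAS can get at the rest of the RAM
    EXEC sp_configure 'max server memory (MB)', 12288;
    RECONFIGURE;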

  • Edited to remove... misposted...

    Sorry folks...

    Steve

    (aka smunson)

    :-):-):-)

    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • Hello

    Not sure if this will help in your situation; however, I was having much grief with this issue and I eventually alleviated the problem by breaking the updates down into several pieces. I have an SSIS process that first brings over my raw data, then a step that stops and restarts the SSAS service, then another step that processes the dimensions (full), and then a final step that reprocesses the cubes.
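    The stop/start step is nothing fancy: just an Execute Process task running a couple of commands along these lines (the service name below assumes a default SSAS 2005 instance; a named instance would be MSOLAP$YourInstanceName):

    net stop "MSSQLServerOLAPService"
    net start "MSSQLServerOLAPService"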

    This has been running nightly for the last three weeks solid.

    P.S. Prior to doing all this, I believe something may have been corrupted in the production cube metadata, so I used SSMS (SQL Server Management Studio) to delete all the dimensions and cubes from the production server. Once this was done I redeployed my BI project back onto the production server.

    Somewhere I did something right, because I no longer have the issue!

  • Thanks for your replies!

    I'm trying to process the cube for the first time. Again, I've preprocessed all the dimensions and the data has been copied over.

    To simulate your process I restarted SSAS manually and then tried to run the cube process. Still no happiness. It won't budge on the 1331MB available. Even after I just had an accident with the restart SQL Server service button on a production box :-D. Anyway, I've tried setting the memory limits, without any result; it still says the same thing, and it is back on the defaults now. I see a couple of posts where people have more than 2GB available with the /3GB switch. If you can believe SSAS, I only need 300MB more. Any suggestions to get at least 2GB available for SSAS? It is a 32-bit server with 16GB of RAM, running SQL Server, SSIS and SSAS. No SSIS packages are running currently.

  • Hello, I double-checked the registry and it seems that IT didn't actually select the 3GB option at startup. Hooray? No: it seems nothing can connect to the server when they enable the switch, and they have more or less given up on trying to fix it :crying:

    Below is the boot.ini from the server; is this a common problem? It is a 32-bit Dell with Windoooze 2003 SP1 with PAE enabled.

    [boot loader]

    timeout=10

    default=multi(0)disk(0)rdisk(0)partition(2)\WINDOWS

    [operating systems]

    multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Windows Server 2003, Enterprise" /fastdetect /NoExecute=OptOut

    multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Windows Server 2003, Enterprise /3G" /fastdetect /NoExecute=OptOut /3GB

    I don't really know what any of it means except for the /3GB part, so any help would be appreciated.

    Thanks in advance.

  • There are two entries for the same Windows folder within the "operating systems" section. I'm not sure why you would have that particular setup, other than perhaps to give a choice at boot time between running with or without the /3GB switch. As you'd need a reboot anyway to change the setting, why bother with the extra entry? Either it works or it doesn't, and with this setup it's quite possible that the default entry (the one without /3GB) goes into effect if no one specifically selects the /3GB entry during a reboot. So if someone forgets to MAKE that choice at boot time, the setting doesn't get applied, and an unattended reboot could "de-activate" the fix.
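    If /3GB is what you want every time, the safest fix is probably to collapse it to a single entry, along these lines (back up Boot.ini first; this is a sketch of the idea, not something I've tested on your box):

    [boot loader]
    timeout=10
    default=multi(0)disk(0)rdisk(0)partition(2)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Windows Server 2003, Enterprise" /fastdetect /NoExecute=OptOut /3GB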

    Steve

    (aka smunson)

    :-):-):-)


    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • I had similar issues and found a solution: I loaded the cube onto a SQL 2008 SSAS server but used a SQL 2005 source for the data load. I didn't want to go through all of this, and since I already had that server up and running, it was easy to try out.

  • The estimate done by SSAS 2005 is not accurate in some scenarios, and the processing may not actually require the amount of memory estimated. If that's the case, try setting the SSAS advanced property "OLAP\ProcessPlan\MemoryLimitErrorEnabled" to false to tell SSAS to ignore the error caused by the estimate, and start the processing again.
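    You can change it in SSMS (right-click the Analysis Services server, choose Properties, and tick the option to show advanced properties), or directly in msmdsrv.ini if you prefer; back that file up first and restart the service afterwards. In the ini file the property should sit under the OLAP\ProcessPlan element, roughly like this (a sketch only, with the surrounding elements trimmed):

    <OLAP>
      <ProcessPlan>
        <MemoryLimitErrorEnabled>0</MemoryLimitErrorEnabled>
      </ProcessPlan>
    </OLAP>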

    This worked for me once before. But if the processing really requires the estimated memory, it will still fail.

  • Hi, thanks guys. I've tried the setting to skip the memory estimate, but got the following error: “Internal error: An unexpected error occurred (file 'pffilestore.cpp', line 3311, function 'PFFileStore::HandleDataPageFault')”. I see there is a fix for that bug.

    Anyway, a bit of a progress report: I've taken away the biggest and most complex dimension and tried to at least process the cube without it. The cube starts processing, but I then get an error complaining that it cannot fit a string into something. It looks like this bug has also been fixed.

    Question then: how many cumulative update packages are now available for SP2, and have any of the memory-estimate problems been addressed in them? Is there an SP3?

    Long and short of it: we'll probably have to buy processing space on the 64-bit enterprise cluster for this to work, at huge expense. Not so easy to sell the idea to management in the economic downturn.

    This proof of concept has convinced me that SQL 2005 is not an option for medium-scale BI. For any future projects we'll probably look at products from IBM or revert to SAS. Hopefully, by the time we upgrade to SQL 2008, these issues will have been resolved.

  • For any future projects we'll probably look at products from IBM or revert to SAS

    Don't be in such a hurry to write off SSAS. Can you still buy a version of DB2 from IBM, or a version of SAS, that runs on 32-bit? Most other platforms moved their enterprise computing to 64-bit many years ago. Also, SQL 2005 is now 4 years old - have you tried SQL 2008 SSAS on 64-bit? Most likely your hardware is 64-bit capable; it just needs a 64-bit OS with SSAS put on it.

    If you are going to compare SSAS to competitors, you should at least have a level playing field regarding 32 or 64-bit.

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • Yes, there is an SP3 for SQL 2005. I have the Developer Edition of SQL Server 2005 on 64-bit Vista Ultimate and it's running fine, but then I don't do much with it at home except try to learn more about the administrative stuff. I know that others on the forum have SP3 in place, and if I recall correctly, there's even a thread about it.

    Steve

    (aka smunson)

    :-):-):-)


    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)
