Memory error: The operation cannot be completed because the memory quota estimate (4095MB).....

  • Memory error: The operation cannot be completed because the memory quota estimate (4095MB) exceeds the available system memory (1996MB).

    I've been stuck with this error for a few days now and could really use some help.

    The error occurs when processing a very large dimension in SSAS. I have tried the different process methods: Process Add, Process Update, and Process Full (the full process was performed quite some time back).

    SQL Server 2005 32-bit, SP2, with the /3GB switch and AWE enabled in both the OS and SQL Server.

    Total RAM is 8 GB, and the perfmon counters show at least 3.5 GB of RAM available throughout the processing. The dimension has two varchar(2000) columns, and the underlying table is 17 GB in size with approximately 33 million records.

    SQL Server max memory has been set to everything from the running values up to a maximum of 4 GB, and I have tried processing against both a warm and a cold cache. Other SSAS properties have been changed as well: memory limit enabled is set to false and the buffer limit is set to 60%. Any other suggestions that might help would really be appreciated.

    I am fairly confident I have read every post and article on performance tuning of SSAS and on this particular issue, so I really have run out of options 🙁

    Jayanth Kurup

  • What data mining method are you trying to run for the model? Try upping your settings for the minimum number of cases and/or the probability percentage. The advanced settings have a huge impact on the performance of the model and on how much memory it is crunching.

    Remember that it is not necessary to process every row of data in order to generate an accurate model. For instance, out of 100,000 rows you only need to randomly sample about 1,056 of them to get a 95% confidence level with a ±3% margin of error, and you only need to up that to about 1,815 for a 99% confidence level.
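
    Purely as an illustrative sketch (not part of the original reply), those figures follow from the standard sample-size formula with a finite population correction; the population size, confidence levels, and margin of error below are just the example values quoted above.

    ```python
    import math

    def sample_size(population, z, margin_of_error, p=0.5):
        """Simple random sample size for estimating a proportion.

        Uses n0 = z^2 * p * (1 - p) / e^2, then applies the finite population
        correction. p = 0.5 is the most conservative (largest-sample) choice.
        """
        n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)
        return math.ceil(n0 / (1 + (n0 - 1) / population))

    # z ~ 1.96 for 95% confidence, z ~ 2.58 for 99% confidence
    print(sample_size(100_000, 1.96, 0.03))  # 1056
    print(sample_size(100_000, 2.58, 0.03))  # 1816, i.e. roughly the 1,815 quoted above
    ```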

    Also, the extra memory won't do anything for the machine; it does not matter that you set AWE and PAE. In the long term you need to move to 64-bit in order to use all that memory.

    In the short term, make sure that you are not trying to process the model from within BI-Studio installed on the same machine as the database server. The BI-Studio tool can easily climb in memory usage for larger models and will consume a couple of hundred megs that the Analysis Services engine could otherwise be using.

    --Dave

  • Currently we are not using a data mining model. All we have is a MOLAP cube.

    I have tried processing the dimension using XMLA, a Data Flow task, and the GUI, but no luck. I understand that there is a limit to the addressable memory in 32-bit. A 64-bit upgrade is in the pipeline, but I need to have this fixed before we reach that stage, which might be a couple of months away.

    Jayanth Kurup

  • Sounds like there's something dramatically wrong with your cube definitions. Without some more information, it's impossible to say. At this size, you should be able to pull in and process the cube (even on a much smaller machine).

    How many dimensions do you have? Can you process the dimensions individually? How many rows in each of the dimensions?

    If you could post a schema, it might be easier to spot the problem.

    BTW, it might be a good idea to plan on defining partitions, especially on that hardware. It'll have a dramatic effect on performance. I recommend you design partitions around your access requirements (not your updates).

    Cheers.

  • Double-check your tempdb file count and make sure you have one tempdb data file for every logical CPU in the machine. Even if you put the additional tempdb files on the same spindle you should see a performance gain, but of course it is better if they are on separate spindles.
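
    As a rough sketch of what that can look like (the CPU count, drive path, and file sizes below are placeholders, not details from this thread), you can generate one ALTER DATABASE statement per extra data file and review the output before running it:

    ```python
    # Sketch: emit ALTER DATABASE statements that add tempdb data files until the
    # file count matches the logical CPU count. Path, size and growth are assumed
    # values -- review and adjust the generated SQL before running it.
    logical_cpus = 8            # assumed; use your server's logical CPU count
    existing_files = 1          # tempdb ships with a single data file (tempdev)
    data_path = r"T:\TempDB"    # assumed location for the additional files

    for i in range(existing_files + 1, logical_cpus + 1):
        print(
            f"ALTER DATABASE tempdb ADD FILE ("
            f"NAME = tempdev{i}, "
            f"FILENAME = N'{data_path}\\tempdev{i}.ndf', "
            f"SIZE = 1024MB, FILEGROWTH = 256MB);"
        )
    ```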

    TempDB gets hit hard by distinct counts, so you may have some I/O stalling there.

    Also, you may find this whitepaper on Analysis Services tuning from Unisys helpful, as it tackles MOLAP cube performance.

    http://www.unisys.com/eprise/main/admin/corporate/doc/Project_Real_Enterprise_Class_Hardware_Tuning_for_Microsoft_Analysis_Services.pdf.

    --Dave

  • Wait a minute. What was flagging this error? Was it MSAS or SQL Server? If it was SQL Server (while building the cube), then try this:

    Change the number of connections on your data source to 1 and run your cube build with one thread. This'll run much slower, but it will help narrow down the problem (i.e. to a specific SQL statement).

    Assuming things run through fine (isn't it always the way), start increasing the number of connections and threads until you reproduce the problem. When running on the same box, I generally run one thread for every CPU and one connection for every two threads. Of course, if you haven't defined partitions, it won't make much of a difference. If you're running on separate boxes, use one thread and one connection for every CPU.
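
    Purely as an illustration of that rule of thumb (the CPU counts are made up; you should still step the values up gradually as described above):

    ```python
    def suggested_processing_settings(cpu_count, same_box=True):
        """Rule of thumb from the post above: one processing thread per CPU and,
        when SSAS and the relational source share a box, one source connection
        per two threads; on separate boxes, one connection per CPU."""
        threads = cpu_count
        connections = max(1, threads // 2) if same_box else cpu_count
        return threads, connections

    print(suggested_processing_settings(4, same_box=True))   # (4, 2)
    print(suggested_processing_settings(8, same_box=False))  # (8, 8)
    ```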

    If this doesn't seem like it's taking proper advantage of the hardware, most likely you need to check your database storage design.

  • The error was flagged by SSAS, and we tried processing using one thread. It was quite slow but it did the job. The problem is we need it to finish processing much faster than this, as the next set of data is ready for load by the time this step has completed.

    Also, we have separated SSAS from the database server and we still get the same error, so TempDB might not be the issue.

    Jayanth Kurup

  • After rereading your original post, I realised your dimension has 33 million records (when I originally assumed you were talking about the data table).

    This is indeed a very large dimension. Any chance it doesn't have to be this big? Also, if it only has two columns (as you stated), there aren't going to be many aggregations possible.

    I suspect you're going to need a bigger (and 64-bit) box if you want to maintain this design. I believe MSAS loads all of the dimension information into memory for aggregation purposes, so you can see the potential problem here.
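
    To make that potential problem concrete, here is a back-of-envelope estimate (the average string lengths are pure guesses, not figures from this thread) of how much memory 33 million members with two varchar(2000) attributes could need for the text alone, before any per-member SSAS overhead:

    ```python
    # Rough, assumption-laden estimate of the raw text volume in this dimension.
    members = 33_000_000
    columns = 2
    avg_chars_per_column = 100   # assumed average fill of each varchar(2000)

    raw_bytes = members * columns * avg_chars_per_column
    print(f"~{raw_bytes / 1024**3:.1f} GB of text at {avg_chars_per_column} chars per column")
    # ~6.1 GB even at a modest 100 characters per column; at the full 2,000
    # characters it would be ~123 GB -- far beyond a 2-3 GB 32-bit address space.
    ```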

    Also: When you stated it had two columns, each varchar(2000), what was the primary key? And is it possible to offboard these columns, adding a foreign key? Although MSAS will assign internal IDs to these characteristics, the resulting ID referenced in MDX is quite unwieldy.

    If you can post some information about your application, it might be possible to offer some suggestions.

  • I am having the same problem. I have a dimension "General Ledger Transaction" that has around 30 million rows, and it will grow over time. When I process the dimension, it processes all the attributes but then gives the following error:

    Memory error: The operation cannot be completed because the memory quota estimate (3422MB) exceeds the available system memory (1331MB).

    Any ideas or solutions?

    Thanks,

    Ambreen

  • The general angst over the inability to process large amounts of data with rather smallish hardware is interesting to observe. Unfortunately, SQL Server is going to need larger hardware to process larger quantities of data. That concept is going to apply to ALL database server products, and not just SQL Server. I just can't imagine trying to run SQL Server with anything less than 4 GB of RAM in the server, and I'd call that the very lowest configuration that's worth working with. After all, you can't really do very much with Windows XP without giving it at least 2 GB of RAM, so expecting SQL Server to survive on a similar level of resources just seems extraordinarily unrealistic if what's desired is to process millions of rows of data, especially with SSAS involved.

    I would strongly recommend folks go exclusively with the 64-bit editions of SQL Server so as to allow for some level of growth in RAM usage. I'd also want to have as many disk spindles as can reasonably be attached to the server. It might cost more, but in the long run it will likely cost less because fewer hardware upgrades and/or outages will occur. Over-buying a server now can significantly reduce future growth costs. If only "Corporate America" management could realize that...


    Steve (aka sgmunson) 🙂 🙂 🙂
    Rent Servers for Income (picks and shovels strategy)

  • Ambreen: Are you sure that "General Ledger Transaction" should be a "dimension" (as opposed to a data table)? If you can post your data model, I may be able to offer some additional comments.

    Jaykay: Same comment. I didn't see answers to the questions I raised in a previous post. Did you solve your problem?

    For all: 33 million members for a single dimension is a very significant dimension. Remember that when you design a cube (and it does require design skills), the resulting reference structure is a function of the cross product of all of the members in all of the dimensions. For example, assuming this dimension has 33 million rows, and there's a weekly time dimension plus a "location" dimension of some sort (with, say, 2,000 rows), the resulting "potential" cells for 2 years come to nearly 7 trillion! Of course, MSAS doesn't actually create all of that, but you can see that this example isn't a trivial cube. Your cube may be bigger or smaller, depending on the specifics.
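
    A quick worked version of that cross product, using only the illustrative member counts above:

    ```python
    # Theoretical cell space of the example cube -- not what MSAS materialises.
    large_dimension_members = 33_000_000   # the dimension discussed in this thread
    weeks = 2 * 52                         # two years of a weekly time dimension
    locations = 2_000                      # the assumed "location" dimension

    potential_cells = large_dimension_members * weeks * locations
    print(f"{potential_cells:,}")          # 6,864,000,000,000 -- nearly 7 trillion
    ```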

  • One point from the OP...

    SSAS is unable to use Windows PAE, so in 32-bit mode the maximum amount of memory available to SSAS is 2 GB, unless you have the /3GB switch in boot.ini, in which case it can use 3 GB. Out of this allowance you have to give space to SSAS itself, plus enough space to load all dimension data. If you are running anything else on the box apart from SSAS (anti-virus, SQL database services, etc.) then take away the memory used by those processes. What is left over can be used for queries in SSAS.

    The error message says that almost 2GB is available for your query, but SSAS can only look at the memory allocated to its own address space; it cannot know about memory used by other processes. This means that if SSAS used all of the 2GB it reports as being available, it is possible that the total server demand for sub-4GB memory would exceed 4GB, causing Windows to do excessive paging. Remember, the only things that can use memory above the 4GB line in 32-bit Windows are processes designed to exploit AWE, so it is possible to have lots of unused memory above the 4GB line while Windows has paged itself to a standstill due to sub-4GB usage.
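
    As a hedged sketch of the arithmetic behind this (the overhead figure is an assumption, not a measurement from this thread), compare the quota estimate in the original error with what a 32-bit SSAS process can actually address:

    ```python
    # 32-bit user-mode virtual address space available to the SSAS process.
    has_3gb_switch = True                  # /3GB is set in boot.ini, as in the OP's setup
    user_vas_mb = 3072 if has_3gb_switch else 2048
    ssas_overhead_mb = 500                 # assumed: executable, caches, other dimensions
    usable_mb = user_vas_mb - ssas_overhead_mb

    quota_estimate_mb = 4095               # from the error message at the top of the thread
    print(f"Usable address space ~{usable_mb} MB vs. quota estimate {quota_estimate_mb} MB")
    # A 4095 MB quota can never fit in a 32-bit process's 2-3 GB address space, no
    # matter how much physical RAM (8 GB here) the box has, because SSAS cannot use
    # AWE/PAE memory above the 4 GB line.
    ```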

    As other people have said, if you want to use large amounts of data in SSAS you need to use a 64-bit OS and 64-bit SSAS. This allows you to fully use all the memory you have on your box.

    If you cannot upgrade to 64-bit, then you have to reduce the amount of data you are dealing with. Or tell the users to wait until the 64-bit upgrade is complete.

    Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.

    When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara

  • Thanks. I tried to process it on a 64-bit machine. The dimension itself took more than 2 hours to process, but it ran successfully.

  • Hi Dave:

    Actually, General Ledger Transaction is a fact table, but I added it as a dimension because the users want to drill down to transaction details (i.e. transaction number, description, etc.).

    What will be the best way to implement this drill down?

    Thanks,

    Ambreen

  • Leave it as a fact table. You can implement character fields in MSAS. (Different reporting engines may treat the information differently). Some notes:

    - to get to the data, the user will obviously have to know to fully "qualify" the query. This can sometimes be difficult for people to understand, due to how OLAP works.

    - some tools (like ProClarity) implement "drill to details", which is a handy way of seeing the details underlying a specific total.

    - Everyone needs to realize that text fields can't be aggregated. Consider using SCOPE statements to set higher-level values.
