SSAS 2012 High CPU Usage When Processing Cube

  • Hello,

    I'm having a problem when I process a cube whose data is partitioned by month. Processing a single month works fine, but if I process a whole year, I'm guessing there is a problem with SSAS threading when multiple CPUs need to write to multiple partitions. I have 12 CPU cores. If 1 month is processed, only 1 CPU goes to 100%, but if I try to load a year or more of data, all the CPUs go to 100%.

    Is there a way to limit how many CPUs SSAS uses (and not through Task Manager, which is very quirky)?

    Is this high CPU usage with partitions a known issue in 2012?

    🙂

  • Yes, there is. Right-click the connection in Management Studio, go to Properties, and then go to the General tab.

    Of course, these are system settings that usually should not be changed unless you are sure of the effects of changing them.

    Please read about these settings in Books Online and reference the performance white paper from Microsoft - http://download.microsoft.com/download/8/5/e/85eea4fa-b3bb-4426-97d0-7f7151b2011c/SSAS2005PerfGuide.doc

    In the bottom left, click Show Advanced (All) Properties.

    There are a couple of settings to check, along with their defaults:

    OLAP\LazyProcessing\MaxCPUUsage - the default is 0.5, which means use half of the available processors.

    If your processing is not LazyProcessing, then look at:

    ThreadPool\Process\MaxThreads and MinThreads - they default to 0, which does not limit the threads. You could change MaxThreads to 4 and see what happens.
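    If you prefer to script the change rather than use the Properties dialog, the same property can be set with an XMLA Alter command - roughly what SSMS generates when you script the change from that dialog. A minimal sketch, where "YOURSERVER" is a placeholder for your instance ID:

    ```xml
    <Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
           xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object />
      <ObjectDefinition>
        <Server>
          <ID>YOURSERVER</ID>
          <Name>YOURSERVER</Name>
          <ServerProperties>
            <!-- 0 = let the server decide; a small value like 4 caps the processing pool -->
            <ServerProperty>
              <Name>ThreadPool\Process\MaxThreads</Name>
              <Value>4</Value>
            </ServerProperty>
          </ServerProperties>
        </Server>
      </ObjectDefinition>
    </Alter>
    ```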

    Thomas LeBlanc, MVP Data Platform Consultant

  • Thanks for the reply. We are currently working with a Microsoft tech support person and have tried a variety of things. Some had no effect; however, setting CoordinatorQueryMaxThreads = 2 did help CPU utilization stay around 25-30%, but queries by the users were then vastly slower. According to the tech person:

    "

    When the concurrency is set to 6 partitions with your original memory setting, SSAS alone can use up to 11*100% CPUs. Each partition processing job can create multiple threads. So 12 core may not be enough.

    "

    (As a side note, we have 32 GB of RAM.) RAM doesn't seem to be the problem (yet), since the CPUs max out within 15-30 seconds of starting the processing, and there is plenty of memory for at least 10 minutes based on PLE and other counters.
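    For anyone following along, the change we tried corresponds to an entry in msmdsrv.ini. A fragment of what that looks like (the many surrounding entries are omitted, the value 2 is just what we tried at support's suggestion, and back the file up before editing it):

    ```xml
    <!-- msmdsrv.ini (fragment; all other entries omitted) -->
    <ConfigurationSettings>
      <!-- the setting Microsoft support suggested we lower; we tried 2 -->
      <CoordinatorQueryMaxThreads>2</CoordinatorQueryMaxThreads>
    </ConfigurationSettings>
    ```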

    I will have to try the ThreadPool\Process\MaxThreads and MinThreads settings, since apparently we are not using Lazy Processing.

    The developer is out... so I'll get back in a couple of days.

  • We aborted the mission of processing partitions in parallel. We need to be able to query some cubes while processing others during business hours, and unfortunately limiting ThreadPool\Process\MaxThreads and CoordinatorQueryMaxThreads makes the experience very slow on the end users' side, even when a cube is not being processed at the same time.

    The white paper link you provided was very useful. Based on the table that shows the recommended CPUs per partition, we would need at least 20-40 CPUs, since we have around 20 partitions that are split out by date and year. That of course far exceeds our 12 CPUs.

    So instead, our solution was to change our code to process one partition at a time and force the package to handle the data and index builds separately using ProcessData and ProcessIndexes (a sketch of the XMLA is below).

    This makes our process take 110 minutes instead of 45-55 minutes, but at least it only takes up between 25% and 45% CPU.
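    Roughly, the XMLA our package now issues looks like the sketch below, once per partition. All of the object IDs (SalesDB, Sales, FactSales, FactSales_2012_01) are placeholders, not our real names:

    ```xml
    <!-- One partition at a time: load the data first, then build indexes.
         The package loops over the partitions, issuing one batch like this for each. -->
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Parallel MaxParallel="1">
        <Process>
          <Object>
            <DatabaseID>SalesDB</DatabaseID>
            <CubeID>Sales</CubeID>
            <MeasureGroupID>FactSales</MeasureGroupID>
            <PartitionID>FactSales_2012_01</PartitionID>
          </Object>
          <Type>ProcessData</Type>
        </Process>
        <Process>
          <Object>
            <DatabaseID>SalesDB</DatabaseID>
            <CubeID>Sales</CubeID>
            <MeasureGroupID>FactSales</MeasureGroupID>
            <PartitionID>FactSales_2012_01</PartitionID>
          </Object>
          <Type>ProcessIndexes</Type>
        </Process>
      </Parallel>
    </Batch>
    ```

    With MaxParallel="1" the two Process commands run one after the other instead of concurrently, which is what keeps the CPU usage down.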

    Thanks again.

  • I am glad you found something helpful.

    Thanks also for posting your solution because now other people can use it and Bing It! 😛

    Thomas

    Thomas LeBlanc, MVP Data Platform Consultant
