I attended the Denver SQL Server User Group meeting, wanting to catch up with some friends in from out of town. Peter Shire and Steve Wright of SQL Sentry were in town this week to present at the three local groups (Boulder, Colorado Springs, Denver), and this was a good chance to visit a little. I typically see these guys once or twice a year at events, and it’s always a good time.
The meeting started with the basic user group business: letting people know what the group is about, who’s in charge, thanking sponsors, etc. They called out SQLServerCentral and even asked if I’d like to show the site. I stood up and asked who didn’t know about the site, and no hands were raised among the 35-40 people that were there. That was pretty cool, since almost every time I go to an event, there seems to be a good percentage of people that have never heard of SQLServerCentral.
There were a few people looking for jobs, and one company looking to hire some developers. It seems that the SQL job market isn’t great in the Denver area.
The first presentation was an Introduction to Reporting Services by Steve Wake of Visa. Steve gave a 30-minute opening presentation showing the basics of SSRS, starting with some basic facts and information about the product.
The majority of the presentation was a demo. What was interesting was that Steve presented on a MacBook, using a VM (VMware Fusion) to show the SQL demo. When the VM didn’t wake up right away, a few nice Apple jokes resonated with the audience.
We got a nice explanation of the various parts of the Reporting Services Configuration Manager.
Overall the presentation was OK, with some good basic info, but the slowness of the VM caused a few issues. One reminder to people presenting: be sure you’re connected to the projector when you check your setup at the event.
The main presentation was Memory Management in SSAS by Steve Wright (SQL Sentry). Steve and Peter Shire traveled from Charlotte, NC to the Denver area to present 3 nights in a row at all 3 area user groups.
There are two types of memory in SSAS. The first is shrinkable memory; as you might guess, the other type is non-shrinkable. How do you measure memory? With a couple of counters from PerfMon.
Just as with SQL Server, you can configure memory for SSAS. Steve talked about setting the low memory limit as well as the total memory limit. These refer to the physical memory on the machine, not virtual memory. If the low memory limit value is between 0 and 100, it’s a percentage of memory. If it goes over 100, it’s an absolute number of bytes. That’s bytes, not MB or KB. Sounds crazy to me, but that’s a useful bit of trivia. The same rule applies to the total memory limit.
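That percent-or-bytes rule is easier to see in code. Here’s a minimal sketch of how SSAS interprets these settings, based purely on the rule Steve described (the function name is mine, not an actual SSAS API):

```python
GB = 1024 ** 3

def interpret_limit(value, physical_bytes):
    """Interpret an SSAS memory limit setting.

    Per the rule described in the talk: a value from 0 to 100 is a
    percentage of physical memory; anything above 100 is an absolute
    number of bytes (not KB or MB).
    """
    if 0 <= value <= 100:
        return physical_bytes * value / 100
    return value  # absolute bytes

# A low memory limit of 65 on a 32GB server is a percentage:
print(interpret_limit(65, 32 * GB) / GB)       # roughly 20.8 GB

# A value of 13000000 is over 100, so it's 13 million bytes --
# about 12.4 MB, not 13 GB, which is how misconfigurations happen.
print(interpret_limit(13_000_000, 32 * GB))
```

The surprise is that 13,000,000 looks like a reasonable “13 GB-ish” number but is actually a tiny limit once you know it’s in bytes.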
Once you hit the low memory limit, the cleaner threads start to move data out of memory non-aggressively and release that memory back to the OS. There is a “cleaner memory shrunk” counter you can use to monitor this. If the total limit is reached, it’s “crisis” mode. I like that description. The cleaner now starts to clean memory more aggressively, with a more dramatic performance impact. Steve talked about the “memory price”, which can be set in the ini file. The price is 0 below the low memory limit; above that, the cleaner starts to manage memory using a model that accounts for the price of memory, which rises as more memory is used.
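The shape of that price model looks something like the sketch below. The zero-below-the-low-limit behavior and the rising price are from the talk; the linear ramp is my own assumption for illustration, since the real SSAS formula is internal:

```python
def memory_price(used, low_limit, total_limit):
    """Illustrative sketch of the SSAS cleaner's "memory price".

    Price is 0 below the low memory limit; above it, the price rises
    as more memory is used, peaking at the total limit ("crisis" mode).
    The linear ramp here is an assumption, not the actual SSAS model.
    """
    if used <= low_limit:
        return 0.0
    span = total_limit - low_limit
    # Ramp from 0 at the low limit to 1.0 at (or above) the total limit.
    return min((used - low_limit) / span, 1.0)

# Below the low limit, memory is free and the cleaner leaves it alone:
print(memory_price(10, low_limit=20, total_limit=40))   # 0.0
# Halfway between the limits, eviction gets more aggressive:
print(memory_price(30, low_limit=20, total_limit=40))   # 0.5
# At the total limit, the price maxes out -- "crisis" mode:
print(memory_price(40, low_limit=20, total_limit=40))   # 1.0
```

The useful intuition: the cleaner isn’t on/off, it dials up eviction pressure smoothly between the two limits.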
One interesting point here is that SSAS isn’t looking at other services, or even Windows. Its memory management is internal only, so you need to set limits that account for other needs on the host. Objects are also files on the file system, so the data is loaded into the file system cache. Usage of the file cache isn’t counted against the SSAS limits.
Steve says that Task Manager is not the place to track this. Use the counters in Perfmon.
In a demo of PerfMon, he showed counters and pointed out the memory shrunk counter. If that isn’t zero, or close to it, then you have memory pressure. It would be great if SQL Server had a counter of some sort that could indicate memory pressure. Maybe there is one; if so, let me know.
A large part of the talk showed a memory issue with a client of SQL Sentry. During the beta test of the Performance Monitor tool, the client reported a graphical issue. It turned out to be the low and total settings, which the client thought were KB or MB, but were in bytes. So for a large SSAS server with 32GB of RAM, they had effectively set 2MB and 13MB as the values. Steve walked through what this looked like in the counters using the SQL Sentry tool. It was a good use of their product to show how the memory settings had affected the SSAS server.
Steve mentioned a change in SSAS 2008: pre-allocation of memory. What was interesting was that this doesn’t seem to matter with Windows Server 2008, but if you have Windows Server 2003, then pre-allocation makes a huge difference in how SSAS runs. Note that the pre-allocation doesn’t override memory limits. Be sure that you set this at a level that makes sense, and don’t set it to the same value as the total limit.
It was a good presentation, and a good meeting of the group. Thanks to Steve Wake, Steve Wright, and SQL Sentry (www.sqlsentry.net).
We meet the third Thursday of the month, so come on by if you’re in the area.
A few other notes from the meeting:
- Julie – SQL Saturday space
- Marc – blog
- SSRS Scripter – sqldbatips.com
- Marc – might survey people for the areas they work in (DB engine, SSRS, etc.) to get ideas for topic focus