When memory is assigned to a virtual machine (VM), that memory is taken from the host's available RAM and allocated to the machine. Until now, Hyper-V memory has been a static resource: if you allocated 4 GB, the VM had 4 GB, and it could not use more than that.
In reality, many virtual machines don’t fully utilize their available RAM, just like they don’t fully utilize their available processors. But with Dynamic Memory, we can shuffle the deck and move some of that RAM around to go where it’s needed for better consolidation and efficiency.
The concept is not unlike that of virtual memory using a page file on your standard PC. All modern operating systems are able to swap data from physical memory to a page file on a hard disk without the program’s knowledge. This allows us to run many applications at once while utilizing RAM for what is running at the time, and the same can be said for a virtual host when running many virtual machines. It usually works well, but anyone who has dealt with a system that lacked enough memory knows that performance can suffer when too much dependence is placed on the page file.
This also translates to virtualization: systems that need fast access to memory can see performance drops if proper consideration isn't given to how Dynamic Memory is configured.
Setting the table for Dynamic Memory
Dynamic Memory brings automation to this kind of complex memory management, but it is different from VMware's memory overcommit feature. There are several manual settings you'll need to set up when you enable Dynamic Memory on a virtual machine. First, there's the Startup RAM setting, which is what the guest operating system gets when booting up. Although it might seem sensible to use the operating system's recommended RAM here, this setting should actually be the minimum amount of memory the operating system requires to boot, plus any RAM needed to start applications.
For example, Exchange Server 2010 requires a minimum of 4 GB of system RAM. Even though you might go with 8 GB to run a performance system, 4 GB is the minimum the operating system and application need, so this is your Startup RAM setting. Keep this setting to the minimum to get the operating system booted and your applications started.
The Maximum RAM setting is the most RAM Hyper-V will ever offer to your virtual machine. The default is 64 GB, giving the parent partition plenty of leeway in what to assign to the guest. This is usually fine, since Hyper-V determines how much to allocate based on use. Some applications, however, will gobble up every ounce of available RAM. In those cases, set Maximum RAM to the upper limit you are willing to allocate to that application and operating system, and make sure it coincides with any memory cap configured in the application itself.
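The two settings described so far can be captured in a small sketch. This is purely illustrative: the class and field names below are descriptive stand-ins, not Hyper-V API identifiers.

```python
from dataclasses import dataclass

@dataclass
class DynamicMemoryConfig:
    """Illustrative container for the Startup/Maximum RAM settings
    discussed here; names are descriptive, not Hyper-V identifiers."""
    startup_mb: int          # minimum to boot the OS and start its apps
    maximum_mb: int = 65536  # Hyper-V's default Maximum RAM is 64 GB

    def __post_init__(self):
        # A startup amount above the maximum makes no sense.
        if self.startup_mb > self.maximum_mb:
            raise ValueError("Startup RAM cannot exceed Maximum RAM")

# The Exchange example from above: 4 GB startup, capped at 8 GB.
cfg = DynamicMemoryConfig(startup_mb=4096, maximum_mb=8192)
print(cfg)
```

The validation in `__post_init__` mirrors the relationship the article implies: Startup RAM is a floor, Maximum RAM a ceiling.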
The Memory Buffer is not an amount in megabytes but a percentage of memory that Hyper-V will try to reserve as an extra pool above the memory committed at the time. For example, if Dynamic Memory has assigned the virtual machine 1.2 GB of RAM and the Memory Buffer is set to 20%, Hyper-V will try to reserve an additional 300 MB, for a total of 1.5 GB.
If memory is needed, this reserve is available without having to wait for memory to be allocated. If you have an application that sees large spikes in memory use, you may want to set this percentage higher than the default to provide quick access to available memory. Whether Hyper-V can actually reserve the buffer depends on the demand all of the virtual machines on the host place on physical memory, so the Memory Buffer is no guarantee on a busy Hyper-V server.
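The buffer arithmetic in the example above can be sketched in Python. The formula used here (the buffer works out to the stated percentage of the total assignment, i.e. total = committed / (1 − buffer%)) is inferred from the article's own numbers; treat it as an illustration, not Hyper-V's documented internals.

```python
def dynamic_memory_target(committed_mb, buffer_pct):
    """Return (buffer_mb, total_mb) for a committed amount and a
    Memory Buffer percentage.

    Inferred from the article's example: the buffer ends up being
    buffer_pct of the *total*, so total = committed / (1 - pct/100).
    Illustrative model only.
    """
    total_mb = committed_mb / (1 - buffer_pct / 100)
    buffer_mb = total_mb - committed_mb
    return buffer_mb, total_mb

# The article's example: 1.2 GB committed with a 20% buffer
# works out to roughly 300 MB of buffer and a 1.5 GB total.
buffer_mb, total_mb = dynamic_memory_target(1200, 20)
print(buffer_mb, total_mb)
```

Note that a naive reading (20% of the committed 1.2 GB) would give only 240 MB; the 300 MB figure only works out if the percentage applies to the total.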
The last setting to consider is Memory Priority, which helps determine which machine gets priority when available physical memory is constrained. The setting can be anywhere from 1 to 10,000. That's quite a range, but the point is simple: the higher the number, the stronger the machine's claim on memory when there is not enough to go around. Virtual machines with a lower priority may have memory removed to supply a higher-priority machine.
The default Memory Priority setting is 5,000 -- right in the middle. If you have systems with critical performance requirements, bump them up and leave other systems at a lower priority. If you plan to set a distinct priority for every machine, keep track of your choices and base them on hard numbers for requirements or pre-determined service-level agreements (SLAs).
When to use Dynamic Memory
Even though Dynamic Memory opens up the ability to host many more servers on a single host, there are times when you will want the old static setting. Any application that performs best with a pre-determined amount of RAM, then grabs that RAM and doesn't let go, might as well use a static setting.
If the application will always use 6 GB, for example, there is little reason to go through the trouble of letting that virtual machine participate in the Dynamic Memory algorithm. SQL Server, for instance, can be configured to use a specific amount of RAM, but its default is to take the maximum RAM available and keep it for itself.
As the major new feature of Windows Server 2008 R2 Service Pack 1, Dynamic Memory continues to show Microsoft's commitment to Hyper-V as a viable solution and competitor to VMware. Just remember: your virtual machines must be enlightened with the latest Integration Services to use the new settings, and guests running Windows 7 or Windows Server 2008 R2 need SP1 installed. Also, all hosts participating in a cluster must be running Service Pack 1 to take advantage of the Dynamic Memory features across the board.
Finally, be sure to plan memory use properly so that you don't starve your machines of RAM, which would leave your virtual host thrashing like a low-memory PC leaning on its page file instead of using Dynamic Memory as an intelligent agent for enhanced memory management.
You can follow SearchWindowsServer.com on Twitter @WindowsTT.
ABOUT THE AUTHOR: Eric Beehler has been working in the IT industry since the mid-1990s, and has been playing with computer technology well before that. He currently provides consulting and training through his co-ownership in Consortio Services, LLC.
This was first published in December 2010