When to use CPU affinity with Windows Server 2012 R2

There are several issues to consider before you decide to use CPU affinity in Windows Server 2012 R2.

Should we enable CPU affinity to improve performance with Windows Server 2012 R2?

In almost all cases, it is unnecessary to use CPU affinity masks in Windows Server 2012 R2 to confine a workload to specific logical processors (assuming each Intel processor core provides two hardware threads). There are several issues to consider.

First, CPU affinity can conflict with the non-uniform memory access (NUMA) architecture used in most modern servers. The premise of NUMA is that not all memory can be accessed at the same speed: memory close to a particular core or processor package (socket) can be reached faster than more distant memory. The Windows scheduler therefore attempts to run threads on processors closest to the memory the corresponding workload is using. It is almost impossible for an administrator to track this placement by hand, so a manual affinity mask will often push processing onto processors in other NUMA nodes, actually degrading workload performance.
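To see why a hand-built affinity mask can land on the wrong node, a minimal sketch such as the one below can list which logical processors belong to which NUMA node. It assumes the Win32 calls GetNumaHighestNodeNumber and GetNumaNodeProcessorMaskEx (available on Windows Server 2008 R2 and later) and is an illustration only, not a tuning recommendation.

/* Minimal sketch: enumerate NUMA nodes and print the processor
   group and logical-processor mask covered by each node, so an
   affinity mask can be checked against the node layout. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    ULONG highestNode = 0;
    USHORT node;

    if (!GetNumaHighestNodeNumber(&highestNode)) {
        fprintf(stderr, "GetNumaHighestNodeNumber failed: %lu\n", GetLastError());
        return 1;
    }

    for (node = 0; node <= (USHORT)highestNode; node++) {
        GROUP_AFFINITY affinity;

        /* Report the processor group and mask of this NUMA node. */
        if (GetNumaNodeProcessorMaskEx(node, &affinity)) {
            printf("NUMA node %u: processor group %u, mask 0x%llx\n",
                   node, affinity.Group,
                   (unsigned long long)affinity.Mask);
        }
    }
    return 0;
}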

CPU affinity is often more appropriate for symmetric multiprocessing (SMP) systems, in which all processors have uniform access to memory (in contrast to a NUMA architecture). In an SMP model, any thread can run equally well on any processor, an important prerequisite for parallel processing. Even so, the operating system already schedules threads automatically based on thread priority, so manual affinity carries less risk here but typically does not improve workload performance either.

When affinity rules are applied today, it is usually to test the performance of specific processors (or of cores within specific processors). IT administrators can read the current affinity for a process with the GetProcessAffinityMask function, or use the SetProcessAffinityMask function to restrict the affinity of the process's threads. As an alternative, administrators can call the SetThreadIdealProcessor function to suggest a preferred (ideal) processor for a thread. This function does not force affinity, and the scheduler can still choose other processors, but it prompts the scheduler to use the suggested processor whenever possible.
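As a minimal sketch of those calls, a process can inspect and then restrict its own affinity as shown below. The mask value 0x3 (logical processors 0 and 1) and the ideal-processor number 1 are illustrative choices only.

/* Minimal sketch of the affinity functions named above. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE process = GetCurrentProcess();
    DWORD_PTR processMask = 0, systemMask = 0;
    DWORD previousIdeal;

    /* Read the current process and system affinity masks. */
    if (GetProcessAffinityMask(process, &processMask, &systemMask)) {
        printf("Process mask: 0x%llx, system mask: 0x%llx\n",
               (unsigned long long)processMask,
               (unsigned long long)systemMask);
    }

    /* Hard-confine the process's threads to logical processors 0 and 1
       (mask 0x3 is an illustrative value only). */
    if (!SetProcessAffinityMask(process, 0x3)) {
        fprintf(stderr, "SetProcessAffinityMask failed: %lu\n", GetLastError());
    }

    /* Suggest, without forcing, logical processor 1 for the current thread.
       The call returns the previous ideal processor, or (DWORD)-1 on error. */
    previousIdeal = SetThreadIdealProcessor(GetCurrentThread(), 1);
    if (previousIdeal == (DWORD)-1) {
        fprintf(stderr, "SetThreadIdealProcessor failed: %lu\n", GetLastError());
    }
    return 0;
}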
