So rather than increasing clock speeds, chip makers looked for new ways of increasing the number of instructions that could be performed within each clock cycle. One way they achieved this was through multiple core processors.
Multi-core processors have multiple processor cores integrated onto a single piece of silicon. Walk into an electronics store, and nearly every PC you'll see will have a dual-core processor.
Meanwhile, Intel Corp. has already begun producing quad-core processors for use in high-end machines. Processors with eight cores are not far behind.
In fact, Intel recently announced that it had produced a processor containing 80 cores. The processor -- the size of your fingernail -- is capable of performing a trillion calculations per second! Previously, only a supercomputer could process a trillion floating point operations per second (known as a teraflop). Even more startling: Intel plans on mass-producing this processor by 2012. (There are several reasons why it is still a few years out. For one thing, the chip requires a special system board, and Intel has yet to resolve issues regarding how the chip will be able to interact with a system's memory.)
But forget five years from now. What do multi-core processors mean for today's Windows administrator?
Current operating systems are completely unprepared to deal with processors with large numbers of cores. Applications consist of one or more processes, each in turn consisting of one or more threads. A thread is an individual unit of execution, and it cannot be split between multiple processors or between processor cores.
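To make the process/thread model above concrete, here is a minimal sketch in Go. (Go's goroutines are multiplexed onto OS threads by the runtime, so they stand in for threads here; the `runUnits` helper and its messages are illustrative, not something from the article.)

```go
package main

import (
	"fmt"
	"sync"
)

// runUnits launches n concurrent units of execution inside this one
// process and waits for all of them to finish. Each unit may be
// scheduled onto a different core, but no single unit is ever split
// across cores.
func runUnits(n int) []string {
	var wg sync.WaitGroup
	results := make([]string, n)
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(id int) { // each goroutine is an independent unit of execution
			defer wg.Done()
			results[id] = fmt.Sprintf("unit %d finished", id)
		}(i)
	}
	wg.Wait() // the process moves on only after every unit completes
	return results
}

func main() {
	for _, r := range runUnits(3) {
		fmt.Println(r)
	}
}
```

The point of the sketch is simply that one process contains several schedulable units; which core each unit lands on is the operating system's decision, not the application's.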
Windows is multi-threaded and has been for many years. In fact, Windows runs very well on a dual-core processor. So why do I say that current operating systems are not ready to deal with large numbers of cores? Because in the current Windows architecture, the operating system maintains control over all processor cores.
A considerable amount of CPU time is consumed simply scheduling threads onto cores. This is why a dual-core system is not twice as fast as a comparable single-core system: roughly half of one core's capacity goes to assigning threads to specific cores. As such, adding an additional core to a processor provides only about a 50% increase in performance, rather than the 100% increase you might expect. (There are also hardware issues that prevent performance from being truly doubled.)
Another reason why processors with large numbers of cores are not ready for prime time is that applications are not designed to take advantage of all these cores. Any application can benefit from a dual-core processor: one core runs the Windows operating system, while the other core runs the application. (In reality, it's more complicated than that, but that's the basic idea.)
For an application to see any benefit from additional cores, one of two things must happen. Either
- the application must be multi-threaded, or
- other high-demand applications must be running simultaneously.
Running additional applications won't increase the application's performance, but it will allow otherwise unused cores to be employed.
Multi-threading allows an application to perform better because its threads can run on separate cores simultaneously. However, most applications are not multi-threaded, and those that are typically use only a few threads.
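Here is a hedged sketch of the kind of multi-threaded design the article is describing: a computation split into one worker per available core. It's written in Go; the `parallelSum` function and its chunking scheme are my own illustration, not code from the article.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelSum splits the slice into one chunk per available core and
// sums the chunks concurrently, so the work can spread across cores.
func parallelSum(nums []int) int {
	workers := runtime.NumCPU() // one worker per available core
	if workers > len(nums) {
		workers = 1
	}
	chunk := (len(nums) + workers - 1) / workers

	var wg sync.WaitGroup
	partial := make([]int, workers) // one slot per worker; no shared writes
	for w := 0; w < workers; w++ {
		start := w * chunk
		end := start + chunk
		if end > len(nums) {
			end = len(nums)
		}
		if start >= end {
			continue
		}
		wg.Add(1)
		go func(w, start, end int) {
			defer wg.Done()
			for _, n := range nums[start:end] {
				partial[w] += n
			}
		}(w, start, end)
	}
	wg.Wait()

	total := 0
	for _, p := range partial {
		total += p
	}
	return total
}

func main() {
	nums := make([]int, 1000)
	for i := range nums {
		nums[i] = i + 1
	}
	fmt.Println(parallelSum(nums)) // 1 + 2 + ... + 1000 = 500500
}
```

Note the design choice: each worker writes only to its own slot in `partial`, which sidesteps locking entirely. A single-threaded application, by contrast, would leave every core but one idle during this loop.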
One other problem: Under the current Windows architecture, as the number of running threads (and the number of processor cores) increases, so does system overhead.
My prediction is that Windows will probably perform adequately on systems with up to eight cores. I'm not even sure if Windows Vista would run on a system with more than eight cores. But I think it's only a matter of time before Microsoft will have to redesign Windows to make more efficient use of CPU cores.
Another prediction: Over the next couple of years, developers will find themselves having to learn how to write multi-threaded applications.
About the author:
Brien M. Posey, MCSE, is a Microsoft Most Valuable Professional for his work with Windows 2000 Server, Exchange Server and IIS. He has served as CIO for a nationwide chain of hospitals and was once in charge of IT security for Fort Knox. He writes regularly for SearchWinComputing.com and other TechTarget sites.
This was first published in March 2007