Explaining how Windows stacks up against Linux in a high-demand environment sounds like a simple exercise. In truth, it's complicated for several reasons, but mainly because of a lack of reliable benchmark data.
Benchmarking performance has turned into something of a cat fight, and to understand why, you need a little historical perspective.
The benchmarking war really picked up about two years ago, when Oracle told the world that its 9i Application Server was superior to its competitors' but had no way of proving it. So Oracle engineers got hold of a copy of Sun Microsystems' Java Pet Store reference implementation and used it to generate some benchmark data. The problem was that Pet Store was never intended to be a benchmarking application.
At that point, Microsoft wanted to show the Unix community that Windows was the superior platform, so it did its own benchmarking. Naturally, this benchmark data showed Windows' built-in application server performing far better than Oracle's 9iAS. Oracle countered with a live benchmarking demonstration at the JavaOne conference that year, which, of course, made Windows look silly.
Needless to say, Microsoft wasn't happy, and it contracted an independent testing company called VeriTest to settle the dispute. The VeriTest results indicated that Windows is indeed the superior platform. You might think the story would end there, but it's actually just the beginning.
Why the VeriTest testing was invalid
A couple of arguments could be made for why the VeriTest testing was invalid. Keep in mind that Oracle's testing was based on Pet Store; that's the first problem with the comparison. The second problem was that the code running on the Windows server was not Java based. It hardly seems fair to compare the results of a Java-based pseudo-benchmark against those of a non-Java test.
Another issue in the benchmarking cat fight is that the various companies involved can't seem to agree on a single testing method. In fact, at times Microsoft has even been "banned" from the tests officially sanctioned by the Java/Unix community.
Of course, these issues happened a couple of years ago. So where do things stand today?
Today, the closest thing we have to a reliable benchmark comparison is a Web site maintained by the Transaction Processing Performance Council (TPC). The site focuses more on the actual number of transactions performed and the cost per transaction than on how those transactions were done.
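To make the cost-per-transaction idea concrete, here is a minimal sketch of how a price/performance figure of this kind is computed: total system cost divided by measured throughput. The function name and all the numbers below are hypothetical illustrations, not actual published TPC results.

```python
# Sketch of a price/performance calculation in the style of TPC reporting.
# All figures are hypothetical and for illustration only.

def price_performance(total_system_cost_usd: float, throughput_tpm: float) -> float:
    """Cost per unit of throughput: dollars per transaction-per-minute."""
    return total_system_cost_usd / throughput_tpm

# Two made-up systems: raw throughput alone doesn't decide the winner.
system_a = price_performance(500_000, 100_000)  # faster, but pricier
system_b = price_performance(200_000, 50_000)   # slower, but cheaper per transaction

print(system_a)  # 5.0 dollars per tpm
print(system_b)  # 4.0 dollars per tpm
```

This is why a cheaper, slower box can "win" on the TPC site even when it loses on raw transaction counts.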
Microsoft is still using VeriTest today to back its claims of Windows' superiority over other operating platforms. In fact, if you go to the VeriTest Web site, you can see a report indicating that Windows Server 2003 performs anywhere from 66% to 95% better than Red Hat Linux when configured as a file server.
Other tests done by HP, Robby Sherman
Another interesting thing to note is that today there are separate versions of Oracle for Windows and for Linux. Even so, information on the Oracle Web site hints that the company favors Linux. Similar opinions are expressed in a Hewlett-Packard forum on Linux versus Windows for Oracle databases. There is also a rather infamous test by Robby Sherman that concluded that, all else being equal, a Linux server running Oracle 9i is 38% faster than a Windows server.
The point of all this is that despite the heated debate, there is no clear winner. Both Windows and Linux can point to performance data that would seem to crown them king. The problem is that none of the tests seems truly reliable. It wouldn't even be a surprise if some of these benchmarks pit a highly optimized server running one operating system against an out-of-the-box configuration of the other.
So which operating system is better for high-performance computing? I personally lean toward Windows, but I have to admit that's due more to my Windows background than to any performance data. I will say, however, that both operating systems are quite capable of servicing high-demand environments. Both are scalable and support multiprocessor and clustered configurations. Actual performance depends just as much on the underlying hardware, how the application is coded, and how the server is configured as it does on the operating system.