Although I usually write about corporate networks, this article discusses something that happened to my home server. It may provide a valuable lesson for administrators of networks of any size.
I have a couple of dozen computers in my house. When I moved into my current home, I realized my home server room was hotter than was healthy for the machines. I quickly invested in a portable air conditioner (not a window unit), and set it up to vent the hot air to the outside.
But the first time I turned the unit on, I tripped the circuit breaker and blacked out my server room. (Fortunately, all the servers were on battery backup.) So I had an electrician install a dedicated circuit for the air conditioner, and everything was fine. Until recently.
One day my wife turned on the plasma TV in the bedroom, and it tripped the circuit breaker for my server room. Worse, one of my battery backups failed! I spent the rest of the night recovering a corrupt database.
Admittedly, in a corporate environment, no one's going to turn on a TV and knock your servers offline. But why did this power failure happen? Turning on the TV tripped a circuit breaker, yet the TV had never caused a problem before. I dug out the electrical blueprint for my house, and it showed that my computer room and bedroom were on the same circuit.
I had an electrician put my computer room onto a dedicated circuit so I wouldn't have to worry about the TV tripping a circuit again. But I still hadn't found the actual cause of the problem.
Then I remembered that one of my battery backups had failed when the power went off. I checked the battery backup to see what was plugged into it. It supported the same number of computers as always, but each of the computers had been replaced in the last year.
That's when it hit me: When it comes to power consumption, not all computers are created equal. Many computers shipping today have much larger power supplies than those that are just a few years old. For example, last year I retired a Pentium III with a 180W power supply. The computer I replaced it with was a dual-core, 64-bit Athlon with a 450W power supply. That may seem like an excessively large power supply, but it really isn't. It's not uncommon for new computers to have 500W power supplies.
Why do today's computers have such large power supplies? Although no single component is to blame, machines with faster CPUs produce more heat, which requires a case to have more fans. All those fan motors have to get their power from somewhere.
DVD drives and hard drives have always been power hogs, but video cards can also consume a lot of power. I recently installed an ATI Radeon X1900 graphics card in one of my computers. The card required a connection to the PC's power supply (in addition to the power it receives through the PCI Express slot). The manual for the card recommended the computer have at least a 450W power supply.
The lesson learned is that it's worth it to pay an electrician to look at how your building's circuits are configured. You should know if any other rooms share a circuit with your server room, or if the server room has a dedicated circuit, or even multiple dedicated circuits. While the electrician is on-site, find out how many amps each circuit is rated for.
Schedule some downtime
Once you know how much power is actually going to your server room, schedule some downtime and figure out how much of the available power you're actually using. If you have the luxury of taking the systems offline for a night, you can open the cases and see what size power supplies they have. (A sticker on each power supply tells how many watts it is rated for.)
If you don't have the option of downtime, perform a worst-case estimate. Assume each machine has a 500W power supply and then multiply 500 watts by the number of machines on each circuit.
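The worst-case estimate above is simple enough to sketch in a few lines of code. This is a minimal illustration, assuming the article's conservative figure of a 500W supply per machine; the function name and the 500W constant are my own choices, not anything from the article:

```python
# Worst-case circuit load estimate, assuming every machine
# has a 500 W power supply (a deliberately conservative figure).
ASSUMED_SUPPLY_WATTS = 500

def worst_case_watts(machines_on_circuit: int) -> int:
    """Return the maximum wattage a circuit might see if every
    machine drew its assumed full supply rating at once."""
    return machines_on_circuit * ASSUMED_SUPPLY_WATTS

# Example: 10 machines on one circuit
print(worst_case_watts(10))  # 5000
```

In practice machines rarely draw their supply's full rating, so this overestimates on purpose: if the worst case fits on the circuit, the real load certainly will.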
Whichever method you use, don't forget to account for other devices that might be on the same circuit: monitors, printers, routers, even the ceiling lights perhaps.
You can find out how much electricity these other devices are consuming by looking up each device's specs in its manual or on the Internet. You can find the power consumption for ceiling lights by checking the wattage of each light bulb. If a light has three 100W bulbs, it's using 300W when illuminated.
You won't be able to figure out your exact power consumption without a lot of effort, but you should be able to get an estimate. Once you know the approximate number of watts being used, convert the watts into amps (because the electrical circuits in your building are rated in terms of amps). By determining how many amps you're consuming, you'll be able to tell how close you are to the circuit's actual limit.
Convert watts to amps by dividing the watts by the number of volts being used on the circuit. For example, if a server room has 10 servers drawing 500W of power each, and they were the only things on the circuit, the total power consumption would be 5,000W.
In the U.S., computers get plugged into 120-volt circuits. So we'd divide our 5,000W by 120 volts for a total power consumption of 41.6 amps.
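The conversion is just I = P / V, and it is easy to get wrong by hand when the numbers change, so here is a small sketch of the arithmetic above. The function name and the 120-volt default are my own; the article's 41.6 figure is the same quotient truncated rather than rounded:

```python
def watts_to_amps(watts: float, volts: float = 120.0) -> float:
    """Convert total wattage on a circuit to current draw in amps.

    Uses I = P / V; U.S. computer circuits are typically 120 V.
    """
    return watts / volts

# The article's example: 10 servers drawing 500 W each
total_watts = 10 * 500          # 5000 W
amps = watts_to_amps(total_watts)
print(f"{amps:.1f} amps")       # about 41.7 amps (41.6 if truncated)
```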
Now let's assume the circuit has a 50-amp breaker. If we're only drawing 41.6 amps, then all is right with the world, right? Maybe not. Most books on electricity recommend you leave yourself a 20% safety margin. Now you need to take the number of amps the circuit is rated for and multiply it by 0.8. That gives you the number of sustained amps the circuit can safely support.
Let's go back to our 50-amp circuit with the 41.6-amp load. If you multiply 50 by 0.8, you get 40. That being the case, 41.6 amps exceeds the circuit's recommended maximum load.
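The 80% safety-margin check can be sketched the same way. This is an illustrative helper, not anything prescribed by code or standard beyond the rule of thumb the article cites; the function names and the 0.8 default are my own:

```python
def safe_amp_limit(breaker_amps: float, margin: float = 0.8) -> float:
    """Sustained amps a breaker can safely carry, leaving the
    commonly recommended 20% safety margin."""
    return breaker_amps * margin

def within_safe_limit(load_amps: float, breaker_amps: float) -> bool:
    """True if the measured/estimated load fits under the safe limit."""
    return load_amps <= safe_amp_limit(breaker_amps)

# The article's example: 41.6 amps on a 50-amp breaker
print(safe_amp_limit(50))            # 40.0
print(within_safe_limit(41.6, 50))   # False -- over the recommended maximum
```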
Will exceeding the safety limit by a couple of amps cause a problem? Probably not, but I'm not an electrician. In a situation like this, I would recommend having another circuit installed so that you can offload some of the computers to another circuit.
The next article in this series will show you how to figure out how much of a load your UPS can safely handle, and how long your battery will last under the current load.
About the author: Brien M. Posey, MCSE, is a Microsoft Most Valuable Professional for his work with Windows 2000 Server, Exchange Server and IIS. He has served as CIO for a nationwide chain of hospitals and was once in charge of IT security for Fort Knox. He writes regularly for SearchWinComputing.com and other TechTarget sites.
This was first published in August 2006