Here’s a new twist on how to lower the cost of keeping your data center cool: use water misters.
Perhaps you’re familiar with water misters from your local supermarket, where they’re used to keep produce fresh (hopefully without the annoying thunder sound that one market near me uses to warn you the mist is coming). Or maybe you’ve seen them used around pools and patios in places like Las Vegas, to keep people from cooking in the desert sun.
Facebook is using them to help cool its data center in Prineville, Ore. As CNET reports:
Rather than a raised floor pumping air conditioned air toward servers, Facebook built an elaborate air flow system that takes advantage of the relatively cool air in Oregon. That outdoor air, with the help of sprayed water, does all the cooling.
A penthouse above the server racks takes in outdoor air and, after being filtered, is cooled by a bank of water misters. That cooled, humidified air is then fed onto the concrete floor of the data centers to actively cool the racks.
It sounds like radiant heat, but in reverse. Radiant heat, I can tell you from experience, is a wonderful way to heat your house. It consists of a series of tubes running underneath the floor or through the walls, with warm water running through them. And I emphasize “warm,” as the water doesn’t have to be all that hot to make a room quite comfortable. (One hint: don’t use copper tubing embedded in concrete, as they did in a house built in the 1950s that I used to own. Eventually, the tubing breaks, at which point you’ve got a big mess on your hands. Now they use plastic – much better.)
I can envision the same principle working quite well for cooling, and indeed Facebook is getting impressive results, judging by its power usage effectiveness (PUE) rating. PUE is a metric developed by The Green Grid consortium to measure how much of the power a data center consumes goes to IT equipment, as opposed to ancillary purposes like cooling. A PUE of 1.0 is considered perfect; it means all your power goes to IT equipment. Facebook says it’s achieving a PUE between 1.06 and 1.1, which is nothing short of remarkable. (NTT America, a subsidiary of the sponsor of this site, has made significant strides in reducing the PUE at its two largest data centers by installing intelligent energy systems from Vigilent, saving itself some $630,000 annually in the process.)
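For reference, the PUE arithmetic itself is simple: total facility power divided by IT equipment power. Here’s a quick sketch; the kilowatt figures are hypothetical, chosen only to land at the top of Facebook’s reported range:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A value of 1.0 would mean every watt goes to IT gear; the overhead
    (cooling, power distribution, lighting) is whatever sits above 1.0.
    """
    return total_facility_kw / it_equipment_kw

# Hypothetical example: 1,100 kW total draw, 1,000 kW of it powering servers.
print(round(pue(1100, 1000), 2))  # 1.1 -- the high end of Facebook's range
```

By contrast, a conventionally cooled data center with a PUE of 2.0 would be burning a full extra watt of overhead for every watt of computing.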
Facebook is sharing what it’s learning from its data center adventures through a group it founded called the Open Compute Project. At the Open Compute site you’ll find specs and updates not only on Facebook’s innovative cooling system but also on its high-efficiency electrical system, racks, battery cabinets and more. Check it out and see what you can apply to reduce your own data center energy costs. What’s not to “Like”?