Facebook (s fb) has managed to make its newest data center in North Carolina more efficient than the one it built in Prineville, Ore., despite the sweltering heat of the Carolina summer. By using open-air cooling and its Open Compute Project servers, Facebook has achieved a PUE of 1.07 (the Prineville facility has a PUE of 1.09).
PUE, or power usage effectiveness, is a metric that divides the total energy a data center consumes by the energy its IT equipment consumes; the closer the ratio is to 1.0, the more efficient the data center. So how on earth did Facebook manage to avoid using air conditioning in a data center located in the South, where temperatures can hit triple digits and the humidity is enough to make you feel like you're breathing underwater?
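To make the metric concrete, here's the arithmetic behind those scores (the load figures below are illustrative, not Facebook's actual numbers):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical example: at a PUE of 1.07, a facility drawing
# 10,700 kW overall spends only 700 kW on everything that isn't
# IT gear -- cooling, lighting and power-distribution losses.
print(pue(10_700, 10_000))  # 1.07
print(pue(10_900, 10_000))  # 1.09, Prineville's reported PUE
```

In other words, a 0.02 difference in PUE means Forest City burns roughly 2 percent less overhead energy per watt of computing than Prineville does.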
Dueling weather metrics and swamp coolers
Facebook’s blog post on the topic opens with dueling weather forecasts. The company weighed temperature and humidity projections generated with BinMaker software against data from the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) to figure out whether it would need air conditioning. The ASHRAE estimates said the social networking giant would need A/C in Forest City, N.C.; BinMaker said it wouldn’t. Undeterred by the prospect of turning on the chillers, Facebook installed them but also went ahead with plans for open-air cooling.
As an Open Compute blog post written by a Facebook engineer notes, “We ended up installing a direct expansion (DX) coil system in the facility, just in case it might be needed, but it was important to us to find a way to make the free cooling system work — the potential efficiency gains to be found in keeping those DX units switched off were just too great to ignore.” The social networking company then built its own servers based on the Open Compute standards it had pioneered and tweaked them to handle higher indoor temperatures and humidity.
For the engineers, the heat wasn’t the issue; the humidity was. Much like eBay (s ebay), which uses open-air cooling in Phoenix, Facebook benefits from the “dry heat” everyone praises: when the air is dry, the data center operations team can spray a fine mist to cool incoming air through evaporation before it enters the data hall. In high-heat, low-humidity conditions, Facebook is essentially running a swamp cooler. That’s the same method it uses in Prineville on warm days.
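The logic described above amounts to a three-way decision. Here's a minimal sketch; the temperature and humidity thresholds are made up for illustration and are not Facebook's actual setpoints:

```python
def cooling_mode(outdoor_temp_f, relative_humidity_pct):
    """Pick a cooling strategy for incoming outside air.

    Thresholds are hypothetical, for illustration only.
    """
    if outdoor_temp_f <= 80:
        # Cool enough outside: just pull in outdoor air.
        return "free cooling (outside air only)"
    if relative_humidity_pct <= 40:
        # Hot but dry: misting the intake air cools it evaporatively,
        # the "swamp cooler" approach used on warm days.
        return "evaporative (misting) cooling"
    # Hot and humid: evaporation won't help much, so fall back to the
    # direct-expansion (DX) coils installed as a backup.
    return "DX (mechanical) cooling"

print(cooling_mode(100, 30))  # evaporative (misting) cooling
```

The whole point of the Forest City design was to keep that last branch from ever executing.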
Last summer, when North Carolina had its second-hottest July on record and the temperature hit 100 degrees one day, Facebook lucked out because the humidity stayed low. This meant Facebook never had to turn on the A/C. It did, however, have to recirculate some of the hot, dry exhaust air from the data hall to lower the overall humidity on days that were very humid but not especially hot.
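That recirculation trick works because relative humidity falls as air warms, even when the actual moisture content stays the same. A quick sketch using the standard Magnus approximation for saturation vapor pressure shows the effect (the temperatures are illustrative, not measurements from Forest City):

```python
import math

def saturation_vapor_pressure_hpa(temp_c):
    """Magnus approximation for saturation vapor pressure over water (hPa)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def rh_after_warming(rh_pct, temp_c, warmed_temp_c):
    """Relative humidity after warming air at constant moisture content."""
    ratio = saturation_vapor_pressure_hpa(temp_c) / saturation_vapor_pressure_hpa(warmed_temp_c)
    return rh_pct * ratio

# Hypothetical example: muggy 25 C outside air at 90% RH, warmed to
# 35 C by mixing in hot exhaust from the server hall, drops to
# roughly half its relative humidity.
print(rh_after_warming(90, 25, 35))
```

Warming the intake air by ten degrees here pulls the relative humidity from 90 percent down to around 50 percent, which is why blending in server exhaust on muggy-but-mild days kept the A/C off.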
So all that hot air generated on Facebook by your friends and their political opinions can be put to good use.
Check out this series by GigaOM’s Katie Fehrenbacher on North Carolina’s mega data center clusters:
- The ultimate geek road trip: North Carolina’s mega data center cluster
- 10 reasons Apple, Facebook, Google chose North Carolina for their mega data centers
- The controversial world of clean power and data centers
- The story behind how Apple’s iCloud data center got built