Prineville, Oregon: When the temperature creeps above 90 degrees in this rural community, it’s the perfect time to see why Facebook decided to build its first data center here. That’s when the outside air cooling system — which collects the cool, dry Oregon air and pushes it through filters and misters to chill the thousands of servers that hold all those Facebook Likes and photos — has to work overtime.
In a rare visit to Facebook’s Prineville data center on Thursday, the temperature hit a high of 93 degrees outside. While the cows we passed on the 20-minute drive from the Redmond, Ore. airport searched for any semblance of shade, Building No. 1 of the data center was as noisy as an industrial factory, with air flowing through the cooling rooms and the humidifier room spraying purified water onto the incoming air at a rapid rate. It’s like peeking inside a brand-new Porsche while it’s being driven at full capacity.
Facebook’s data center here is one of the most energy efficient in the world. The social network invested $210 million in just the first phase of the data center, which GigaOM got a chance to check out during a two-hour tour. Building No. 1 is where Facebook first started designing its ultra-efficient data centers and gear, and where it wanted the first round of servers it open sourced under the Open Compute Project to live. Since Building No. 1 opened in the spring of 2011, Facebook has slightly tweaked its designs for Building No. 2 at the Prineville site, as well as for its data centers in North Carolina and Sweden. Building No. 2 will use a new type of cooling system that relies on fiberglass media instead of misters.
Facebook is in the middle of an infrastructure boom. It now has an estimated 180,000 servers for its 900 million-plus users, up from an estimated 30,000 in the winter of 2009 and 60,000 in the summer of 2010. While that’s tiny compared to Google’s estimated 1 million-plus servers, Facebook, like Google years before it, is now learning how to be an infrastructure company.
Follow our tour as we take you through the facility, along the air flow route, and down into the secret server room:
The solar panel project next to the data center (it’s small and doesn’t provide much power for the system).
The entrance to Facebook’s Prineville facility shows off the facility’s energy usage, energy efficiency (measured as PUE, or power usage effectiveness) and water usage.
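The PUE figure displayed at the entrance is a simple ratio: total power drawn by the facility divided by the power that actually reaches the IT equipment. A minimal sketch of the calculation, with purely illustrative numbers (not Facebook’s actual readings):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt goes to the servers; the overhead
    above 1.0 is cooling, power distribution losses, lighting, and so on.
    """
    return total_facility_kw / it_equipment_kw


# Hypothetical example: 10,700 kW drawn by the whole facility,
# of which 10,000 kW reaches the servers.
print(round(pue(10_700, 10_000), 2))  # 1.07
```

A typical enterprise data center of the era ran closer to a PUE of 2.0, which is why efficiency-focused designs like this one lean so heavily on outside-air cooling instead of traditional chillers.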
The outside air flows into the data center through a wall with openings, past a rubber flap, with a bug zapper in the corner.
Facebook’s Ken Patchett showing how the air flows in.
It really is outside air, folks, hence the bug zapper.
Then the air moves through filters.
Facebook’s Patchett shows off the air filters.
This is the room where the air gets humidified with sprayed purified water. In Building No. 2, Facebook is changing up the system.
Facebook’s Patchett shows off the water spraying system.
There are about 15 motors per server bank.
We’re standing above the server room; the cooled air is pushed down and the hot air moves up and out.
This is where the hot air moves up and out of the building.
Now we’re outside on the roof, where the air leaves.
This is the small solar field as seen from the roof.
The water has to be purified before it’s used to spray the air. This is a reverse osmosis machine from Siemens. In Building No. 2, Facebook won’t use this system.
Workers grab time to exercise between shifts.
Where the magic happens: the servers.
Ken describes how the air moves into the server room, and why they separated the Open Compute servers from some of the older store-bought servers.
The non-Open Compute servers.
The Open Compute servers.
Close up of the Open Compute servers.
The Open Compute servers run more efficiently, and the aisles with them are cooler.
More server shots.
Ken explains how the backup energy systems work with the Open Compute servers.
A top-down view of an Open Compute server.
Ken explains how engineers fix and swap out Open Compute parts.
Some of the servers that house sensitive financial information are kept behind locked gates.
Of course, the big Facebook Like button.
A napkin that Facebook execs say holds the original power-plan drawing for the building.
To review the photos, check out the gallery:
For more info on data centers, check out:
- The ultimate geek road trip: North Carolina’s mega data center cluster
- 10 reasons Apple, Google & Facebook chose North Carolina for their mega data centers
- The controversial world of clean power and data centers
- The story behind how Apple’s iCloud data center got built