Physical Facebook Like button
Summary:

Follow along on our tour as we take you on a rare journey through Facebook’s first data center in Prineville, Oregon, which houses its Open Compute servers. We’ll bring you along the air flow route and down into the secret server room:

Prineville, Oregon: When the temperature creeps above 90 degrees in this rural community, it’s the perfect time to see why Facebook decided to build its first data center here. That’s when the outside air cooling system — which collects the cool, dry Oregon air and pushes it through filters and misters to chill the thousands of servers that hold all those Facebook Likes and photos — has to work overtime.
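
For a rough sense of what the misters buy, here’s a minimal sketch of the standard direct evaporative cooling model (the effectiveness and wet-bulb figures below are illustrative assumptions, not Facebook’s numbers):

    # Direct evaporative cooling: the supply air approaches the wet-bulb
    # temperature as water evaporates into the incoming air stream.
    def supply_air_temp_f(dry_bulb_f, wet_bulb_f, effectiveness=0.85):
        """Estimate the cooled supply-air temperature in Fahrenheit.

        effectiveness is how close the system gets to the wet-bulb
        limit; 0.85 is a typical figure for misting systems, used here
        purely for illustration.
        """
        return dry_bulb_f - effectiveness * (dry_bulb_f - wet_bulb_f)

    # On a 93-degree day, dry high-desert air might have a wet-bulb
    # around 60 degrees (an assumed value), so the misters could deliver:
    print(supply_air_temp_f(93, 60))  # ~65-degree supply air, no chillers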

Sasquatch watches over the lobby of Facebook’s data center in Oregon

When we made a rare visit to Facebook’s Prineville data center on Thursday, the outside temperature hit a high of 93 degrees. While the cows we passed on the 20-minute drive from the Redmond, Ore. airport searched for any semblance of shade, Building No. 1 of the data center was as noisy as an industrial factory, with air flowing through the cooling rooms and the humidifier room spraying purified water onto the incoming air at a rapid rate. It’s like peeking inside a brand-new Porsche while it’s being driven at full capacity.

Facebook’s data center here is one of the most energy efficient in the world. The social network invested $210 million for just the first phase of the data center, which GigaOM got a chance to check out during a two-hour tour. Building No. 1 is where Facebook first started designing its ultra-efficient data centers and gear, and where it wanted the first round of servers that it open sourced under the Open Compute Project to live. Since Building No. 1 opened in the spring of 2011, Facebook has slightly tweaked its designs for Building No. 2 at the Prineville site, as well as the designs for its data centers in North Carolina and Sweden. Building No. 2 will use a new type of cooling system that relies on fiberglass media instead of misters.

Facebook is in the middle of an infrastructure boom. It now has an estimated 180,000 servers for its 900 million-plus users, up from an estimated 30,000 in the winter of 2009 and 60,000 in the summer of 2010. While that’s tiny compared to Google’s estimated 1 million-plus servers, Facebook, like Google years before it, is now learning how to be an infrastructure company.
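
For a rough sense of scale, a quick back-of-the-envelope pass over those figures (all of them are the estimates quoted above, not official numbers):

    # Rough ratios derived from the estimated figures quoted above.
    users = 900_000_000
    servers = 180_000

    print(users // servers)   # ~5,000 users per server
    print(servers / 30_000)   # the fleet grew ~6x since winter 2009
    print(servers / 60_000)   # and ~3x since summer 2010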

Follow our tour as we take you through the facility, along the air flow route, and down into the secret server room:

The solar panel project next to the data center (it’s small and doesn’t provide much power for the system).

The entrance to Facebook’s Prineville facility shows off the facility’s energy usage, energy efficiency (called PUE) and water usage.
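
Since the lobby dashboard features PUE (power usage effectiveness), a quick aside: PUE is the ratio of total facility power to the power that actually reaches the IT gear, so 1.0 would mean zero cooling and distribution overhead. Here’s a minimal sketch, with made-up meter readings for illustration:

    # PUE = total facility power / IT equipment power.
    # 1.0 means every watt reaches the servers; older enterprise data
    # centers historically ran near 2.0, while leading facilities
    # report values close to 1.1.
    def pue(total_facility_kw, it_equipment_kw):
        return total_facility_kw / it_equipment_kw

    # Hypothetical readings: 10,700 kW at the utility meter,
    # 10,000 kW delivered to the racks.
    print(round(pue(10_700, 10_000), 2))  # 1.07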

The outside air flows into the data center through a wall with openings, rubber flaps, and a bug zapper in the corner.

Facebook’s Ken Patchett showing how the air flows in.

It really is outside air, folks, hence the bug zapper.

Then the air moves through filters.

Facebook’s Patchett shows off the air filters.

This is the room where the air gets humidified with sprayed purified water. In Building No. 2, Facebook is changing up the system.

Facebook’s Patchett shows off the water spraying system.

A closeup of the water spray system.

There are about 15 motors per server bank.

We’re standing above the server room; the cooled air is pushed down and the hot air moves up and out.

This is where the hot air moves up and out of the building.

Now we’re outside on the roof, where the air leaves.

This is the small solar field as seen from the roof.

The water has to be purified before it’s used to spray the air. This is a reverse osmosis machine from Siemens. In Building No. 2 Facebook won’t use this system.

Workers grab time to exercise between shifts.

Where the magic happens: the servers.

Ken describes how the air moves into the server room, and why they separated the Open Compute servers from some of the older store-bought servers.

The non-Open Compute servers.

The Open Compute servers.

Close up of the Open Compute servers.

The Open Compute servers run more efficiently, and the aisles holding them are cooler.

More server shots.

Ken explains how the backup energy systems work with the Open Compute servers.

A top-down view of an Open Compute server.

Ken explains how engineers fix and swap out Open Compute parts.

Some of the servers that house sensitive financial information are kept behind locked gates.

Of course, the big Facebook Like button.

A napkin that Facebook execs say holds the original drawing of the power plan for the building.

17 Comments

  1. Glenn Fleishman Saturday, August 18, 2012

    Hardly a rare tour. Facebook has been eager to have journalists through since the facility opened, and take and publish photos. I was there in July 2011.

    1. Robert Scoble Monday, August 20, 2012

      Glenn is right. I had the same tour months ago: http://scobleizer.com/2011/04/16/photo-tour-of-facebooks-new-datacenter/

  2. Guess Google or MS will buy this when Facebook dies.

  3. Incredible!

  4. nik cubrilovic Saturday, August 18, 2012

    A lot of the photos from inside the data center are poorly framed, underexposed, and out of focus.

  5. Doobie Brothers Saturday, August 18, 2012

    Wow! Fascinating! not.

  6. forexmarketonlinetrading Saturday, August 18, 2012

    It’s amazing how one website requires all this infrastructure.

  7. Amazing infrastructure and facility. Wish one could visit the site in person. I hit the ‘Like’ button. :)

  8. Would it have been too much to ask that someone take a phone built in the last few years or so to capture quality video? Still, it’s better than nothing.

    1. It’s a handheld Flip video camera.

  9. That’s pretty cool! HA! Coz of the cooling system….

  10. That’s not a “bug zapper”. It’s a UV disinfectant. http://en.wikipedia.org/wiki/Ultraviolet_germicidal_irradiation

    1. They specifically told me it was a bug zapper. The UV machines are in other places. If I misunderstood, I’ll update this.

  11. I have seen this data center cooling design a couple of times and I think it works great. It is good to see this is getting to be more common.
    The solar panels and computer screen “green-ness” is all for show but the rest is good.

  12. Cool dry air?

    I’ve gone camping in that part of Oregon and in the summer it can reach 100 degrees. It’s basically a hot dry prairie desert.

    I ain’t no zillionaire, but it seems like a strange place to put a building whose main requirement is heat exchange!!

  13. Well I enjoyed your article Katie. Thank you for posting it.

  14. Well, they need all those servers because they are No. 5 in the world, and imagine the storage needed for all those pics and videos.

    Google only has to store text data.

    This is why Facebook never made money in the first few years, and probably won’t for the next few.

    But I do like Facebook.
    Just that its ad click rates are too high for me.
