Photo tour of Facebook's new datacenter

Facebook's datacenter in Prineville, Oregon, USA from the outside

Today I was very fortunate to get a tour of Facebook’s new datacenter up in Prineville, Oregon (map). This datacenter is the most energy efficient in the world, and only a handful of press got a look. We’ll have a video up after editing it, but here’s a look at the datacenter in photos. I shot all of these photos on an unmodified iPhone 4 with Instagram, which got an update just today. For the panoramic photos I used Occipital’s 360 app.

Here’s the sight we saw on arriving. Keep in mind this building is HUGE, and there’s a sizable solar array out front (here’s a panoramic photo from inside that solar array), which doesn’t really power much of the datacenter but does power some of the buildings around the site. Photos don’t really do it justice, but think about three average Walmarts put end-to-end:

Facebook's new datacenter. Huge!

Facebook is so big that it has its own flag:

Facebook has its own flag. It hangs in front of the datacenter.

Walking in, yes, we are in the right place:

Sign in lobby of Facebook datacenter.

Just past the Facebook sign is a monitor in the lobby that shows you the state of the datacenter and how well the cooling systems are working:

Cooling chart at Facebook datacenter entrance.

Inside the security door are quilts made by the local community, their interpretation of what a social network looks like:

Quilts done by local community in entranceway to Facebook datacenter.

Once inside, Thomas Furlong, director of site operations at Facebook, brought us into a huge series of rooms that “process” the air. The first room filters the air; the second filters it further.

Here’s Thomas showing us one of the huge walls of filters (these filters are similar to the ones in my home heating system, except here Facebook has a wall of them).

Thomas Furlong, director of site operations at Facebook, shows us a huge wall of filters at its datacenter

Here’s a better shot of just how massive this filtering room is:

The air filter at Facebook datacenter. Big!

Then the air goes into a third room, one where the air is mixed to control humidity and temperature (if it’s cold outside, as it was today, they bring some heat up from inside the datacenter and mix it here) and on the other side, there’s a huge array of fans, each of which has a five horsepower motor (today the fans were moving at 1/3 speed, which makes them more efficient).
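
Why running at 1/3 speed is so much more efficient comes down to the standard fan affinity laws: airflow scales linearly with fan speed, but shaft power scales with the cube of speed. A rough sketch (the 5 hp rating comes from the tour; everything else is the generic cube-law rule of thumb, not Facebook’s numbers):

```python
# Fan affinity laws: airflow scales linearly with fan speed, but shaft
# power scales with the CUBE of speed, so slowing a fan saves a
# disproportionate amount of power.

def fan_power(rated_hp: float, speed_fraction: float) -> float:
    """Approximate shaft power (hp) at a fraction of rated speed."""
    return rated_hp * speed_fraction ** 3

full = fan_power(5.0, 1.0)       # 5.0 hp at full speed
third = fan_power(5.0, 1.0 / 3)  # about 0.19 hp at one-third speed
print(f"1/3 speed uses {third / full:.1%} of full-speed power")
# prints "1/3 speed uses 3.7% of full-speed power"
```

In other words, a bank of 5 hp fans loafing along at one-third speed draws only a few percent of its rated power, which is why the building is designed to move air slowly through enormous openings.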

Here you can see the back sides of one of the huge banks of filters:

Air filters at Facebook's datacenter.

Here Thomas stands in front of the fans:

Facebook fans!

Here’s a closeup look at one of the fans that forces air through the datacenter and through the filtering/processing rooms:

Each fan has 5hp motor.

Finally, the air moves through one last step before going downstairs into the datacenter: small jets spray a fine mist of water into the air. As the water evaporates, which it does very rapidly, it cools the air. One room I didn’t take photos in was filled with pumps and reverse-osmosis filters, which make the water extremely pure so that it cools more effectively. One final set of filters makes sure no water gets into the datacenter. Here’s a closer look at the array of water jets:
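
The physics of that evaporative step can be sketched with a simple energy balance: the latent heat that vaporizes the spray is drawn out of the air as sensible heat, so the air stream cools. The flow rates below are invented for illustration; only the textbook physical constants are real:

```python
# Back-of-the-envelope evaporative cooling. Assumed constants:
# latent heat of water ~2450 kJ/kg, specific heat of air
# ~1.006 kJ/(kg*K), air density ~1.2 kg/m^3.

LATENT_HEAT_KJ_PER_KG = 2450.0
CP_AIR_KJ_PER_KG_K = 1.006
AIR_DENSITY_KG_PER_M3 = 1.2

def evaporative_temp_drop(water_kg_s: float, airflow_m3_s: float) -> float:
    """Temperature drop (deg C) from fully evaporating water into an air stream."""
    air_mass_flow = airflow_m3_s * AIR_DENSITY_KG_PER_M3   # kg/s of air
    heat_absorbed = water_kg_s * LATENT_HEAT_KJ_PER_KG     # kJ/s pulled from the air
    return heat_absorbed / (air_mass_flow * CP_AIR_KJ_PER_KG_K)

# e.g. evaporating 1 kg/s of water into 500 m^3/s of air:
print(f"{evaporative_temp_drop(1.0, 500.0):.1f} deg C of cooling")
# prints "4.1 deg C of cooling"
```

That’s also why the water has to be so pure: dissolved minerals would be left behind as the droplets evaporate and end up on the filters and equipment downstream.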

Water cooling at Facebook data center.

Here you can see the scale of the room that sprays that water:

Filter room #2 at Facebook datacenter. Huge!

Here’s a closeup of one of the jets of cooling water:

Water-cooling jet at Facebook datacenter.

Finally we got to follow the air down into the datacenter where there was a huge floor with dozens of rows. Each row had rack after rack of servers.

Here Thomas stands in front of just one of those racks:

Tom Furlong gives us our first look at Open Compute servers at Facebook datacenter.

A look down the main corridor at Facebook's new datacenter

This 180-degree view gives you a look down the main corridor (the side you can see is only half the datacenter; these are the newer “Open Compute” servers. The other half, which they asked us not to photograph, contained their older server technology).

If you click here you can see a panoramic photo of one of these rows.
Panoramic Photo of one of the rows of servers inside Facebook's new datacenter

What does this all mean? Well, for one, it brings jobs to Prineville, which is a small town with about 10,000 residents in a very rural county (we drove about half an hour through mostly farmland just to get to Prineville). But listen to Prineville’s mayor to hear what it means for her community.

Which brought up the question: why Prineville? The execs who showed me around today said they chose the site after an exhaustive search for the perfect combination of low seismic risk, cool and mostly dry weather, access to power and Internet trunk lines (Prineville is an old railroad community, and fiber lines run under the railroads here), and a variety of other factors, including low tax rates and a business-friendly climate.

Anyway, it’s not often that you get to see inside a modern datacenter. You’ll be reading more about this tour, since there were other journalists there as well. I hope you enjoyed these early pictures.

By the way, why did Rackspace send me there? For those who don’t know, I’m a full-time employee of Rackspace, the world’s biggest web hosting company.

Because we’re already building a datacenter based on the “Open Compute” plans that Facebook made and open-sourced (the specs for both the datacenter and the machines are all open source now). More on Open Compute here. Plus, we’re datacenter geeks, so we love seeing how other companies do it and learning from what they’ve done.

Comments

    1. It was even cooler being there in real life. The thing is massive. Sure shows that Facebook is a real company and not just a fly-by-night startup! :-)

      1. Fly-by-night? No. Agreed. These guys are the new walmart of the internet. Anything you want all on one site. Games, friends, search, email and good Oregon wine inbound shortly.

      2. Yeah, because after 6 years in business, and subsequently worth billions of dollars, a company might be misconstrued as a fly-by-night company.

  1. Just to think this was built as a result of one man’s passion… and they say the economy is failing… I think that old business stopped innovating, new business is finding a way to make it work!

  2. Of course! They also have everything stored on other datacenters as well and have contingency plans in case the entire datacenter goes down.

    1. What do we make of this statement? “According to a press release issued by Greenpeace, Facebook uses “about 55% coal power while Google uses 34% and Yahoo uses just 12.7%.””

      1. I’d say that Facebook is normal. 55% is roughly how much electricity coal supplies in the United States. Those numbers are entirely dependent on where the datacenters are located, as well. Facebook just happens to place their datacenters in places where coal is the predominant source of power. Big deal.

        1. Not really a big deal, though Coal is a HORRIBLE source of electric power. Enormously destructive on terrain and water/air.

          Clean coal energy, what a total 100% contradiction in terms.

          1. or the others perhaps.. solar/wind/geo-thermal?
            You know, what’s commonly used for renewable energy resources… nuclear isn’t renewable.

          2. I’m in complete agreement; the problem is, coal is still so cheap nothing can compete. Nuclear is the next closest, but with the instability still ongoing in Japan, I think we should seriously fund ‘green’ alternatives like solar and wind. But with the coal and oil lobbies I don’t see it getting enough government help – and how can private industry compete without that? The hole is too big.

          3. There is no coal in Oregon. But Oregon has a plus: we have the Bonneville Dam and a wind farm, which produce all of the electricity for Oregon. Whatever electricity FB uses will come from what Oregon produces.

      2. We make of this statement that Greenpeace is a slightly crazy or irrational organization at times, since that’s just a reflection of what the power grid looks like where they are, I presume, mitigated *slightly* by the amount of solar panels the data centers have built.

  3. @Scobleizer How do you say this?
    “This datacenter is the most energy efficient in the world ” Any measures out there?

    1. Execs from Google and Microsoft were touring today and they didn’t argue with the claim. Also, Facebook made that claim over and over a week ago and no one refuted it. So, it stands.

    2. Google:
      “The trailing twelve-month (TTM), energy-weighted average PUE for all of these facilities is 1.16, exceeding the EPA’s 2011 goal for state-of-the-art data center efficiency.”
      http://www.google.com/corporate/datacenter/efficiency-measurements.html

      Facebook claims 1.15… then uses an 8-hour measurement to come up with 1.07…
      The Prineville facility is expected to have a Power Usage Effectiveness (PUE) rating of 1.15.
      http://www.datacenterknowledge.com/the-facebook-data-center-faq-page-3/

      The 1.07 (with its accompanying notes below):
      PUE calculated at full load over an 8 hour period during the commissioning stage in December 2010. We expect our PUE to fluctuate over time and will report it on a quarterly basis.
      http://opencompute.org/

      Thanks for the spin guys…it would be nice for them to wait to actually find an annual or even quarterly average, but who cares about that – they can claim 1.07 over 8 hours!!!
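
For anyone following the numbers in this thread, PUE is just a ratio: total facility power divided by the power that actually reaches the IT equipment, with 1.0 meaning zero overhead for cooling, power conversion, and lighting. A minimal sketch (the kilowatt figures are invented purely to reproduce the ratios quoted above):

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment
# power. 1.0 would mean every watt reaches the servers; everything above
# that is overhead (cooling, power conversion, lighting, etc.).

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness of a facility."""
    return total_facility_kw / it_equipment_kw

# Illustrative loads chosen to match the figures quoted in this thread:
print(pue(11_600, 10_000))  # 1.16 -- Google's reported fleet average
print(pue(11_500, 10_000))  # 1.15 -- Prineville's expected rating
print(pue(10_700, 10_000))  # 1.07 -- Prineville's 8-hour reading
```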

  4. One of the big takeaways from this is, no matter how much we talk about “the cloud”, every bit and byte is still on a physical machine(s) somewhere. Server hardware must surely be the next top hardware “thing”.

    1. Yeah, they are a little harder to do than a normal photo, but are so good in places like these. It’s really difficult to share the scale of a building like this. Hopefully a bit of that came through.

      1. is there a tool that maps a place in 3d using geolocation and then sort of photosynthes it together from all the images people take from the tour? maybe the true magic behind color could do that?

  5. is there a doom/3d shooter map of the datacenter? i’m sure it’d be the hell of a facebook game : hide’n’seek in our datacenters ;)

  6. how does the energy efficiency and overall design compare with that of the other big player like microsoft and google who are always said to have really good and efficient datacenter designs? (the ms azure deployment system e.g. seemed really clever 2 years ago)

  7. (sorry for mass-asking questions, they just keep coming to my head)…
    is there a crowdsourcing system for servers? i mean: how much energy, traffic and compute capability is just wasted all over the world by private computers idling in the net (just updating tweets or pseudo processing emails)? with a clever system a fraction of each could be used to create a highly redundant server network – kind of like bittorrent meets seti .. data security should be easy by cleverly distributing pieces (microtask-like in a way)…

    1. Each server has a Seagate hard drive in it. I’m not sure if there are some computers that have lots of SSD in them. I’ll try to find out. Look at OpenCompute, though, and they share their specs for the computers.

  8. This is really cool. It seems that there is a lot of opportunity for innovation in the world of data centers. This is great .. thanks for posting!

  10. This is a really great post, Robert, for several reasons: 1) level of detail; 2) ability to capture the spirit of Facebook, which is trying hard NOT to be just a corporation (quilts, site selection, solar) — I knew this from the HQ tour Randi gave me; 3) demonstration of what it takes to run an online site with 600m visitors demanding that it never be down. You captured all that. Thanks.

  11. That’s one hell of a lot of servers and engineering for one hell of a lot of pokes / sheep / werewolves / embarrassing photos / mafia crimelords / complicated relationships / procrastinators ;)

  12. The State of Oregon put in miles and miles of fiber in the sparsely populated area east of the Cascades many years ago. They caught hell because 99.9% was dark – no users, no prospects. I was one of those who shook my head at the “waste” of money. No more. With Google and Facebook data centers now in place the investment looks better. Planning a data center? Come to Oregon!

  13. That’s an interesting thought. I wonder if, during some sort of “slow” period (does fb have a slow period?), they could donate their cpu’s to genome research, cancer research, etc.

  14. WOW. Super cool pics and I didn’t really expect anything less from the social giant!! Thanks for sharing this Robert and I must admit I wish I was there to see it.

  15. Very nice write-up, Robert. I envy you the opportunity to walk the facility! Aside from the fantastic job they did on the facility design, their new servers are very impressive, too… 38% more efficient, while 24% less costly than anything else on the market. You can see some numbers on the facility and the new servers on this post: http://docsheldon.com/i'm-no-facebook-fan-boy-but…/

    The fact that they’ve put all their drawings and specifications for both the servers and the facility up for public access is to be applauded, too.

  16. I work for a major server vendor and we could never get away with building systems like this. Frankly, I don’t know how they do it either, but I guess it is because they don’t sell them. These systems would never pass any FCC classification. They have no grounding or electrical shielding, so they would emit massive amounts of electrical noise.

      1. By the way, both my Verizon and AT&T iPhones worked fine in the datacenter (which is funny because they don’t work in downtown San Francisco), so electrical noise sure isn’t a big deal to those. If it exists at all. And I doubt you’re right about not grounding them. How can you figure that out from the photos?

    1. facebook != the internet
      it’s not a direct feed to them… geez

      That’s like saying if you don’t agree with Walmart, don’t drive.

  17. I disagree. First of all there isn’t anything around the town except for farms. So the impact of a huge building is minimal. Second of all, it brought TONS of construction jobs and money to the town. Third of all you now know about Prineville, you would never have heard about this town otherwise. Fourth of all the hotels and restaurants in town say their business has started going up for the first time in years. Fifth, the jobs at the datacenter pay at least 150% of other jobs in town. Sixth, there will be a constant stream of visitors to this location, both Facebook employees who need to work on various hardware, but also other visitors, they will add money into the local economy. Seventh, I bet that over the next five years several other companies locate their datacenters here. Why? Facebook already did the hard work to prove the location is great. Which will bring more construction jobs, etc to the area.

  18. Prineville, OR, seems to me hosted a BMW Motorcycle Owners National Rally a few years ago. It really is in the middle of nowhere. Sounds like a good place to build a datacenter.

  20. Because we do not yet know the way, let us unite so that those who have already walked it and fallen may stand up again; let us not make the road harder for ourselves. Let us unite as a single force, for a life of greater dignity. Because we are seeking inclusion, treatment with rights, in the fight for a dignified life. I invite you to join this beautiful cause for the disabled. Website:
    http://www.facebook.com/pages/AVE-FENIX-DISCAPACITADOS-EN-BUSCA-DEL-RENACIMIENTO-SOCIAL/169970189718622
    http://www.blogger.com/home
    http://wanm25.wordpress.com/wp-admin/po
    http://www.linkedin.com/updates?&trk=msi

  22. Wow! They essentially made the entire building into the computer case with filters and fans. The motherboards are just laying in racks. It’s a giant super computer with tiny people walking around inside! Thanks for the photos Robert. Very cool!

  23. It is pretty amazing to see what goes into the back end infrastructure like Facebook. Many people don’t realize what it really takes to keep a site like that up and running.

  24. so much power and technology, only to waste time (mostly during working hours). Hmm :)

    Strange new world

  25. It was around 70 degrees, if I remember right. They said they are going to try to run at about 85 most of the time. It wasn’t very cold, although the air outside was fairly cold that day.

  26. Yeah, I agree with Dagg. Facebook uses a lot of power and technology to help people waste time. lol But actually, everything has a positive and a negative side. With Facebook, people have access to a great business tool and communication line.

  27. Man Facebook is just taking over. I don’t like that.

    Robert there’s something that’s bothered me about this blog for months. You don’t have it centered on the web page – it’s too far to the right. Is that intentional?

  28. This is SOOOOO exciting – what a brilliant post, Robert. You so rock. Thanks for bringing your travels and insider peeks to us. I loved your interview with Mayor Betty Roppe too – what a sweet lady she sounded like. Prineville must be ecstatic about this new addition to their economy. Plus, the magnitude of these servers and the infrastructure just shows how enormous Facebook is becoming!

  29. Why on earth would you put your datacenters pictures on the web. Where is the security?
    I have seen some of Facebook’s outsourced datacenters and I am really not that impressed. I work at a datacenter and we never allow pictures or strangers at our site.

  30. You are 100% correct! We get it from the Bonneville Dam and the wind farm. We even supply most of California with electricity too. A few years back California was having brownouts. The reason for that is that the state blew off paying the electric bill to Oregon, so we cut their power by half. It’s true. Research it. LOL

  31. Those photos are great! Both the outside and the inside look imposing, with the clouds and those massive fans and stuff. Very amazing! Thanks for sharing.

  32. They have lots of smaller datacenters inside bigger colo spaces, but they are building several like this one. I believe the next will be in North Carolina.