Facebook announced the Open Compute Project in 2011 as a way to openly share the designs for its data centers — "to spark a collaborative dialogue ... [and] collectively develop the most efficient computing infrastructure possible."
Starting in 2009, three Facebook employees dedicated themselves to custom-designing servers, server racks, power supplies, UPS units, and battery backup systems for the company's first data center in Prineville, Oregon.
By 2011, Facebook's data center in Prineville used 38% less energy to do the same work as the company's other data centers at the time, while costing 24% less.
Since then, Facebook has continued to improve on its designs, and last summer opened another data center in Lulea, Sweden.
With the help of Facebook and photographer Alan Brandt, we compiled some photos to show off what Facebook's data centers look like from the inside and outside. And these are some really gorgeous-looking facilities.
The interior of Facebook's data center in Forest City, North Carolina. The company launched this center in 2010.
Facebook began construction on its second data center in Lulea, Sweden, in March.
Here's a rendering of what the finished Lulea data center will look like once built using the company's rapid deployment data center (RDDC) approach.
Inside Lulea's first data center building, you can see Facebook's "vanity free" approach to design: unlike most data centers, there are no plastic bezels in front of the servers, which allows them to draw in more air.
In the Lulea data center, web server and storage designs use snaps and spring-loaded catches to hold components in place.
Lulea's rapid deployment data center (RDDC) design is all about being lean, which allows Facebook to deploy two data halls in the time it previously took to deploy one, thus reducing the cost of construction.
This is Facebook's data center in Prineville, Oregon, which is the first data center deployed using the company's Open Compute Project designs.
Facebook used 1,560 tons of steel to build its Prineville data center, roughly the weight of 900 mid-size cars.
Facebook's Prineville data center also uses a lot of wires and cables. In fact, there are 950 miles' worth of wires and cables in this data center alone, roughly the distance between Boston and Indianapolis.
The Prineville data center also has a ton of concrete: 14,254 cubic yards, to be exact. Imagine a sidewalk that's 24.3 miles long.
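The sidewalk comparison works out if you assume a typical sidewalk that is 6 feet wide and 6 inches thick; those dimensions are our assumption, since the article doesn't state them. A quick sanity check:

```python
# Sanity-check the sidewalk comparison.
# Assumed sidewalk dimensions (not given in the article): 6 ft wide, 6 in thick.
CONCRETE_CUBIC_YARDS = 14_254

width_ft = 6.0       # assumed width
thickness_ft = 0.5   # assumed thickness (6 inches)

cubic_feet = CONCRETE_CUBIC_YARDS * 27           # 27 cubic feet per cubic yard
length_ft = cubic_feet / (width_ft * thickness_ft)
length_miles = length_ft / 5_280                 # 5,280 feet per mile

print(f"{length_miles:.1f} miles")               # → 24.3 miles
```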
Thanks to Facebook's unique server design, technicians, like this one working in Prineville, don't have to spend time hunting for the right tools and unscrewing multiple parts every time they need to replace a failed component.
Facebook's rapid deployment data center approach, currently used only in Lulea, not in Prineville, the data center pictured here, is similar to assembling a car: the structural frame is built first, the components are attached on an assembly line in a factory, and the entire structure is then driven to the building site on a truck.
With the efficiency gains afforded by the unique server designs, Facebook has reduced the average repair time to swap parts by more than 50%.
Here you can see technicians delivering server racks to Lulea's building one, the company's first data center building.
As a result of these unique data centers, Facebook can handle 6 billion daily "Likes," as well as the 400 billion photos and 7.8 trillion messages that have been sent since Facebook was founded a decade ago.