Facebook on Thursday launched an initiative it calls the Open Compute Project, an effort to share the specifications and designs of the custom servers and the data center it built in Prineville, Ore. In other words, Facebook is open sourcing its hardware designs, much as the software industry has largely done with its code.
Jonathan Heiliger, Facebook's vice president of technical operations, said the Open Compute Project is a way of giving back. It's also a way to get vendors with greater scale to incorporate Facebook's designs and deliver cheaper systems that meet its needs.
The big question is what tech vendors, all pitching their own designs, will make of Facebook's effort. Stripping servers of vanity plastic means no branding. Facebook went completely barebones. That works for efficient computing, but it leaves vendors with little to differentiate on. The efficiency figures, however, lean in Facebook's favor.
While the fallout remains to be seen, the fact Facebook is detailing all of its specifications for servers and its data center—including the CAD drawings—is going to be disruptive.
Facebook's PUE (power usage effectiveness) rating, the ratio of total facility power to the power that actually reaches the computing equipment, came in at 1.07. The industry average is about 1.5, and Facebook's leased data centers run between 1.4 and 1.6.
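For context on what that number means: a PUE closer to 1.0 means less of the power drawn from the grid is lost to cooling, power conversion and other overhead before it reaches the servers. Here is a minimal sketch of the arithmetic; the kilowatt figures are illustrative assumptions, not Facebook's actual numbers.

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: at a PUE of 1.07, a facility drawing 1,070 kW
# delivers roughly 1,000 kW to the servers; at the industry-average 1.5,
# the same 1,000 kW of compute pulls 1,500 kW from the grid.
print(pue(1070, 1000))  # 1.07
print(pue(1500, 1000))  # 1.5
```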
Mark Zuckerberg, CEO of Facebook, said:
You can build servers and design them or get the products that the mass manufacturers put out. A lot of the stuff put out wasn’t in line with what we needed. We’re not the only ones that need the hardware we’re building out.
Among the key points:
- On the server designs, Facebook went lightweight and ditched screws and “vanity plastic.” The chassis is simple and uses 22 percent less material; the servers weigh 6 pounds less. Facebook also went with custom Intel and AMD motherboards and dropped expansion slots. Power supplies were simplified and paired with a backup, and power supply efficiency exceeds 94 percent. Racks come in triplets for easy deployment and swaps.
- Facebook aimed to cut the energy loss between the grid and the motherboard (see the sketch after this list). A 480-volt electrical distribution system runs through the data center, and 277 volts are delivered directly to each server.
- Facebook's data center has no air-conditioning system; it is cooled with outside air. Walls remove water droplets so that only cool air reaches the servers, and there is no ductwork in the data center.
- Facebook uses localized uninterruptible power supplies, each serving six racks of servers.
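On the grid-to-motherboard point, the intuition is that every conversion stage between the utility feed and the server loses a slice of power, and those losses multiply. The sketch below illustrates that arithmetic; the per-stage efficiencies are assumptions for illustration (the only figure taken from the announcement is the 94-percent-plus power supply), and the stage names are hypothetical, not Facebook's actual electrical design.

```python
def chain_efficiency(stages: dict) -> float:
    """Multiply per-stage efficiencies to estimate grid-to-motherboard efficiency."""
    eff = 1.0
    for stage_eff in stages.values():
        eff *= stage_eff
    return eff

# Hypothetical traditional chain: transformer -> centralized UPS -> PDU -> server PSU.
traditional = {"transformer": 0.98, "ups": 0.92, "pdu": 0.98, "psu": 0.90}

# Hypothetical Open Compute-style chain: 480 V distribution feeding 277 V
# straight to a >94%-efficient server power supply, with the localized UPS
# assumed to sit outside the normal power path.
open_compute = {"distribution": 0.99, "psu": 0.94}

print(f"traditional: {chain_efficiency(traditional):.1%} of grid power reaches the motherboard")
print(f"open compute: {chain_efficiency(open_compute):.1%} of grid power reaches the motherboard")
```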
The upshot is that many IT buyers will look at Facebook's designs and incorporate them into what they do. It's highly likely that technology vendors will have to respond.
A few thoughts:
- Facebook’s move will change the intellectual property dynamics in the hardware industry. How many IT buyers and vendors are already working to solve the same problems Facebook has?
- Smaller companies such as Zynga will simply adopt Facebook designs.
- IT buyers may start asking for Facebook’s data center approach. Dell is already offering systems based on Facebook’s designs.
- Existing hardware vendors may see more commodity pricing.
- Open standards are going to make it more difficult for hardware giants pitching proprietary stacks.