The company has released specifications for its Open Compute Project -- Facebook's once-secret data center recipes for rack-mounted servers that weigh less and power systems that run more efficiently -- and it is also talking about its methods for cooling lots of computers without air conditioning.

Call it an open-source data center design.

Facebook thinks that by sharing information it can help make data centers better. That will be good for Facebook, but also good for new businesses that crunch a lot of data, said Facebook CEO Mark Zuckerberg. "We're trying to foster this ecosystem where developers can easily build startups, and we think that by sharing this we're going to make it more efficient for this ecosystem to grow," he said.

Data center needs are only going to get more intense for Facebook as it adds more real-time applications, Zuckerberg said. "So being able to design more effective servers, both from a power-efficiency perspective and a cost perspective, is a big part of us being able to build all the stuff we build," he said.

The company has been tweaking and tuning its server and data center specifications for about a year now: cutting out big uninterruptible power-supply systems, designing a building that doesn't need air conditioning, and working with server makers to build lighter and cooler systems that are easy to repair.

"These servers are 38 per cent more efficient than the servers we were buying previously," said Jonathan Heiliger, vice president of technical operations at Facebook. They also cost 24 per cent less than the industry standard to build out, he said.

Facebook has partnered with Advanced Micro Devices, Intel and Quanta on the Open Compute Project, and is also working with Dell, Hewlett-Packard, Rackspace, Skype and Zynga on new designs.

The bare-bones boxes may not be much to look at -- Facebook calls the design "vanity free" -- but they get the job done. Citing a standard measurement of data center efficiency, the Power Usage Effectiveness rating, Facebook says that Prineville scores a 1.07. That's far below the industry standard of around 1.5, and therefore much more efficient.
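PUE is simply the ratio of everything a facility draws from the grid to what actually reaches the IT equipment, so a score of 1.0 would mean zero overhead for cooling, power conversion and lighting. A minimal sketch of the arithmetic, using hypothetical power figures chosen only to reproduce the ratios quoted in the article:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power divided by
    the power delivered to IT equipment. Lower is better; 1.0 is ideal."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers (not Facebook's actual load figures):
# a facility drawing 1,070 kW to power 1,000 kW of servers
print(round(pue(1070, 1000), 2))  # 1.07, the Prineville-like figure
print(round(pue(1500, 1000), 2))  # 1.5, roughly the industry average
```

At a 1.07 PUE, only about 7 per cent of the facility's power is overhead, versus roughly 50 per cent at the industry-average 1.5.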

Facebook's servers are about 6 pounds lighter but thicker than a typical 1U (1.75 inch) rack-mounted system. At about 1.5U thick (2.6 inches), they can squeeze in taller heatsinks with more surface area and larger, more efficient fans. That, in turn, means that less air has to be pumped through the servers to cool them. And they can be opened and serviced without tools -- snaps and spring-loaded plungers hold everything in place.
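Rack heights are measured in standard rack units of 1.75 inches each, which is where the 1U and 1.5U figures above come from. A quick sketch of that conversion (the constant is the standard rack-unit height, not a Facebook-specific number):

```python
RACK_UNIT_INCHES = 1.75  # one standard rack unit (1U)

def chassis_height_inches(rack_units):
    """Convert a chassis height in rack units to inches."""
    return rack_units * RACK_UNIT_INCHES

print(chassis_height_inches(1.0))  # a typical 1U server: 1.75 in
print(chassis_height_inches(1.5))  # Facebook's taller chassis: 2.625 in, about 2.6
```

The extra three-quarters of an inch is what makes room for the taller heatsinks and larger fans the article describes.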

There is one luxury in the vanity-free design. Blue LED lights, which give Prineville a Facebook look, glow in the data center. They cost 5 cents more per light than the cheapest alternative.

Robert McMillan covers computer security and general technology breaking news for The IDG News Service. Follow Robert on Twitter at @bobmcmillan. Robert's e-mail address is robert_mcmillan@idg.com
