10 things to learn from Facebook's approach to data centers

Summary: Facebook's sprawling Forest City, North Carolina data center is designed to be a beacon of energy friendliness in the watt-sucking world of server farms. On a recent media tour of the facility, officials showed off green-computing practices that they say have saved the company more than $1 billion.

It's ok to be a follower when deciding where to build

FOREST CITY, NC — Facebook wasn't the first tech giant to plant a data center in western North Carolina. Google built in Lenoir and Apple in Maiden, drawn by the region's inexpensive land and power, and Facebook followed their lead to Forest City.

Fun fact: The data center site was formerly home to a toxic textile plant and later a boat manufacturer.

Air recycling is an exhaustive priority

Keven McCammon, the data center's site manager, discussed at length the bevy of air-recycling mechanisms at work in the facility, where 100 percent of the air used for heating and cooling is sourced from the outdoor supply. The white tank next to him and the louvered window just beyond it are the starting point for corridor after corridor of air filtration, cooling and cleansing technology. Even the unassuming floor below him contains a plenum that holds hot air for later reuse, a key component of how the facility combats the super-humid North Carolina summers. In the winter, hot air caught in the plenum heats the admin area; in the summer, it's used to remove humidity and dry the air before it reaches the server hall.

There is immense power in air pressurization

Air is moved throughout the data center entirely by pressurization. Although the photo doesn't show it, heavy suction is drawing air from one side to the other. The area is so pressurized that each corridor entrance has a double-door vestibule to stabilize the pressure. That air is eventually pushed down a shaft into the server hall, where it's pulled across the electronics to cool them. Then that air moves to the hot aisle and is pushed into the plenum, where it's either recycled or vented out the back.

A little water goes a long way

Nothing inside the data center's cooling mechanism is custom made, including this Munters mesh that uses water to regulate the temperature and refresh the air. The system creates a climate with an optimized combination of temperature and humidity — while also using 80 percent less energy than traditional cooling methods. McCammon said the system has also allowed them to eliminate the need for reverse osmosis and cut down on water usage, as water collected from the system is continually reused.

Redundancy, redundancy, redundancy

Facebook likes to think it broke the mold of typical data center design with its high levels of redundancy. Much like the cooling fans, where one kicks on if another fails, the same goes for the power. The facility has a reserve bus in the power system that allows operators to switch between primary and reserve power. In previous data centers, that switch meant falling back to generators.
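The reserve-bus idea can be illustrated with a minimal sketch. This is an assumed simplification, not Facebook's actual control logic; the function name and fallback order are hypothetical:

```python
# Hypothetical sketch: prefer the primary feed, fail over to the reserve
# bus, and treat generators only as a last resort.
def select_power_source(primary_ok: bool, reserve_ok: bool) -> str:
    """Pick a power source for the facility's load."""
    if primary_ok:
        return "primary"
    if reserve_ok:
        return "reserve bus"
    return "generator"

print(select_power_source(True, True))    # primary
print(select_power_source(False, True))   # reserve bus
print(select_power_source(False, False))  # generator
```

The point of the design is the middle branch: a live reserve bus lets the switch happen inside the power system itself, without spinning up generators.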

Community can accelerate the pace of innovation

This is the Open Compute database server, one of the latest Open Compute projects. The new design uses flash for storage and is a 40-percent efficiency win over previous database servers. It's all solid state with no moving parts, giving it faster response times and making it easier to store. Facebook officials said that open sourcing the things the company has built internally — and then building a community around those designs — has made them better, faster.

Efficient technology is vanity free

Those pizza box-looking cardboard slabs stuffed between the server trays are not there to warm lunch — they're there to force air down onto the servers to cool them. Traditional server towers typically come with a plastic bezel on the front, but the team found that leaving the bezels off eliminates the need to procure and install the plastic, and as a bonus the fans in the back don't have to work as hard (which saves even more energy). Facebook's message here: Looks are secondary to utility and efficiency.

Data analysis can influence design

This is the cold storage facility, which officials described as the redundancy for the redundancy. By analyzing usage data, Facebook realized that 82 percent of traffic hits just 8 percent of its photo base — so older photos that aren't in heavy rotation go here, where the servers are less active and draw less power. This makes server usage three times more efficient and the building five times more energy efficient, officials said.
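The access-pattern insight boils down to a tiering rule: photos that haven't been viewed recently go to low-power cold storage. The sketch below is purely illustrative — the function, the 90-day threshold, and the sample data are hypothetical, not Facebook's actual policy:

```python
# Illustrative hot/cold tiering rule. If ~82% of traffic hits ~8% of the
# photo base, a simple recency threshold can route the long tail of
# rarely viewed photos to cold storage.
def storage_tier(days_since_last_access: int, threshold_days: int = 90) -> str:
    """Route a photo to 'hot' (active servers) or 'cold' storage."""
    return "cold" if days_since_last_access > threshold_days else "hot"

# Days since each sample photo was last viewed (made-up data).
photos = [2, 45, 120, 400, 7, 1000]
tiers = [storage_tier(d) for d in photos]
print(tiers)  # ['hot', 'hot', 'cold', 'cold', 'hot', 'cold']
```

A real policy would derive the threshold from measured traffic, but the shape of the decision is this simple.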

Always plan for growth (especially when you're Facebook)

Facebook anticipates a growing need for cold storage, and it has space planned and ready for commissioning. The cold storage facility in Forest City is shaped like an E, and so far only one leg of the letter is in use — and not even at full capacity. But with 400 billion photos on the social media site, and 350 million more coming in each day, there'll be no shortage of demand for the digital attic space.

Keep employees playing hard

This one's the no-brainer. Like many Silicon Valley companies, Facebook is known for its office perks. But with an average of only 80 employees at the rural Forest City center, the staff optimized their surroundings to make room for fun — by way of frisbee golf. Good thing they have branded electric carts to make their way to each post across the 160-acre campus overlooking the South Mountains.



Natalie Gagliordi is a staff writer for CBS Interactive based in Louisville, Kentucky, covering business technology for ZDNet.