Improving the energy efficiency of data centers is one of the top enterprise priorities--and the subject of many a discussion among executives. While there are low-cost tweaks companies can carry out to reap energy savings, industry consultants emphasize that what is ultimately critical is a holistic plan and a proper assessment of the steps needed to achieve overall data center efficiency.

Achieving energy efficiency in data centers is now a critical corporate priority, said Alex Tay, data center services executive for the Asia-Pacific region at IBM. This could be due to various reasons including the need to manage soaring energy costs or the pressure to "green up" the facility and corporate image, he told ZDNet Asia in an e-mail interview.

He noted that IT managers need to think about physical plant costs such as heating, cooling, ventilating, and lighting for their facilities, as well as the costs of operating IT equipment, to really understand their data center energy consumption before working out steps to increase energy efficiency.
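One common way to express this split between plant overhead and IT load is the Power Usage Effectiveness (PUE) metric from The Green Grid: total facility energy divided by IT equipment energy, where 1.0 is the theoretical ideal. A minimal sketch, using hypothetical monthly figures rather than any numbers from the article:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (every watt goes to IT gear); older
    facilities commonly run around 2.0, meaning cooling, ventilation,
    lighting and power losses consume as much energy as the IT load itself.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical month: 500 MWh drawn by the facility, 250 MWh of it by IT gear.
print(round(pue(500_000, 250_000), 2))  # -> 2.0
```

Tracking a standard figure like this, rather than a homegrown measure, is what makes comparison against peers and industry baselines possible.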

Ed Ansett, Asia-Pacific and Japan director for critical facilities services at Hewlett-Packard, concurred that "it is not enough" to simply recognize the need for energy savings. "You need to quantify that need, assess its costs and benefits, prioritize the actions to achieve it, and if necessary, sell the whole package to management and investors," he said in an e-mail.

But roadblocks exist, Ansett cautioned. For instance, companies may use homegrown or ad hoc measurement methods, making it difficult to compare their energy efficiency with that of peers or industry standards. Organizations could also adopt a piecemeal approach whereby they mistakenly pour major resources into problems with little return on investment (ROI) while overlooking issues that could significantly alter the energy equation and deliver high ROI, he added.

Steve Wallage, managing consultant of London-based Broadgroup Consulting, also observed that most enterprises will find it "tempting" to just work with a list of technical recommendations to improve data center efficiency and reap savings.

Yet, the biggest savings are made by "taking a step back from the data center" and viewing issues from the broader business, strategic, IT and real estate perspectives, he argued. But the reality is that it is not unusual for data center considerations to be ignored in business and IT decisions within an organization, Wallage noted.

According to him, the simplest workaround is to perform a full data center audit. This can range from scrutinizing all the servers in a data center to find out what they are doing, to looking across applications and then deciding what kind of data center is needed, for instance, in terms of tiering, size, physical location and back-up.

All three consultants shared with ZDNet Asia five low-cost, immediate tweaks that businesses can do to enhance the energy efficiency of their facilities--without major capital investments or compromising system integrity.

However, Tay also noted there were caveats. Such tweaks, he explained, can provide energy savings of between 10 and 35 percent, but the extent of savings differs for individual data centers due to variables including the regularity of maintenance, the lifespan and type of air-conditioning equipment, and rack server load.

1. Increase temperature

Wallage said many companies are spending large amounts of money to cool data centers and typically maintain the temperature at 21 degrees Celsius. But if the temperature is raised, say to 28 degrees or even higher, less cooling is required and more outside air can be utilized instead--that is, free air cooling, he pointed out.

Ansett concurred: "It's fine to operate the inlet temperature to hardware up to 27 degrees."
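A back-of-the-envelope sketch of why this matters: a commonly cited rule of thumb is that chiller energy falls by a few percent for each degree the setpoint is raised. The 3-percent-per-degree figure below is an illustrative assumption, not a number from the article; real savings depend heavily on climate, chiller plant and airflow design.

```python
def estimated_cooling_savings(current_c: float, target_c: float,
                              savings_per_degree: float = 0.03) -> float:
    """Rough fractional cooling-energy savings from raising the setpoint.

    savings_per_degree is an illustrative assumption (3% per deg C here);
    actual figures vary widely with climate and plant design.
    """
    degrees_raised = max(0.0, target_c - current_c)
    # Compound the per-degree savings rather than simply adding them up.
    return 1 - (1 - savings_per_degree) ** degrees_raised

# Raising the setpoint from 21 C to 27 C under the 3%-per-degree assumption:
print(round(estimated_cooling_savings(21, 27), 3))  # -> 0.167
```

Under that assumption, a six-degree increase trims cooling energy by roughly a sixth--before counting any additional hours of free air cooling the warmer setpoint makes possible.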

2. Try blanking plates

Ansett said reducing bypass, recirculation and negative airflow can improve energy efficiency, and pointed to blanking panels as a solution. "At the most elementary level, this means proper sealing of cable entry points and the use of blanking plates within equipment racks," he noted.

Tay explained that when equipment racks are not filled 100 percent, the uncovered gaps between the servers within the racks create a bypass for cold air to escape to the hot aisle. As a result, the cold aisle is not as cold as it can be, and the air-conditioning systems have to work harder to keep it cool. Placing low-cost blanking panels in every unused space in the rack can "significantly" reduce the amount of cold and hot air mixing, thereby improving energy efficiency, he said.

3. Define hot and cold aisles

For data centers that have yet to do so, rearranging the racks into a hot aisle/cold aisle layout is a low-cost way to improve energy efficiency--and complements the use of blanking panels, Tay said. Without this layout, exhaust from the front row blows hot air into the inlets of the back row, meaning the back-row servers do not get enough cool air.

He noted, however, that this concept is not practised by every data center, since a downtime window is difficult to come by. So data centers end up making large investments to add supplementary cooling, or paying more to cool the room to ensure sufficiently cold air reaches every server.

Ansett observed that the complete physical separation of the hot and cold aisles is becoming a practised norm for new data centers, but it is also possible to retrofit such a containment system in existing data centers.

4. Do a computational fluid dynamics analysis

Many data centers have been filled up over the years, and racks requiring high power density may end up placed next to lower-density racks, yet the cooling systems work hard to maintain an even temperature throughout the facility, noted Wallage. This results in high inefficiency and hot spots in the data center. Carrying out a simple computational fluid dynamics (CFD) analysis can show airflow and "heat maps" across the data center, he said.

Such an analysis can also help in determining how to best make use of the computer room air conditioning system, added Ansett.

5. Cover floorboard cut-outs to stop air leakage

According to Tay, many data centers cut holes in the floorboards to allow cables to enter the racks. A lot of air can escape if these holes are not covered, reducing underfloor air pressure. This means there is insufficient air for the servers where it is really needed, he pointed out.

Investing in more air-conditioning units is unnecessary--and consumes even more energy, he noted. Instead, a low-cost solution is to cover all the cable cut-outs with brushes to prevent air leakage. "All of a sudden, your air conditioning [becomes] more powerful in delivering cold air," he said.

He added that airflow can also get blocked if the underfloor space becomes choked with cables. Many companies mistakenly think their air conditioning is not working well enough and buy more units, when all that is needed is to remove the cable blockage so that the stress on the air-conditioning system is reduced--that way, more cold air is available where needed, noted Tay.

Jamie Yap covers the compelling and sometimes convoluted cross-section of IT and homo sapiens, which really refers to technology careers, startups, Internet, social media, mobile tech, and privacy issues. She has interviewed suit-wearing C-level executives from major corporations as well as jeans-wearing entrepreneurs of startups.