Forum for Science, Industry and Business

Data center design study: Cool consideration of a hot issue

06.04.2004

Over the next two years, researchers at Binghamton University and partner institutions will be helping to protect life as we know it. While the claim might sound extreme, keep in mind that they will be working to improve the design and energy efficiency of data centers.

Data centers. Thousands of them. All processing vital information, critically important to much that drives our daily lives--from world financial markets, government and military operations, business and industry, worldwide shipping and transportation, health and human services, entertainment--even organized athletics and religion.

Keeping data centers in business by optimizing their design, energy efficiency and information processing efficacy is the goal of a $437,270 two-year project backed by a $247,533 contract from the New York State Energy Research and Development Authority.

Peter R. Smith, NYSERDA’s president, said the data center project is one of several high-tech, but fundamentally important research projects sponsored by the public benefit corporation. “These data centers are high electric-demand nerve centers whose utility service is large. They require stable and secure electric power for machine operation and cooling. Considering New York’s prime financial center role, NYSERDA seeks to find ways to serve these centers efficiently and securely, and then replicate those designs at universities and other large computing power centers around New York.”

The project, “Optimizing Airflow Management Protocols in New York Data Centers,” will team researchers at Binghamton University, Georgia Tech, Lawrence Berkeley National Laboratory and IBM. The project will focus on surveying, modeling, and then testing design improvements to an existing Manhattan data center, all in hopes of devising new design strategies that can be employed the world over.

Just now approaching their adolescence, data centers have already become the heart and central nervous system of the information age. Without these data storage and processing strongholds, the Internet would be reduced, at least temporarily, to a grown-up, digitized version of the old Campbell’s soup-can network, where end users hunker down and share sketchy information available to few and meaningful to even fewer. Search engines would die with nary a sputter. Encyclopedia salesmen would be cruising middle-class neighborhoods by tomorrow. “Googling” someone, unless quickly redefined and with a possible jail term attached, would be out of the question.

Not to worry. Data centers, like most adolescents, probably aren’t going to leave us anytime soon. They will, instead, grow stronger. Hopefully, they will also grow smarter. Like it or not, they will reproduce. And like adolescents everywhere, data centers, at least for the foreseeable future, will continue to consume fuel at rates guaranteed to raise eyebrows and empty wallets.

Bahgat Sammakia, interim vice president for research at Binghamton University and director of the University’s renowned Integrated Electronics Engineering Center, has spent much of his 30-year career working to improve thermal management strategies in electronics packaging. That means devising ways to keep computers and other electronics from spontaneously combusting the moment they are turned on.

“Imagine the heat generated by a 100-watt light bulb,” Sammakia explained. “Now take that same amount of power and double it, so you have the equivalent of a 200-watt light bulb in an area the size of a computer chip. Now you have extremely high heat density. When you turn your machine on, the temperature shoots up to 200, 300 or even 400 degrees Celsius.”
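A rough back-of-the-envelope comparison makes Sammakia's point concrete. The surface areas below are illustrative assumptions (roughly 200 square centimeters for a bulb's glass envelope, 2 square centimeters for a chip), not figures from the project, but they show why the same power concentrated in a chip-sized area is such a different thermal problem:

```python
# Back-of-the-envelope heat-flux comparison (areas are assumed values
# for illustration only). The same power over a far smaller area means
# a far higher heat flux, which is what drives chip temperatures up.

def heat_flux(power_w: float, area_cm2: float) -> float:
    """Heat flux in watts per square centimeter."""
    return power_w / area_cm2

bulb = heat_flux(100.0, 200.0)  # ~100 W spread over ~200 cm^2 of glass
chip = heat_flux(200.0, 2.0)    # ~200 W packed into a ~2 cm^2 die

print(f"bulb: {bulb:.1f} W/cm^2")  # 0.5 W/cm^2
print(f"chip: {chip:.1f} W/cm^2")  # 100.0 W/cm^2
```

Under these assumptions the chip's heat flux is some 200 times that of the bulb, which is why, as Sammakia notes, an uncooled chip would climb to hundreds of degrees Celsius.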

Sammakia’s point is this: Without adequate thermal management, the electricity coursing through your computer would reduce it to a puddle of melted solder and burnt plastic before you could pull the power cord.

That’s the kind of thermal management challenge Sammakia and other electronics packaging researchers have become accustomed to dealing with at the device and packaging level. The issue becomes even more heated, however, when hundreds of pieces of electronic equipment, drawing thousands of kilowatts of power 24 hours a day, seven days a week, 52 weeks a year, are housed together in a data center, he said.

If the computers had their way, data center energy costs would be significantly higher. For all the heat they generate, computers actually thrive on cold. Most electronic computing devices run 30 to 40 percent faster at subfreezing temperatures, Sammakia said. Human interaction and monitoring of data center equipment, however, is a round-the-clock enterprise, so optimum temperatures in the range of 60 to 70 degrees Fahrenheit are maintained not because they best serve the needs of the machines, but for the practical purposes of human comfort and survival. While the computers might prefer an Arctic clime in data centers, achieving temperatures even as cool as a late spring day is a tall order, Sammakia said.

“We’re talking about rooms the size of basketball courts, where you have row after row of mainframes and servers, all dissipating heat, and the entire room is designed with the sole purpose of sustaining this equipment and maintaining it at the right temperature,” said Sammakia. “These data centers, and there are hundreds of them in New York City alone, consume massive amounts of power. By making computations more cost and energy efficient, by reducing total energy consumption, and by passing on the environmental benefits of those savings, any energy efficiency can make a very significant difference.”

Simply discovering the best location for cold-air delivery vents could easily mean million-dollar savings once incorporated as a standard strategy in data center designs, Sammakia said.

The research team will be striving to achieve whatever energy efficiencies it can. Sammakia thinks improvements on the order of 20 to 25 percent are “very doable.” To reach that goal Binghamton University researchers will build numerical, computer models of the Manhattan data center during the first year of the project. Their models will then be used to enhance the team’s ability to predict the efficacy of design changes proposed throughout the remainder of the project.

Researchers at Georgia Tech will confirm the accuracy of the modeling by building an actual room at scale and taking appropriate measurements. Finally, in year two, the team will send researchers to take measurements in the actual data center and will write a design guide to help improve energy efficiency in all data centers. IBM, an established leader in energy metrics, will mentor modeling and measurements, while Lawrence Berkeley National Laboratory, already a prominent name in data center research and design in California, will help to benchmark the Manhattan data center.
