NSF Funds Expedition into Software for Efficient Computing in the Age of Nanoscale Devices

San Diego, Aug. 19, 2010 -- As semiconductor manufacturers build ever smaller components, circuits and chips at the nano scale become less reliable and more expensive to produce. The variability in their behavior from device to device and over their lifetimes – due to manufacturing, aging-related wear-out, and varying operating environments – is largely ignored by today’s mainstream computer systems.

Sensor processing chips such as the smallest solar-powered sensor system, developed in the lab of Variability co-PI Dennis Sylvester at the University of Michigan, exemplify the trend toward millimeter-scale devices with nano-scale parts that perform more variably than traditional computer components. For more on the Phoenix 2 chip, watch the video at http://www.youtube.com/watch?v=Nku7vJaw3pE.

Now a visionary team of computer scientists and electrical engineers from six universities is proposing to deal with the downside of nanoscale computer components by re-thinking and enhancing the role that software can play in a new class of computing machines that are adaptive and highly energy efficient.

The National Science Foundation (NSF) today awarded a $10 million, five-year grant to researchers who will explore “Variability-Aware Software for Efficient Computing with Nanoscale Devices.” The grant is part of the funding agency’s Expeditions in Computing program, which rewards far-reaching agendas that “promise significant advances in the computing frontier and great benefit to society.”

“We envision a world where system components – led by proactive software – routinely monitor, predict and adapt to the variability in manufactured computing systems,” said Rajesh Gupta, director of the Variability Expedition and a professor of computer science and engineering at the University of California, San Diego’s Jacobs School of Engineering. “Changing the way software interacts with hardware offers the best hope for perpetuating the fundamental gains in computing performance at lower cost of the past 40 years.”

In the UCLA lab of Variability co-PI Puneet Gupta, JPEG compression of the same image done with variability-aware software (at right) produced a result of the same quality -- while using more than 40% less power.

Joining Gupta in this effort are the expedition's deputy director and electrical engineering professor Mani Srivastava from the University of California, Los Angeles' Henry Samueli School of Engineering and Applied Science, and a team of eleven other computer scientists and electrical engineers from the University of Michigan, Stanford University, the University of California, Irvine (UCI) and the University of Illinois at Urbana-Champaign (UIUC). The project is based in the UCSD division of the California Institute for Telecommunications and Information Technology (Calit2).

Spending 40% less energy on handheld platforms thanks to variability-aware software would be a real boon to medical first responders sharing situational awareness in the field (pictured here on the UC San Diego campus during a county-wide disaster drill at Calit2, home base for the Variability Expedition).

The research team seeks to develop computing systems that will be able to sense the nature and extent of variations in their hardware circuits, and expose these variations to compilers, operating systems, and applications to drive adaptations in the software stack.

“As the transistors on their chips get smaller, semiconductor makers are experiencing lower yields and more variability – in other words, more circuits have to be thrown away because they don’t meet the timing-, power- and lifetime-related specifications,” said Michigan’s Sylvester, an expert in designing computer circuits in nano-scale technologies. If left unaddressed, added UCLA’s Puneet Gupta, “this trend toward parts that scale in neither capability nor cost will cripple the computing and information technology industries. So we need to find a solution to the variability problem.”

How the Variability Expedition's research agenda will be divided among teams of computer scientists and electrical engineers in cross-campus collaborations.

Software experts on the team will develop models and abstractions that expose the hardware's variable specifications accurately and compactly, and will create mechanisms for the software to react to those variable specifications. Hardware researchers will focus on more efficient design and test methods to ensure that device designs exhibit well-behaved variability characteristics – ones that a well-configured software stack can easily observe and influence. "The resulting computer systems will work while using components that vary in performance or grow less reliable over time," explained deputy director Srivastava. "A fluid software-hardware interface will mitigate the variability of manufactured systems and make them robust, reliable and responsive to the changing operating conditions." Added professor Rakesh Kumar, who will lead the expedition efforts at UIUC: "Steering the effects of the variability will be particularly important."

Variability-aware computing systems would benefit the entire spectrum of embedded, mobile, desktop and server-class applications by dramatically reducing hardware design and test costs for computing systems, while enhancing their performance and energy efficiency. Many in-demand applications – from search engines to medical imaging – would also benefit, but the project’s initial focus will be on wireless sensing, software radio and mobile platforms of all kinds – with plans to transfer advances in these early areas to the marketplace.

Variability Expedition principal investigator Rajesh Gupta in front of the GreenLight Instrument -- a modular data center at UC San Diego equipped to gauge the energy efficiency of computing systems like the ones to be developed under the new Expeditions in Computing project.

To ensure that the project reflects real-world challenges in the computing industry, organizers have recruited a high-powered Technical Advisory Board that initially includes top industry executives from HP, ARM, IBM and Intel. The Board also includes two senior academic researchers with expertise in modeling and manufacturing of nano-scale devices and circuits: Robert Dutton and Andrew Kahng, who are professors at Stanford and UC San Diego, respectively.

“If this project is successful, and the breakthroughs are transferred to industry,” observed Stanford’s Subhasish Mitra, “we will have contributed to the continued expansion and reach of the semiconductor and computing industries.”

Transforming the relationship between hardware and software also presents valuable opportunities to integrate research and education, and this Expedition will build on established collaborations with educator-partners in formal and informal arenas to promote interdisciplinary teaching, training, learning and research. Assisting the researchers in this will be three other members of the team: William Herrera, the Educational Coordinator for UCLA Engineering, and consultants Eric Arseneau and Shirley Miranda at UC San Diego, who are experts in science and technology education and outreach at the middle and high school levels.

The Expeditions in Computing program, now in its third year, was established by NSF’s Directorate for Computer and Information Science and Engineering (CISE), to provide the CISE research and education community with the opportunity to pursue ambitious, fundamental research. The grants represent some of the largest single investments currently made by the directorate.