2.29.2012

Researchers at IBM’s Austin
lab recently earned “Best Paper” at the 44th Annual IEEE/ACM International Symposium on Microarchitecture for work on “Active Management of Timing Guardband to Save Energy in POWER7” – a new way of managing and reducing
microprocessor power consumption.

Charles Lefurgy, a Master Inventor on the team, discusses how this technology works, and how it might affect chips in everything from servers to smart phones.

How much energy do microprocessors consume in a system –
from mobile devices to mainframes?

Charles Lefurgy: High-performance microprocessors
can consume over 200 watts, while the mobile-device microprocessors found in smart
phones may consume only about half a watt.

How is this consumption monitored?

CL: In IBM
POWER7 processor-based systems, a dedicated “EnergyScale” microcontroller measures the power
supplied to individual components, such as the microprocessors, disks, memory,
and fans. While smart phones and PCs may have dozens of sensors to monitor
their operating environments, a POWER7-based system has thousands of physical and
virtual sensors to monitor its operation.

For this work, we use a new sensor built
into the POWER7 chip called a “critical-path monitor.” It tells us when the
chip is experiencing overly conservative conditions (high voltage or low
temperature). This knowledge allows us to safely reduce the chip voltage and
reduce chip power consumption by about 20 percent. Another key sensor we added in
POWER7 monitors chip-level performance in order to guarantee that optimizing
for high energy-efficiency does not harm performance.

Talk a little bit about the history of microprocessor
energy usage, and energy saving breakthroughs.

CL: Technology changes associated with
ever smaller CMOS transistor sizes have had a
major impact on reducing the energy used per computation. From a systems
viewpoint, dynamic voltage and frequency scaling has been commercially
available for over a decade, and is used in everything from smart phones to
supercomputers. This allows the power to reduce by lowering the supply voltage
and clock frequency used by the microprocessor when the processing demand
falls.
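The benefit comes from the fact that dynamic CMOS power scales roughly as P ≈ C·V²·f, so cutting voltage and frequency together compounds. A minimal sketch of that relationship, with illustrative operating points that are not real POWER7 figures:

```python
def dynamic_power(c_eff, voltage, frequency):
    """Approximate dynamic CMOS power: P ~ C_eff * V^2 * f."""
    return c_eff * voltage**2 * frequency

# Illustrative operating points (not POWER7 values).
nominal = dynamic_power(c_eff=1.0, voltage=1.0, frequency=4.0e9)
scaled = dynamic_power(c_eff=1.0, voltage=0.9, frequency=3.2e9)

print(f"relative power at scaled point: {scaled / nominal:.2f}")
# Because voltage enters quadratically, a 10% voltage cut combined
# with a 20% frequency cut yields roughly a 35% cut in dynamic power.
```

This is why voltage, rather than frequency alone, is the lever with the biggest payoff.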

We have also seen many micro-architecture
improvements in how instructions are processed within the microprocessor. For
example, speculating on the outcome of a computation so that the next one can begin
before the previous one completes. This reduces the time that the microprocessor
sits idle (while still consuming power) waiting for data to become available for
processing.

How does your use of the “Critical Path Monitors” technology
further improve energy savings in microprocessors?

CL: Today, the voltage level used to
operate a microprocessor is overly conservative. The reason for this is that a
large amount of engineering margin in the form of additional voltage is
used to protect the microprocessor from worst-case conditions that may occur in
the real world. For example, data centers may overheat or there may be an
unexpected workload with high power consumption – both of which may cause the
chip’s voltage level to droop. If the voltage becomes too low, electric signals
cannot propagate through the chip, and it fails.

The new capability we’ve demonstrated
safely removes some of this voltage margin. Operating at a lower voltage level
reduces the chip’s power consumption, on top of the savings from traditional
techniques.

The key to our work is to use the
Critical Path Monitor’s (CPM) precision sensing of the time it takes for the
circuits within the chip to complete a computation. If that timing changes due
to a voltage droop, then the chip can react very quickly to protect itself.

The protection is enabled by a new element in POWER7 called a digital phase-locked loop (DPLL). It continuously
watches the CPM and reacts within nanoseconds, temporarily slowing the chip’s
clock frequency so it can operate at a reduced voltage and avoid a failure. Once we
had this safety mechanism in place, we modified the EnergyScale microcontroller
to look for opportunities to lower a chip’s voltage when it experienced
excessive voltage margin. Voltage is lowered until the DPLL just begins to
reduce the frequency in order to keep the chip operating correctly.

An additional benefit of using a
lower voltage is that the microprocessor runs cooler. This allows the server fans
to run more quietly, at reduced speed and power consumption.

Can it be applied to microprocessors across all types of
systems and devices?

CL: Every microprocessor applies
additional voltage margin to eliminate harmful voltage droops. I expect that
any system from mobile devices to supercomputers could use our method to save
energy.

Is this a redesign that will take time to realize or
something that can be implemented in today's systems and devices?

CL: Our method requires new sensors
and DPLL clocking to be built into the microprocessor. Today, only the
POWER7 processor has this capability.

How much energy savings could be realized – and would it
change system performance?

CL: Our results on an IBM Power 755 system
achieved a power reduction of about 20 percent for the microprocessor and 18
percent for the entire server.

One way this could be used in
practice is to increase the server density in data centers. Data centers often
have power limitations for the amount of equipment that can be installed in a
rack. Therefore, installing these energy-saving servers could allow a data
center to install 18 percent more servers within the same power budget, and
achieve 18 percent higher performance.

We observed no change in performance for industry-standard server
benchmarks. Our control algorithms
adjust voltage rapidly enough to maintain the desired level of
performance.

What would this energy savings mean for a mobile device
or other consumer products?

CL: For mobile devices, the battery
life could be extended at today's performance level. However, the
microprocessor is a small part of the energy consumption of a mobile device
compared to the display. Therefore, the energy savings would be less, percentage-wise,
compared to servers.

Another benefit is that the microprocessor requires less cooling due to running
at a lower voltage. You could imagine using this in game consoles, laptops, and
PCs where you don’t want loud fan noise.

When and in what industry might we see this technology
first utilized?

CL: Critical Path Monitors are actively
used today in IBM Power 775 supercomputers to detect a loss of voltage level. But
currently, rather than reducing the clock frequency, a different method to
protect the system is used – and is the subject of a pending conference
publication.

We expect that our on demand
reduction of microprocessor voltage margin, using CPMs and DPLLs, will appear
in future IBM Power Systems starting in 2013.

How many patents were granted for CPM?

CL: IBM
holds 11 granted patents related to CPM, including the circuitry used to
synthesize a critical path and the circuitry to finely control the clock
frequency output by the DPLL. Another seven patent applications have been filed,
covering methods such as selecting the chip voltage level using the
CPM and DPLL.

The co-authors on the research paper are IBMers Alan Drake, Michael
Floyd, Malcolm Allen-Ware, Bishop Brock, Jose Tierno, and John Carter.

2.28.2012

Quantum computing has been a Holy Grail for researchers ever since Nobel
Prize-winning physicist Richard Feynman, in 1981, challenged the
scientific community to build computers based on quantum mechanics.

For
decades, the pursuit remained firmly in the theoretical realm. But now
IBM scientists believe they're on the cusp of building systems that will
take computing to a whole new level.

In August 2010, Thomas Weigold accepted a new challenge as Technical Assistant (TA) to Zurich Lab director Matthias Kaiserswerth.

Thomas was previously a member of the BlueZ Business Computing team and was keen to get broader exposure to areas and people across IBM.

As his term of office draws to an end, Thomas took a few moments to share some of the insights and experience he has gained in this special role.

Thomas, what motivated you to serve a term as Technical Assistant?

Thomas Weigold. Actually, I didn’t apply for the post, I was approached. I had been in the BlueZ group for more than 10 years and was ready for a new opportunity. It’s clear that this position grooms you for many different potential roles within the company, and I can really say that it broadens your horizons greatly.

As a TA you’re much more directly involved in strategy. You become familiar with all the major processes such as the Accomplishment cycle, budget planning, head count, awards, etc. You’re also involved in developing the technical strategy: the Big Bets, the Grand Challenges, the GTO.

So that’s the most interesting part of the job: first contributing to creating the strategy, then helping to get it implemented. As a researcher you don't usually see the big picture.

What is the biggest misconception about this role?

TW. (Laughs) Some people seem to think it entails sitting in Matthias’ office all day, getting calls from him at all hours of the day and night, traveling with him and, oh, I don’t know, carrying his briefcase or something. That’s not at all what it’s like to be a TA. I have two small children at home and wouldn’t have been able to serve in this role if it were like that.

Of course, as a member of Matthias’ team, I attend many of his meetings, but certainly not all of them. I work with our team to organize all the necessary events, particularly the technical aspects.

For example, the TA is typically responsible for creating a technical agenda for a given event because this requires an understanding of the research and technology projects we have here at this Lab.

Then our very competent assistants actually do most of the administrative work of contacting the participants and organizing the event.

What did you find most rewarding about being a TA?

TW. Being relevant to the company. You’re really at the source of information and strategy, and therefore can make a difference. The networking opportunities and visibility are also a big plus. Not only related to myself personally, but for the Zurich Lab as a whole. It’s important that we in Zurich remain visible within Research and are a part of the processes.

What were some of the highlights?

TW. Highlights in the life of a TA are typically being involved in all the important events hosted by Matthias.

Two major examples are of course the Nanotechnology Center opening with John Kelly and many other VPs visiting the Lab, and the TEC Technologieforum 2011, which attracted 37 CTOs and CIOs of important customers and generated a significant sales pipeline.

Another highlight was the IMT Alps Technical Interconnect event.

Of course, when you organize events, things don’t always go entirely smoothly. I’ll never forget the time our top speaker at an event, Paul Martynenko, was trapped in a plane at Heathrow for several hours due to a snow storm, and I had to get him to St. Gallen as quickly as possible. He had left home in London early that morning and didn’t arrive in St. Gallen until after 6 pm – just after the event had ended! In the end, he joined the event dinner and all turned out well.

Do you have any advice for your successor?

TW. Well, there’s a steep learning curve before you know what you’re doing! My predecessor Andreas Schade helped me a lot, especially at the beginning, and of course I’ll be available to help my successor get started.

Primarily the job is to help Matthias with all his tasks, and he of course has a number of roles, not only director of this Lab. He has roles in IBM Research worldwide, within the IBM corporation outside of Research, as well as some external roles, such as in the Swiss Academy of Engineering Sciences.

I always described my role as a long-term shadowing. There’s so much to watch and learn.

So I would recommend it as a good career move to anyone interested in doing something different because it gives you insight into so many areas, not only within Research but throughout all of IBM. It’s certainly an opportunity!

What is your next step? Are you going back to research or staying in a strategy-relevant role?

TW. I'm now becoming a manager in the Storage Technologies department, which is something that I had never considered before. Clearly, this opportunity would never have arisen had I not been the TA.

As I said, being a TA means that you’re involved in everything right at the source. So you get the first crack at interesting opportunities that may come along.

2.24.2012

Editor's note: this post is by Jim Spohrer, director, IBM University Programs Worldwide.

Working with academia has been a longtime passion of mine, and for the last three years, my full-time job at IBM. In this role, I am often asked to offer advice, from an industry vantage point, on what needs to change in order to improve university and college education.

After much thought and discussion with academics and industry practitioners, I have distilled my advice to four points:

1. Help students be transdisciplinary. Knowledge workers today need a combination of skills that span technology, business, and social sciences. This requires those three distinct parts of a university to work together.

Transdisciplinarity is not just working with someone who is expert in another area (interdisciplinary). In a paper I wrote with the University of Phoenix and the Institute for the Future, we defined this for students as “being equipped to think through different disciplinary approaches, themselves.” You can read the paper here.

2. Students should work on real-world challenges. Design capstone and other team-oriented projects that require students of engineering, business, social sciences, humanities, and other subjects to work together on something that solves a real problem.

For example, we have held social media project competitions with the Hult International Business School that have led to important real-world experiences for the MBA students – and valuable insights on the students’ impressions and social media experiences on the IBM sites and blogs.

The students gave recommendations, such as developing clearer linkage and integration between all social media channels in a more holistic approach to social media, and implementing promotions like a social media-based IBM teen entrepreneur competition.

They also came up with fascinating ways to integrate IBM’s social media presence, such as using gamification and guerilla marketing techniques; creating an external IBM product support forum through Facebook; and encouraging employees to evangelize the brand messages.

And we also started the SmartPitch Challenge at the City University of New York (now in its tenth year!). The project challenges students to start their own businesses. Winning student teams are mentored by IBMers and public sector representatives. This year’s SmartPitch is already underway, but you can read about past finalists, such as CashIn and Cosiety, here.

3. Find better ways to encourage faculty to reach across these disciplines. Today’s rewards (tenure, grants, and increased salaries) are optimized in the other direction. But institutions with discipline-oriented departments that seek rigor and ensure depth, combined with research centers that tackle real-world challenges, can provide ample opportunities to motivate faculty.

4. Provide faculty and students more opportunities to connect locally and globally. Every student needs the experience of partnering with local businesses, government, and nonprofits to improve regional innovation ecosystems and quality of life. And they also need the experience of a semester abroad to be more-informed global citizens.

The increased flexibility in university curricula is giving students more opportunities to take international courses and to enhance their personal and professional development by exploring new cultures. And now IBM has a recruiting initiative called International Student that targets students who are studying abroad and identifies opportunities for them in their home countries.

At IBM, our vision is that individuals and institutions are on a journey of increasing capabilities – and creating college graduates who are transdisciplinary problem solvers, and informed global citizens. Let’s reward faculty and students for depth and breadth of knowledge.

2.22.2012

Yaniv Corem joined IBM Research – Haifa in June 2010 after completing his undergraduate work at the Technion – Israel Institute of Technology, and earning his master’s degree in architecture and computer science from MIT. Aside from his enthusiasm for rock climbing and bouldering, Yaniv is passionate about projects that use the "wisdom of the crowd" to solve difficult problems, complete tasks, gather data, and more.

What is gamification?

YC: Gamification is the process of using game thinking and game mechanics in non-game applications to increase engagement. Game thinking can be used to make almost anything fun and encourage people to get involved.

Does competition really help people learn?

YC: Human beings are competitive by nature. Games bring out that sense of competition within a safe and fun environment, where learning takes place naturally. It's not just competition that does the trick, but an entire set of attributes that make games such powerful tools for learning. Gamification creates a safe environment in which to experiment without suffering the consequences. It also brings in the aspects of new experiences, cooperation with other players, and just having fun.

Competition can be an extrinsic motivator, for example, for a student competing with other students for the best grade on a test. But competition can also be intrinsic, when people push themselves to achieve a certain goal. For example, a toddler learning to stack objects will try the same thing over and over again, while grappling with complex concepts like gravity and balance.

How is IBM using gamification to help people learn and share information?

YC: One great example is in the area of product adoption. New users of Lotus Connections, for example, can find such a feature-rich environment daunting. Bunchball, a leader in gamification, developed a solution for IBM called Level Up to help users adopt Connections. It takes complex learning processes and breaks them up into smaller chunks called levels. At each level, a user/player is asked to perform specific tasks that help teach how to use the product. In return, the users are awarded points, badges, or titles.

Gamification could also be used to keep communities active by rewarding members for their contributions. An interesting byproduct of gamifying a community is the social analytics, such as identifying the major contributors, the most helpful contributions, the interactions among community members, and more.

Which industries can benefit from games?

YC: Almost every industry can benefit from games and gamification. In a recent report, the analyst firm Gartner stated that by 2015, more than 50 percent of organizations that manage innovation processes will gamify those processes. The report also notes that by 2014, a gamified service for consumer goods marketing and customer retention will become as important as Facebook, eBay, or Amazon, and more than 70 percent of Global 2000 organizations will have at least one gamified application.

Yaniv rock climbing

How did you get into gamification?

YC: I got hooked through a Lotus Joint Program last year with our partner at IBM Project Northstar. Northstar is the result of IBM's vision to deliver exceptional web experiences to clients and we were looking at how gamification can improve the user experience of our clients' externally-facing applications.

There's also a great wiki about gamification, which provides a huge knowledge base about the topic. Its founder, Gabe Zichermann, is regarded as the guru of gamification, and I've been sharing ideas with his company, Gamification.co, to help evangelize gamification within IBM.

We've been sharing gamification examples across companies and watching them transform their business, from healthcare, to education, and many others.

I’m a big believer in letting people have fun while learning, instead of learning things the hard way.

2.20.2012

The Higgs boson is the only particle predicted by the Standard Model of particle physics that has not yet been observed experimentally. Its observation would be a major step forward in our understanding of how particles acquire mass.

Recently, Prof. Dr. Ivica Puljak, who works at CERN and the University of Split, visited IBM Research – Zurich for a lecture. Before leaving, he took a few questions (video below).

Q. When you introduced yourself this morning, you said you’ve always wanted to visit the IBM Research – Zurich Lab. Why is that?

Ivica Puljak: Well, first of all I’m fascinated by the science pursued here in general and by nanotechnology in particular. I’m a lecturer of general physics and modern physics, and I often lecture on scanning tunneling microscopy (STM). It’s such a simple concept, very easy to understand, and the fact that it allows atoms to be moved around is just amazing. This being the birthplace of STM, of course I was keen to come and visit some colleagues and friends here.

I.P. I’m very impressed by the scientific facilities here. They rival those of CERN.

Tell us about a typical day at CERN.

I.P. Well, that depends on the time of year and on the year itself. We’ve been working on LHC [Large Hadron Collider] programs for over 20 years. During that time, there were more relaxed periods of designing the detectors. This was followed by more hectic periods of constructing the detectors, which was followed by even more hectic periods of testing what we had designed and built together with contracted companies. We couldn’t be sure that everything would even work. And it all works so well; only a few minor problems had to be solved.

Now we’re in a very exciting period again where—after 20 years of preparation—we finally have data to analyze. Last year, for example, there were phases where we barely had time to sleep, we were so busy preparing presentations of our data at major international conferences. So these are times of intense collaboration and discussions.

Can you describe the international aspects of your work?

I.P. It’s a truly global project. Someone is always up and working somewhere in the world at any time of day. For example, when we finish a day’s work, our American colleagues are just getting started. We share our results and other colleagues throughout the world continue the work.

There was recently a broadcast on the BBC about the Higgs boson particle, and several CERN physicists gave interviews in which they said they didn’t even believe it exists. Others argued that it’s going to be within a very small realm. In your own opinion, who’s right?

I.P. (Laughs) Well, I’ll tell you what I would like to see: I’d like us to find it because I personally have been working on this for over 17 years. But science is not about beliefs, it’s about finding out what is going on in nature. We can believe what we want to, but nature will show us what there is. We can’t force nature to give us what we want!

Those who do believe it exists have made predictions about what its mass will be. Some say 135, some say 127 GeV/c². What do you predict?

I.P. The realm has been so narrowed down that it’s awfully difficult to make an exact prediction. Anything is possible within that narrow realm.

You mentioned data storage, and IBM is very active in that field. You said that sometimes data can’t be stored quickly enough, even on the most advanced IBM technology. What do IBM scientists and engineers need to tackle in order to help CERN meet some of these challenges?

I.P. (Laughs) Well, I was half joking and I can only thank all the scientists who are doing this kind of work. But it is a real concern. In fact, when work started on our project 20 years ago, scientists actually tried to predict what technology would be like today. They basically just extrapolated the technology of that time. However, today’s technology far surpasses what they predicted, so we can save much more data than originally thought, and this is very good news.

As we upgrade our systems, we will definitely need improved technology in the next 20 years. We sincerely hope that progress will continue to be made, and this is what we would like to see from your IBM Research Laboratories.

2.06.2012

Editor's note: This article is by James Kaufman, IBM Research manager, Healthcare Information Infrastructure and Computer Science.

The box-office hit "Contagion," a movie about a
lethal airborne virus that kills within days, shed a realistic light on how
public health officials can use data to predict the spread of infectious diseases
on a global scale and answer the age-old question, 'what if' before it gets
asked.

Recognizing the need to see what the potential spread of a
pandemic might be for a given country, geographic region, or world over the
course of days, weeks and months, IBM Research started putting some processing
muscle into the fight against world health problems by collaborating on a new
age of science-based, data-centric disease modeling.

With the creation of the Global Pandemic
Initiative in 2006, IBM built a community of users around its
epidemiological modeling framework, called the Spatiotemporal Epidemiological
Modeler (STEM). Issued under the
Eclipse license, it is an open source toolkit and application that allows
epidemiologists, public health officials, and students to collaborate and share
data and models for infectious diseases on a common platform. So while a
graduate student in Africa works on the mathematics, a
public health official elsewhere might work on a particular way of modeling.

STEM uses large data sets on such things as airport and
highway traffic and county infrastructures with a wide range of variables, such
as the infectiousness of a disease and the availability and distribution of a
vaccine. This allows people to look at and understand different scenarios, get
insight into what the impact of decisions might be and, in some cases,
potentially prevent the spread of such diseases.
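STEM itself is a Java-based Eclipse application, so the sketch below is not STEM code; it is a minimal, self-contained illustration of the kind of compartmental (SIR: susceptible-infectious-recovered) model such frameworks run, with invented rate constants:

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One Euler step of the classic SIR compartmental model.
    beta: transmission rate; gamma: recovery rate; s, i, r: population fractions."""
    n = s + i + r
    new_infections = beta * s * i / n * dt
    new_recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(s0, i0, beta=0.3, gamma=0.1, dt=1.0, days=160):
    """Run the epidemic forward and track the peak infectious fraction."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
        peak = max(peak, i)
    return s, i, r, peak

# Illustrative run: 0.1% of the population initially infected.
s, i, r, peak = simulate(s0=0.999, i0=0.001)
print(f"final susceptible fraction: {s:.3f}, epidemic peak: {peak:.3f}")
```

Real STEM scenarios layer spatial structure (counties, road and air traffic) and interventions such as vaccination on top of compartmental dynamics like these.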

Today, the Eclipse Foundation will release a new version of
STEM. Jointly developed by IBM Research, the German Federal Institute of Risk
Assessment, and contributors from several universities, STEM V1.3 will provide
many new features, capabilities, and data including:

- Support for modeling food-borne disease, including a new “Population Transformer” to simulate food production and food-mediated disease transmission.

- Ten years of historical climate and earth science data for use in creating models of insect vector populations (and how they are affected by climate).

- A vector capacity model for the Anopheles mosquito.

- A malaria disease model.

- Multi-serotype models for dengue fever disease.

- A model of malaria outbreaks as related to temperature and rainfall.

As an open source application available through the Eclipse Foundation,
STEM is free and completely open to any scientist or researcher who chooses to
build on and contribute to its growing library of models, computer code, and
denominator data. Check out the new release and demos at http://www.eclipse.org/stem/.

2.02.2012

Open data refers to the free access and reuse of government data — excluding private information such as personal medical data. The concept of establishing a portal for open data is fairly new.

Visualization of city activity, such as traffic, in Helsinki

IBM’s Smarter Cities Challenge sent a team of six experts – including two researchers from the Haifa lab in Israel – to Helsinki. Their mission was to provide recommendations on how visualization could be used to make the city’s open data more accessible to citizens. The idea was to boost citizen engagement and improve democracy.

After three weeks of work, the team produced a final report with specific recommendations on how to turn data into visual information that would encourage action and drive citizen engagement. Just a few examples that the team suggested include using symbols like a tree to explain how the budget is divided up, projecting the city’s consumption of electricity onto a building, or using touch screens to get feedback in places where people congregate.

"We started off with boring looking data," explained Avi Yaeli, IBM Researcher from Haifa. "But when you can take the data and transform it into a visual story that helps people discover and understand new ideas, it's an exciting way to get citizens involved."