Intel: The Future is in 'User-Aware' Computing

“User aware” computing is coming. That was part of the message at the Intel Developer Forum in San Francisco. The twice-yearly event is a springboard for the chip giant to launch new ideas and strategies – both near- and long-term.

With the advent in coming years of multi-core processors containing not just a few but many – even hundreds – of cores inside a single CPU, computers will become more attuned to how they are used, according to an Intel official speaking at the event.

"A user-aware platform will be any device that can take care of itself, knows who we are, where we are, and tries to anticipate what we want done," Justin Rattner, Intel senior fellow and director of Intel's Corporate Technology Group, said in describing what future computing scenarios might resemble.

"They will need digital senses to be aware of their surroundings and what they are doing. They will also need new levels of intelligence to understand our needs and collaborate with other electronics to take action on our behalf while doing no harm in the process," Rattner said in a statement prepared in advance of his keynote.

The future of electronics will be driven by the need for simpler, more intuitive ways of dealing with technologies that help people accomplish their aims. This will demand a new generation of user-aware platform technologies.

User-aware platforms will change the way hardware, software, services and interfaces are developed. Each chip will be capable of dynamically assigning clusters of processing cores, replete with the necessary memory and bandwidth, to tasks such as seeing, listening, maintaining network security and understanding commands. These platforms will use virtualization software – technology Intel has already started to integrate into its chips – to put protective walls around each task's dedicated slice of computing resources in order to let each run without interfering with other applications or processes.
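To make the idea concrete, here is a toy sketch of that partitioning scheme: a pool of core IDs carved into isolated clusters, one per task. This is purely illustrative – the class, the task names and the core counts are invented here, not drawn from Intel's design.

```python
# Toy sketch of per-task core partitioning (illustrative only; the
# names and numbers are invented, not Intel's actual mechanism).

class CorePartitioner:
    def __init__(self, total_cores):
        self.free = set(range(total_cores))   # unassigned core IDs
        self.assigned = {}                    # task name -> set of core IDs

    def assign(self, task, n_cores):
        """Reserve n_cores exclusively for `task`; fail if unavailable."""
        if n_cores > len(self.free):
            raise RuntimeError(f"not enough free cores for {task}")
        cluster = set(sorted(self.free)[:n_cores])
        self.free -= cluster
        self.assigned[task] = cluster
        return cluster

    def release(self, task):
        """Return a finished task's cores to the free pool."""
        self.free |= self.assigned.pop(task)

pool = CorePartitioner(64)
vision = pool.assign("vision", 16)   # e.g. a "seeing" task
audio = pool.assign("audio", 4)      # e.g. a "hearing" task
print(len(pool.free))                # 44 cores remain unassigned
```

Because each task holds a disjoint set of cores, no task can interfere with another's slice – the same isolation property the virtualization layer is meant to enforce in hardware.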

Developers will use this new level of intelligence to manage multiple tasks and sources of input (video to “see” with, audio to “hear and speak” with, sensors to “feel” with, storage to “remember” with, and networks and radio connections to connect with the Internet and other devices), according to the company.

The point: to use new and coming technologies to let developers apply machine learning to simplify our lives.

Rattner demonstrated a research project called Diamond, an intuitive image search application under development by researchers at Intel and Carnegie Mellon University in Pittsburgh, Pa.

“Running simultaneously across several computers, Diamond takes advantage of recent advances in computer vision and machine learning to search through data the way people do, first by studying what the desired image 'looks like' – its shape, color, contents – and then finding the closest matches,” Rattner said.
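A minimal sketch of that kind of content-based search (this is not Diamond's actual code): reduce each image to a coarse color histogram, then rank candidates by how closely their histograms match the query's.

```python
# Content-based image search sketch: color histograms + nearest match.
# Synthetic "images" are just lists of (r, g, b) pixel tuples.

def histogram(pixels, bins=4):
    """Coarse per-channel color histogram, normalized to sum to 1."""
    counts = [0] * (bins * 3)
    for r, g, b in pixels:
        for ch, value in enumerate((r, g, b)):
            counts[ch * bins + min(value * bins // 256, bins - 1)] += 1
    total = len(pixels) * 3
    return [c / total for c in counts]

def distance(h1, h2):
    """L1 distance between two histograms."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

def closest_matches(query, candidates, k=2):
    """Names of the k candidates most similar to the query image."""
    qh = histogram(query)
    ranked = sorted(candidates,
                    key=lambda item: distance(qh, histogram(item[1])))
    return [name for name, _ in ranked[:k]]

sunset = [(250, 120, 40)] * 10   # warm orange pixels
ocean = [(20, 80, 220)] * 10     # deep blue pixels
desert = [(240, 130, 50)] * 10   # warm tan pixels
print(closest_matches(sunset, [("ocean", ocean), ("desert", desert)], k=1))
# ['desert'] -- the warm-toned candidate is the nearest match
```

The real system would of course use far richer features than raw color, but the loop – featurize the example, then rank by distance – is the "search by what it looks like" idea in miniature.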

Another aim is to enable computing devices to care for themselves, reducing administration and maintenance. A near-term example might be to enable temperature sensors inside servers to monitor for overheating, triggering the intelligent reallocation of workloads among hundreds of systems inside a data center to avoid data loss or system failure.
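That near-term example can be sketched in a few lines: poll per-server temperatures and shift load off any machine running hot. The threshold, server names, and `Server` shape below are invented for illustration, not taken from any real data-center product.

```python
# Hedged sketch of thermal-aware workload reallocation (all values
# and names invented for illustration).

OVERHEAT_C = 75  # assumed alarm threshold, degrees Celsius

class Server:
    def __init__(self, name, temp_c, load):
        self.name, self.temp_c, self.load = name, temp_c, load

def rebalance(servers):
    """Move load from overheating servers onto the coolest machine."""
    for hot in [s for s in servers if s.temp_c > OVERHEAT_C and s.load > 0]:
        coolest = min(servers, key=lambda s: s.temp_c)
        if coolest is not hot:
            coolest.load += hot.load
            hot.load = 0
    return servers

racks = [Server("web-1", 82, 40), Server("web-2", 55, 20),
         Server("db-1", 60, 30)]
rebalance(racks)
print({s.name: s.load for s in racks})  # load has moved off web-1
```

A production system would also model how added load raises temperature on the target machine; the point here is only the feedback loop – sense, decide, reallocate – that lets the infrastructure care for itself.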

Building systems that intuitively respond to our needs is a non-trivial challenge that researchers have pursued for decades – giving computers enough “consciousness” to “understand” context such as "who," "what" and "where." But some tasks are easier than others. For instance, fairly simple location-aware computing technology could help systems respond more intuitively to requests, such as downloading different types of music to the family car, kitchen or cell phone based on a person's tastes. Or it could alert a factory worker to the specific maintenance needs of different equipment based on proximity, safety conditions and the worker's level of training. And it should do all this in an appropriate, privacy-observant manner.
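The factory scenario reduces to a simple context check: alert the worker only when location, training and safety all line up. The data shapes and field names in this sketch are made up for illustration.

```python
# Toy context-aware alerting rule (field names invented for illustration).

def should_alert(worker, machine):
    """True only if the worker is nearby, certified, and it's safe."""
    return (worker["zone"] == machine["zone"]          # proximity
            and machine["required_cert"] in worker["certs"]  # training
            and machine["needs_maintenance"]
            and not machine["hazard_active"])          # safety condition

worker = {"zone": "line-3", "certs": {"press", "conveyor"}}
press = {"zone": "line-3", "required_cert": "press",
         "needs_maintenance": True, "hazard_active": False}
print(should_alert(worker, press))  # True
```

Note that the rule consumes only the context it needs – zone, certification, machine state – which is one way to keep such systems on the privacy-observant side the article calls for.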

When will this “user-aware” future arrive? Stay tuned.

About the Author

Stuart J. Johnston has covered technology, especially Microsoft, since February 1988 for InfoWorld, Computerworld, Information Week, and PC World, as well as for Enterprise Developer, XML & Web Services, and .NET magazines.