The year-end prediction lists from technology companies and research firms are — let’s be honest — in good part thinly disguised marketing pitches: These are the big trends for next year, and — surprise — our products are tailor-made to help you turn those trends into moneymakers.
But I.B.M. has a different spin on this year-end ritual. It taps its top researchers worldwide to come up with a list of five technologies that are likely to advance remarkably over the next five years. The company calls the list Five in Five, with the latest released on Monday. And this year’s nominees are innovations in computing sensors for touch, sight, hearing, taste and smell.
Touch technologies may mean that tomorrow’s smartphones and tablets will be gateways to a tactile world. Haptic feedback techniques, along with infrared and pressure-sensitive technologies, I.B.M. researchers predict, will enable a user to brush a finger over the screen and feel the simulated touch of a fabric, its texture and weave. The feel of objects could be translated into unique vibration patterns, a kind of tactile equivalent of fingerprints or voice patterns. Each vibration pattern would simulate a distinct feel, distinguishing, say, wool from cotton or silk.
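The idea of a "tactile fingerprint" can be sketched in a few lines of code. Everything below is illustrative: the fabric names, frequencies and amplitudes are invented for demonstration, not drawn from any published I.B.M. system.

```python
import math

# Hypothetical "tactile fingerprints": each fabric is assigned its own
# vibration frequency (Hz) and amplitude (0 to 1) for a phone's haptic motor.
# These values are made up to illustrate the mapping, not measured data.
TACTILE_PATTERNS = {
    "wool":   {"frequency_hz": 40,  "amplitude": 0.9},  # coarse, heavy texture
    "cotton": {"frequency_hz": 120, "amplitude": 0.5},  # softer, finer weave
    "silk":   {"frequency_hz": 300, "amplitude": 0.2},  # smooth, light texture
}

def vibration_sample(fabric: str, t: float) -> float:
    """Return the haptic motor drive level for a fabric at time t (seconds)."""
    p = TACTILE_PATTERNS[fabric]
    return p["amplitude"] * math.sin(2 * math.pi * p["frequency_hz"] * t)
```

The point of the sketch is only that a screen could play back a different, repeatable vibration signature for each material, so the finger learns to associate the pattern with the texture.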
The coming sensor innovations, said Bernard Meyerson, an I.B.M. scientist and vice president of innovation, are vital ingredients in what is called cognitive computing. The idea is that in the future computers will be increasingly able to sense, adapt and learn, in their way.
That vision, of course, has been around for a long time — a pursuit of artificial intelligence researchers for decades. But there seem to be two reasons that cognitive computing is something I.B.M., and others, are taking seriously these days. The first is that the vision is becoming increasingly possible to achieve, though formidable obstacles remain. I wrote an article in the Science section last year on I.B.M.’s cognitive computing project.
The other reason is a looming necessity. When I asked Dr. Meyerson why the five-year prediction exercise was a worthwhile use of researchers’ time, he replied that it helped focus thinking. Actually, his initial reply was a techie epigram. “In a nutshell,” he said, “seven nanometers.”
Dr. Meyerson, who has a Ph.D. in solid-state physics, was talking about the physical limits on the width of semiconductor circuits, the point at which they can no longer be shrunk. (The width of a human hair is roughly 80,000 nanometers.) Today, the most advanced chips have circuits 22 nanometers in width. Next comes 14 nanometers, then 10 and then 7, Dr. Meyerson said.
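The scale involved is easier to feel with a little arithmetic, using only the figures in the article (80,000 nanometers for a hair, and the 22, 14, 10 and 7 nanometer nodes):

```python
# Back-of-the-envelope arithmetic from the article's figures: how many
# circuit widths fit across one human hair at each remaining process node.
HAIR_WIDTH_NM = 80_000          # rough width of a human hair
NODES_NM = [22, 14, 10, 7]      # today's node plus the three remaining cycles

widths_per_hair = {node: HAIR_WIDTH_NM // node for node in NODES_NM}

for node, widths in widths_per_hair.items():
    print(f"{node} nm: ~{widths:,} circuit widths across one hair")
```

At 22 nanometers, more than 3,600 circuit widths already fit across a single hair; at 7 nanometers, the last cycle Dr. Meyerson names, it is over 11,000.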
“We have three more cycles, and then the biggest knobs for improving performance in silicon are gone,” he said. “You have to change the architecture, use a different approach.”
“With a cognitive computer, you train it rather than program it,” Dr. Meyerson said.
The cognitive path, if successful, would raise a machine’s level of recognition of the world. Today, computers mimic human intelligence with brute force, collecting a vast amount of data and then sifting for statistical patterns that identify specific words, images, biological or chemical compounds.
But a cognitive computer, Dr. Meyerson said, would “not have to go for all the fine detail, but go up and see the interesting thing. This is about moving computing way, way up from where it is today.”
He provided a sensor-based example. A computer is presented with two piles of white powder; one is salt and the other is sugar. “It could taste the difference, without having to do a detailed chemical analysis,” Dr. Meyerson explained.
Cognitive computers, by learning some tricks from the way human brains compute, could in theory deliver big energy savings. I.B.M.’s “Jeopardy”-winning computer is a clever machine, defeating its human rivals last year. But the computer, called Watson, runs on 85,000 watts of electricity. The human brain hums along on 20 watts.
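The gap between those two power figures, taken straight from the article, is worth computing:

```python
# Simple arithmetic on the article's figures: Watson's power draw
# versus the human brain's.
WATSON_WATTS = 85_000   # Watson's reported electricity consumption
BRAIN_WATTS = 20        # rough power budget of a human brain

ratio = WATSON_WATTS / BRAIN_WATTS
print(f"Watson draws roughly {ratio:,.0f} times the power of a human brain")
```

By these numbers, Watson needs on the order of 4,250 human brains' worth of power to play one game show, which is the efficiency gap cognitive architectures aim to close.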
“We need to make the machines much, much more efficient,” Dr. Meyerson said.