IBM: In the next 5 years computers will learn, mimic the human senses

IBM today issued its seventh annual look at what Big Blue researchers think will be the five biggest technologies for the next five years. In past prediction packages known as "IBM 5 in 5" the company has had some success in predicting the future of password protection, telemedicine and nanotechnology.

The IBM 5 in 5 research is based on collective trends as well as emerging technologies from IBM's R&D labs around the world. This year's research points to the development of what IBM calls a new generation of machines that will learn, adapt, sense and begin to experience the world as humans do through hearing, sight, smell, touch and taste.

"Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges," writes Bernie Meyerson, IBM fellow and VP of innovation.

Your computer will reach out and touch somebody: According to IBM, within five years, industries such as retail will be transformed by the ability to "touch" a product through your mobile device. IBM says its scientists are developing applications for the retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch -- for example, the texture and weave of a fabric as a shopper brushes a finger over the image of the item on a device screen. Using the vibration capabilities of the phone, every object will have a unique set of vibration patterns that represents the touch experience: short fast patterns, or longer and stronger strings of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping simulate the physical sensation of actually touching the material, IBM says.
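The idea of a per-material vibration signature can be sketched in a few lines. This is a toy illustration, not IBM's implementation: the material names, pulse durations and intensities below are invented placeholders.

```python
# Toy sketch: each material gets a distinct vibration signature that a
# phone could play back as a shopper brushes a finger over its image.
# All values here are illustrative, not IBM's actual patterns.

# Each pattern is a list of (duration_ms, intensity 0.0-1.0) pulses.
VIBRATION_PATTERNS = {
    "silk":   [(20, 0.2), (20, 0.2), (20, 0.2)],  # short, fast, light
    "linen":  [(80, 0.6), (120, 0.7)],            # longer, stronger
    "cotton": [(50, 0.4), (50, 0.4), (50, 0.4)],
}

def touch_feedback(material):
    """Return the vibration pulses to play for a given fabric."""
    return VIBRATION_PATTERNS[material]

print(touch_feedback("silk"))
```

Because each fabric maps to a different pulse sequence, the playback alone is enough to tell silk from linen by feel, which is the core of the "touch through the screen" idea.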

Can you see me now? Within the next five years, IBM researchers think computers will not only be able to look at images, but help us understand the 500 billion photos we're taking every year (that's about 78 photos for each person on the planet). In the future, "brain-like" capabilities will let computers analyze features such as color, texture patterns or edge information and extract insights from visual media.

One of the challenges of getting computers to "see" is that traditional programming can't replicate something as complex as sight. But by taking a cognitive approach, and showing a computer thousands of examples of a particular scene, the computer can start to detect patterns that matter, whether it's in a scanned photograph uploaded to the web, or some video footage taken with a camera phone, IBM says.
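The "show the computer thousands of examples" approach can be illustrated with a minimal nearest-centroid classifier. The scene labels and feature values below are invented stand-ins for the color, texture and edge features the article mentions; a real system would learn far richer representations.

```python
# Minimal sketch of example-driven classification: average the feature
# vectors of labeled examples into per-label centroids, then assign a
# new image to the label whose centroid is closest.

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labeled_examples):
    """{label: [feature_vector, ...]} -> {label: centroid}"""
    return {label: centroid(vs) for label, vs in labeled_examples.items()}

def classify(model, features):
    """Return the label whose centroid is nearest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: dist(model[label], features))

# Features: [avg_blue, avg_green, edge_density] -- purely illustrative.
examples = {
    "beach":  [[0.8, 0.3, 0.1], [0.7, 0.4, 0.2]],
    "forest": [[0.2, 0.8, 0.6], [0.3, 0.7, 0.5]],
}
model = train(examples)
print(classify(model, [0.75, 0.35, 0.15]))  # prints "beach"
```

The point of the sketch is the workflow, not the model: no rule for "beach" is ever written down; the label emerges from the examples the system was shown.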

Within five years, these capabilities will be put to work in healthcare by making sense out of massive volumes of medical information such as MRIs, CT scans, X-Rays and ultrasound images to capture information tailored to particular anatomy or pathologies. What is critical in these images can be subtle or invisible to the human eye and requires careful measurement. By being trained to discriminate what to look for in images -- such as differentiating healthy from diseased tissue -- and correlating that with patient records and scientific literature, systems that can "see" will help doctors detect medical problems with far greater speed and accuracy, IBM says.

Stop, look, listen: IBM thinks that by 2017 or so a distributed system of what it calls "clever sensors" will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. The system will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to our surroundings and measure movements, or the stress in a material, to warn of danger, IBM says.

Sensors will detect raw sounds, and a system receiving this data will take into account other "modalities," such as visual or tactile information, then classify and interpret the sounds based on what it has learned, much as the human brain does. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognize patterns, IBM says. By learning about emotion and being able to sense mood, systems will pinpoint aspects of a conversation and analyze pitch, tone and hesitancy to help us have more productive dialogues that could improve customer call center interactions, or allow us to seamlessly interact with different cultures.

IBM goes so far as to say that "baby talk" will be understood as a language -- telling parents or doctors what infants are trying to communicate.

Good taste: IBM said its researchers are developing a computing system that experiences flavor, to be used with chefs to create the tastiest and most novel recipes. Such a system breaks ingredients down to the molecular level and blends the chemistry of food compounds with the psychology behind what flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham, IBM says.

Specifically, the computer will be able to use algorithms to determine the precise chemical structure of food and why people like certain tastes. These algorithms will examine how chemicals interact with each other, the molecular complexity of flavor compounds and their bonding structure, and use that information, together with models of perception to predict the taste appeal of flavors.
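One simple proxy for this kind of chemistry-driven pairing is to score two ingredients by how many flavor compounds they share. The sketch below is illustrative only: the compound lists are invented placeholders, and a real system would weigh many more factors than raw overlap.

```python
# Toy sketch of compound-based flavor pairing: ingredients that share
# more flavor compounds score higher as candidate pairings.
# Compound sets below are invented for illustration.

COMPOUNDS = {
    "roasted chestnut": {"furfural", "pyrazine", "vanillin"},
    "cooked beetroot":  {"geosmin", "pyrazine", "furfural"},
    "fresh caviar":     {"trimethylamine", "hexanal"},
}

def pairing_score(a, b):
    """Number of flavor compounds shared by two ingredients."""
    return len(COMPOUNDS[a] & COMPOUNDS[b])

print(pairing_score("roasted chestnut", "cooked beetroot"))  # prints 2
```

Ranking candidate ingredients by such a score, then filtering through a model of human taste preference, is one plausible shape for the "new flavor combinations" the article describes.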

You smell funny: IBM says that during the next five years, tiny sensors embedded in your computer or cell phone will detect if you're coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone's breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not. Due to advances in sensor and communication technologies in combination with deep learning systems, sensors can measure data in places never thought possible. For example, computer systems can be used in agriculture to "smell" or analyze the soil condition of crops.
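At its simplest, the breath-analysis idea amounts to flagging molecules whose measured concentration falls outside a normal range. The molecule names and ranges below are hypothetical placeholders, not clinical reference values.

```python
# Toy sketch of breath screening: flag any measured biomarker that
# falls outside its "normal" range. Names and ranges are invented
# placeholders, not medical reference values.

NORMAL_RANGES = {
    "acetone_ppm": (0.3, 1.8),
    "ammonia_ppm": (0.1, 1.0),
}

def screen(sample):
    """Return the biomarkers in a breath sample that are out of range."""
    return [m for m, value in sample.items()
            if not (NORMAL_RANGES[m][0] <= value <= NORMAL_RANGES[m][1])]

print(screen({"acetone_ppm": 4.2, "ammonia_ppm": 0.5}))  # ['acetone_ppm']
```

A deployed system would of course learn these ranges per person and per condition rather than hard-coding them, which is where the deep learning systems the article mentions come in.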

The point isn't to replicate human brains, Meyerson says, nor is it about replacing human thinking with machine thinking. "Rather, in the era of cognitive systems, humans and machines will collaborate to produce better results -- each bringing their own superior skills to the partnership. The machines will be more rational and analytic. We'll provide the judgment, empathy, moral compass and creativity."

One of IBM's 5 in 5 predictions last year had a cognitive system component to it as well. IBM said the idea was to tie your brain to your devices.

"If you just need to think about calling someone, it happens. Or you can control the cursor on a computer screen just by thinking about where you want to move it. Scientists in the field of bioinformatics have designed headsets with advanced sensors to read electrical brain activity that can recognize facial expressions, excitement and concentration levels, and people's thoughts without them physically taking any actions. Within five years, we will begin to see early applications of this technology in the gaming and entertainment industry. Furthermore, doctors could use the technology to test brain patterns, possibly even assist in rehabilitation from strokes and to help in understanding brain disorders," IBM stated.

Other predictions from last year included:

" No passwords: You will never need a password again because of what IBM calls "multi-factor biometrics." Smart systems will be able to use retina scans and other personal information in real-time to make sure whenever someone is attempting to access your information, it matches your unique biometric profile. To be trusted, such systems should enable you to opt in or out of whatever information you choose to provide. Your biological makeup is the key to your individual identity, and soon, it will become the key to safeguarding it. You will no longer need to create, track or remember multiple passwords for various log-ins. Imagine you will be able to walk up to an ATM machine to securely withdraw money by simply speaking your name or looking into a tiny sensor that can recognize the unique patterns in the retina of your eye, IBM said.

" Digital divide? In five years, the gap between information haves and have-nots will narrow considerably due to advances in mobile technology. There are 7 billion people inhabiting the world today. In five years there will be 5.6 billion mobile devices sold -- which means 80% of the current global population would each have a mobile device.

" No more junk? In five years, unsolicited advertisements may feel so personalized and relevant it may seem spam is dead and spam filters will be so precise you'll never be bothered by unwanted sales pitches again, IBM said. IBM pointed out that it is developing technology that uses real-time analytics to make sense and integrate data from across all the facets of your life such as your social networks and online preferences to recommend information that is only useful to you.

" More power: Advances in renewable energy technology will allow individuals to collect all sorts of kinetic energy -- walking, bicycling -- which now goes to waste, and use it to help power our homes, offices and cities. Imagine attaching small devices to the spokes on your bicycle wheels that recharge batteries as you pedal along. You will have the satisfaction of not only getting to where you want to go, but at the same time powering some of the lights in your home, IBM said.

Copyright 2018 IDG Communications. ABN 14 001 592 650. All rights reserved. Reproduction in whole or in part in any form or medium without express written permission of IDG Communications is prohibited.