While the sense of touch is commonly thought of as less important than other senses, in certain situations, tactile displays are a lifeline. Think about what Braille means to the visually impaired.

Now research scientists at MIT’s Department of Mechanical Engineering are working on sensors, many of them wearable, for a variety of very different communication applications. These could serve people working in loud, dark, or hazardous environments, such as firefighters; help people with inner-ear balance impairments prevent falls; give drivers sensory feedback about lane changes or proximity to other vehicles and objects; and support more mundane endeavors, such as helping tourists navigate through unfamiliar territory, or even helping the gaming industry intensify the realism of activities such as driving over rough terrain or being shot at.

Lynette Jones, a senior research scientist in MIT’s Department of Mechanical Engineering who designs wearable tactile displays, says the next generation of mobile devices will do much for the general acceptance of tactile feedback. She predicts that GPS-type tactile devices will be available within five years. “The technology is there now. It’s just a matter of who will fund this,” and of how it will be incorporated into systems now in use, she says.

Researchers are exploring options for presenting information through varying vibrations, allowing a quick response from people who aren’t holding phones in their hands. “Maybe we can just have a wristband that presents very simple tactile pulses that indicate they are coming to a point where they need to turn right or left, slow down or speed up. I see a coupling of these sorts of displays with other computer-based systems people have now,” Jones says.
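As a purely illustrative sketch of the idea Jones describes (the pulse patterns, names, and single-motor wristband here are hypothetical, not the team's actual design), a small tactile vocabulary for navigation cues might look like this:

```python
# Hypothetical tactile vocabulary: each navigation cue maps to a list of
# (vibration_ms, pause_ms) pulses played on a single wristband motor.
TACTILE_VOCABULARY = {
    "turn_left":  [(100, 100)] * 2,   # two short pulses
    "turn_right": [(100, 100)] * 3,   # three short pulses
    "slow_down":  [(400, 200)],       # one long pulse
    "speed_up":   [(100, 50)] * 5,    # rapid burst of short pulses
}

def pattern_for(cue):
    """Return the pulse pattern for a navigation cue, or None if unknown."""
    return TACTILE_VOCABULARY.get(cue)
```

The point of such a mapping is that each cue stays distinct along an easily felt dimension (pulse count or pulse length), so the wearer can respond without looking at a screen.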

Sensors testing sensitivity of the hand. Image: Lynette Jones

Communication through Vibration

The MIT project is multi-faceted, but essentially involves developing tactile displays that are effective for communication. For those with balance difficulties, the tactile displays would detect the tilt of the body and give the wearer feedback via a vibration indicating that their center of gravity is off and should be corrected. These people do not receive the normal balance signals that others do, Jones says.

For hazardous environments where communication has broken down, the team is developing applications that would provide navigation information or send a vibration alert about a potential danger. Because workers in those situations typically have their hands occupied with other activities, the scientists are focusing on displays that mount on the body.

Jones and her team are deep into building displays and testing actuators. “We put sensors on things and make measurements of the properties of the motors when they are on the body, and we explore new actuator technologies," she says. "Since we are talking about people wearing these, we don’t want things that are power hungry because they will [be] battery-powered.”

Sensing Vibration Changes

Another area of research involves varying the parameters of the stimulation, such as amplitude, frequency, waveform, or location on the body, to understand how easily people encode those changes. Understanding this will inform a set of guiding principles for creating tactile vocabularies, she says.
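The parameters described above can be made concrete with a small sketch. This is an assumption-laden illustration (the sample rate and the sine drive signal are generic choices, not details from the research): a function that generates an actuator drive signal from the very parameters a study might vary.

```python
import math

def vibration_samples(freq_hz, amplitude, duration_s, sample_rate=1000):
    """Sine-wave drive signal for a vibrotactile actuator.

    freq_hz, amplitude, and duration_s are the stimulation parameters
    an experimenter could manipulate to create distinguishable stimuli.
    """
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]
```

Two stimuli that differ only in one parameter, say 100 Hz versus 250 Hz at the same amplitude, give experimenters a controlled way to test whether wearers can reliably tell the patterns apart.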

“We also do basic research on the skin in terms of mechanical properties to understand the difference of how the displays work on the palm of the hand versus the back of the hand versus the torso versus the thigh,” she explains. In other words, the goal is to understand how much the mechanics of the tissue beneath the sensors changes the vibration of the skin, because that determines what is picked up by the sensory receptors and sent to the brain, Jones adds.

One challenge is to determine how much complexity people will tolerate and how much information should be presented through the skin so that it is useful in the context in which people are using it. “We know that people aren’t going to want to have their skin vibrating endlessly,” she says. Tactile feedback systems for everyday use need to be intuitive and straightforward, Jones says, because people aren’t going to spend a lot of time learning how to interpret the input they are getting. The sense of touch is not one that people are “actively processing” in the same way that they are alert to what they are hearing or seeing, she adds.

You might say Jones has been ahead of her time, since the tactile sense has been relatively unstudied. “For years, I was interested in work on human hands and the design of a robotic prosthetic hand," she notes. "When you think about touch, you think about the hand as the most acute sensory area. My interest came from looking at the potential for touch outside of just understanding how human hands were designed and what we could do knowing what we do about touch and about information processing.”