Language & Touch

A Canadian study has found that we understand what someone is saying not just with our eyes and ears, but also with our skin.

Our tactile sense is incorporated in the whole realm of human perception, says Dr. Bryan Gick, one of the study’s researchers.

People have long realised that the eyes and the ears work together to determine how we perceive what we hear, the University of British Columbia linguistic scientist said. But in a test of 66 participants, Gick and fellow researchers found that the puffs of air released when the letters “p” or “b” are pronounced affect how listeners hear what’s being said.

Participants were blindfolded for the test, which found that their skin registered the difference in air puffs, making it easier to distinguish aspirated sounds such as “p” and “t” from their unaspirated counterparts “b” and “d.”

The test is the first to measure whether ordinary listeners, not just those trained for the task, factor these “tiny bursts of aspiration” into their perception of speech, he said.

The findings take research one step further in understanding how infants and people who are blind, in particular, comprehend language. Gick also said the study could inform the future development of hearing aids and telecommunications applications.