Virtual-Reality Scent System Fools Flavor Sense

What you see (and smell) is what you taste

By John Boyd

Posted 18 Apr 2011 | 17:49 GMT


18 April 2011—Sights, sounds, and, more recently, touch are commonly employed to create and enhance virtual-reality (VR) experiences. But the sense of smell is rarely a factor. A small group of researchers at the University of Tokyo is working to change that by integrating smell and sight in a way that alters a person’s perception of taste. Their VR system tricked people eating a plain cookie into thinking it was whatever flavor they had selected. The group is exploiting the fact that taste is affected by what we see, hear, and smell, as well as by the texture of the food, among other things.

"We are using the influences of these sensory modalities to create a pseudo-gustatory display," says Takuji Narumi, an assistant professor at the University of Tokyo. "The aim is to have subjects experience different tastes through augmented reality by only changing the visual and olfactory stimuli they receive."

To do this, the Tokyo team created a system dubbed the Meta Cookie, in which a plain cookie was stamped with an edible marker that machine-vision software could track easily. The experiment also used a computer-controlled olfactory head-mounted display, or HMD, which overlaid images onto the cookie and incorporated a marker-detection unit. The olfactory unit employed seven scent-filled plastic bottles fitted with air pumps and tubes that delivered individual aromas to the subject’s nose. An additional air pump diluted the amount of scent the subject received.
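The article does not describe the control software, but the delivery scheme it outlines—seven scent channels plus a dilution pump—can be sketched roughly as follows. The class and method names here are hypothetical, not the researchers’ own:

```python
class OlfactoryUnit:
    """Rough sketch of the scent-delivery unit described above:
    seven scent-filled bottles, each with its own air pump, plus an
    extra pump whose clean air dilutes the delivered aroma."""

    NUM_CHANNELS = 7  # one channel per scent bottle

    def __init__(self):
        self.active_channel = None  # index of the bottle currently pumping
        self.strength = 0.0         # 0.0 = pure dilution air, 1.0 = full scent

    def select_scent(self, channel):
        """Switch delivery to one of the seven scent bottles."""
        if not 0 <= channel < self.NUM_CHANNELS:
            raise ValueError("channel must be between 0 and 6")
        self.active_channel = channel

    def set_strength(self, fraction):
        """Set scent strength; mixing in more dilution air lowers it."""
        self.strength = max(0.0, min(1.0, fraction))
```

In the real hardware the strength setting would presumably drive pump duty cycles; here it is merely recorded as state for illustration.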

Experimental subjects were asked to choose a cookie flavor but were given a plain cookie and told to observe it before eating it. A webcam in the marker-detection unit picked up the pattern on the cookie and calculated the cookie’s position and orientation, as well as the distance between the cookie and the subject’s nose. This information was used to adjust the overlaid image of the chosen flavor so that it stayed registered to the real cookie.
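Once the marker yields the cookie’s 3-D position, the nose-to-cookie distance that later drives the scent strength is simple geometry. A minimal sketch, assuming both positions are 3-D coordinates in a shared camera frame (the function name is ours, not the researchers’):

```python
import math

def cookie_nose_distance(cookie_pos, nose_pos):
    """Euclidean distance between the tracked cookie marker and the
    subject's nose, in whatever units the tracker reports."""
    return math.dist(cookie_pos, nose_pos)
```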

Photo: University of Tokyo

A second webcam positioned near the subject’s nose, pointing downward, detected the point at which the cookie approached the subject’s mouth and signaled the olfactory unit to release the scent of the chosen flavor. By mixing air with the scent, the system could adjust the strength of the smell to 127 different levels and cause the odor to increase in strength as the cookie neared the mouth. According to Narumi, the response time to generate the appropriate scent is just 50 milliseconds, quick enough for the user to experience the strengthening aroma without feeling any change in air pressure.
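The article says the system mixes air with scent to produce 127 strength levels and ramps the smell up as the cookie nears the mouth, but it does not give the mapping from distance to level. A linear ramp is one plausible assumption; the cutoff distance below is likewise invented for illustration:

```python
def scent_level(distance, max_distance=30.0, levels=127):
    """Map cookie-to-nose distance to one of 127 scent-strength levels.
    Assumption: a linear ramp from level 0 at max_distance (a made-up
    cutoff) to full strength at the mouth; the real mapping is not
    described in the article."""
    d = max(0.0, min(distance, max_distance))  # clamp to the valid range
    return round((1.0 - d / max_distance) * levels)
```

Beyond the cutoff distance the level stays at zero, so the scent only begins to build once the cookie starts moving toward the mouth.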

In the latest trials, the researchers tested 43 participants, who were not told the purpose of the experiment. Participants were asked to eat one or two cookies and then write about their taste experience. "In 72.6 percent of all the trials, the participants felt they tasted the kind of cookie they chose," says Narumi. "We had similar results for most of the five flavors chosen, including almond, strawberry, maple, and lemon, though the rate for maple was the lowest, at 55.6 percent."

Narumi delivered a paper on the group’s latest research and trials at the IEEE Virtual Reality conference held in Singapore in March. Now the researchers are working to make the olfactory HMD smaller by replacing the plastic scent bottles with inkjet injection technology. They are also improving the image overlay quality and seeking to add a sense of texture through the use of sound.

As for practical applications, Narumi sees the technology being used to reinforce dietary programs, for instance by making bland food appear tastier, especially for hospitalized patients. He says this might happen in five years. Much further in the future, he says the technology could be used for entertainment in the home "to augment TV viewing pleasure and when playing video games."

This article was modified on 20 May 2011.

About the Author

John Boyd is a Tokyo-based technology and science journalist. He has been covering the ongoing nuclear emergency in Japan since the 11 March earthquake.