Computer learning to read lips to detect emotions

September 12, 2012

Open the pod bay doors, HAL.

Scientists in Malaysia are teaching a computer to interpret human emotions based on lip patterns.

The system could improve the way we interact with computers and perhaps allow disabled people to use computer-based communication devices, such as voice synthesizers, more effectively and efficiently, says Karthigayan Muthukaruppan of Manipal International University.

The system uses a genetic algorithm, which refines its solution with each iteration, to match irregular ellipse-fitting equations to the shape of a human mouth displaying different emotions.

They have used photos of individuals from Southeast Asia and Japan to train a computer to recognize the six commonly accepted human emotions (happiness, sadness, fear, anger, disgust, surprise) and a neutral expression. The upper and lower lips are analyzed by the algorithm as two separate ellipses.
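To make the idea concrete, here is a minimal sketch of how a genetic algorithm can fit ellipse parameters to a set of contour points. This is an illustrative toy, not the researchers' implementation: the population size, selection scheme, and mutation settings are all assumptions, and the "lip" contour is synthetic data sampled from a known ellipse so the fit can be checked.

```python
import math
import random

def ellipse_points(a, b, n=40):
    """Sample n points on an ellipse with semi-axes a and b (a stand-in for a lip contour)."""
    return [(a * math.cos(2 * math.pi * i / n),
             b * math.sin(2 * math.pi * i / n)) for i in range(n)]

def fitness(params, points):
    """Negative sum of squared algebraic residuals of (x/a)^2 + (y/b)^2 = 1."""
    a, b = params
    return -sum(((x / a) ** 2 + (y / b) ** 2 - 1.0) ** 2 for x, y in points)

def evolve(points, pop_size=60, generations=200, seed=0):
    """Toy genetic algorithm: truncation selection, averaging crossover, Gaussian mutation."""
    rng = random.Random(seed)
    pop = [(rng.uniform(0.5, 5.0), rng.uniform(0.2, 3.0)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, points), reverse=True)
        survivors = pop[: pop_size // 4]  # keep the best quarter (elitism)
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            child = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)  # crossover
            if rng.random() < 0.3:  # mutation
                child = (child[0] + rng.gauss(0, 0.1),
                         child[1] + rng.gauss(0, 0.1))
            # Clamp semi-axes to stay positive and avoid division by zero.
            child = (max(child[0], 0.05), max(child[1], 0.05))
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda p: fitness(p, points))

# A wide, shallow ellipse roughly suggesting a smiling mouth.
contour = ellipse_points(3.0, 1.0)
a_fit, b_fit = evolve(contour)
```

With each generation the fittest candidates survive and recombine, so the population's ellipse parameters drift toward those that best match the contour, which is the sense in which such an algorithm "gets better with each iteration."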

The team’s algorithm can successfully classify the six emotions and the neutral expression described, the scientists say.