The new technology could be used in automated call centres to route enquiries differently according to the anxiety level of the caller.

Scientists at the Universidad Carlos III de Madrid (UC3M) and the Universidad de Granada (UGR) said “the system created can be used to automatically adapt the dialogue to the user’s situation, so that the machine’s response is adequate to the person’s emotional state.”

“Thanks to this new development, the machine will be able to determine how the user feels and how the caller intends to continue the dialogue”, claimed one of its creators, David Grill, a professor in UC3M’s Computer Science Department.

The researchers looked closely at the anger, boredom and doubt that people often experience when talking to automated call centre voices. By examining a total of sixty different ‘acoustic parameters’, including tone of voice, the speed of speech, the duration of pauses and the energy of the voice signal, they produced computer models of how people sound according to the emotions they are feeling.
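The article does not describe the researchers' actual model, but the general idea can be sketched as a toy classifier that compares a caller's acoustic measurements with typical profiles for each emotion. Everything below is illustrative: the four features, their values and the emotion profiles are invented for the example (the real study used some sixty acoustic parameters).

```python
# Purely illustrative sketch: a toy nearest-centroid emotion classifier.
# Features per caller: (mean pitch in Hz, words per second,
# mean pause length in seconds, normalised signal energy 0-1).
# All profile values below are made up for the example.
import math

EMOTION_PROFILES = {
    "neutral": (120.0, 2.5, 0.40, 0.50),
    "anger":   (180.0, 3.5, 0.15, 0.90),
    "boredom": (100.0, 1.8, 0.70, 0.30),
    "doubt":   (130.0, 1.5, 0.90, 0.40),
}

def classify_emotion(features):
    """Return the emotion whose profile is closest (Euclidean) to the input."""
    return min(
        EMOTION_PROFILES,
        key=lambda emotion: math.dist(features, EMOTION_PROFILES[emotion]),
    )
```

A real system would normalise each feature (raw pitch in Hz would otherwise dominate the distance) and train the profiles from labelled recordings rather than hard-coding them.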

The authors of the study, which has been published in the Journal on Advances in Signal Processing, said it is important that the machine be able to predict how the rest of the dialogue is going to continue. “To that end, we have developed a statistical method that uses earlier dialogues to learn what actions the user is most likely to take at any given moment”, they claimed.
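The paper's statistical method is not detailed in the article, but "using earlier dialogues to learn what actions the user is most likely to take" can be sketched as a simple first-order transition count: tally which user action most often follows the current one in past calls. The action labels and training dialogues here are invented for the example.

```python
# Illustrative sketch only: predict a caller's most likely next action by
# counting action-to-action transitions observed in earlier dialogues
# (a first-order Markov-style frequency estimate).
from collections import Counter, defaultdict

class DialoguePredictor:
    def __init__(self):
        # transitions[prev_action][next_action] -> observed count
        self.transitions = defaultdict(Counter)

    def train(self, dialogues):
        """dialogues: iterable of action sequences from earlier calls."""
        for actions in dialogues:
            for prev, nxt in zip(actions, actions[1:]):
                self.transitions[prev][nxt] += 1

    def most_likely_next(self, current_action):
        """Most frequent follow-up to current_action, or None if unseen."""
        counts = self.transitions.get(current_action)
        if not counts:
            return None
        return counts.most_common(1)[0][0]
```

For instance, after training on past calls in which "greet" was usually followed by "ask_balance", the predictor would expect "ask_balance" next, letting the system prepare that branch of the dialogue in advance.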

The scientists claimed that using the ‘adaptive’ system produced shorter dialogues, which were usually more successful.