Emotion analysis for real-time interactions

Emotions influence how we interpret situations, what we do, and even how we remember them. Emotions are, therefore, an important driver of how consumers behave and why. Because of this importance, there has been a growth in the number of products that promise to scan online conversations (e.g., in social networks, or on product review websites) in order to detect how customers feel about a particular topic or brand. These products – i.e., sentiment analysis software – can be particularly helpful for detecting brewing social media crises, or for comparing brand perceptions of rival products, even if it is a very challenging exercise (see here, here and here). But that is not the whole extent to which technology is being used to study emotions.
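To give a flavour of the underlying idea, here is a deliberately simple lexicon-based sentiment scorer. This is only a toy sketch: commercial sentiment analysis software uses far richer models, and the word lists below are invented for illustration.

```python
# Toy lexicon-based sentiment scorer (illustrative only; the word lists
# are invented, and real tools use trained models, not word counting).
POSITIVE = {"love", "great", "excellent", "happy", "amazing"}
NEGATIVE = {"hate", "terrible", "awful", "angry", "broken"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: negative, neutral, or positive."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total
```

Even this crude version hints at why the exercise is hard: sarcasm, negation ("not great") and context all break simple word counting.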

There is also software that analyses facial expressions in photos or videos (e.g., via a CCTV camera) to detect emotions. Bentley used this technology in the Bentley Inspirator app, which gauged your reaction to various images and then recommended different cars based on the emotions displayed:
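One common ingredient in such systems is the geometry of facial landmarks. The heuristic below is a hypothetical sketch over invented landmark names and thresholds, just to convey the intuition; real systems use trained models over dozens of landmarks or raw pixels.

```python
# Toy expression heuristic over hypothetical facial landmarks.
# Coordinates are pixels with the origin at the top-left, so smaller y
# means "higher on the face". Landmark names and thresholds are invented.
def classify_expression(landmarks: dict) -> str:
    corner_y = (landmarks["mouth_left_y"] + landmarks["mouth_right_y"]) / 2
    centre_y = landmarks["mouth_centre_y"]
    if corner_y < centre_y - 2:   # mouth corners pulled up -> likely smile
        return "happy"
    if corner_y > centre_y + 2:   # mouth corners pulled down
        return "sad"
    return "neutral"
```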

Similarly, software can analyse speech and infer emotions based on features such as voice tone or how quickly someone is speaking. This technology could be used, for instance, in a call centre to detect whether someone calling a help line is calm or agitated, and to route the call accordingly. Some examples here.
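A minimal sketch of the call-routing idea, assuming we already have a frame of audio samples: compute a loudness proxy and apply a triage rule. The threshold and queue names are invented, and real systems combine many features (pitch, speaking rate, spectral measures) in trained models.

```python
import math

def rms_energy(samples):
    """Root-mean-square energy of an audio frame: a rough proxy for loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def route_call(samples, energy_threshold=0.5):
    """Toy triage rule: louder (possibly agitated) callers go to a
    priority queue. Threshold and queue names are invented for illustration."""
    if rms_energy(samples) > energy_threshold:
        return "priority_queue"
    return "standard_queue"
```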

In turn, eye tracking, which measures features such as point of gaze, eye movement or pupil dilation, can be used in website development to detect which areas of the website are of particular interest to the user. The technology can also be used in video gaming to adjust the options presented to the gamer. Eye tracking has also been used in combination with electroencephalography (i.e., the analysis of brain activity) to develop adverts, as in this example:
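The website use case can be sketched as mapping gaze points onto page regions and counting where attention lands. The region names and coordinates below are hypothetical; real eye-tracking analysis builds continuous heat maps and distinguishes fixations from saccades.

```python
from collections import Counter

# Hypothetical page regions as (name, x0, y0, x1, y1) pixel rectangles.
REGIONS = [
    ("header", 0, 0, 1024, 100),
    ("hero_image", 0, 100, 700, 500),
    ("sidebar", 700, 100, 1024, 500),
]

def region_of(x, y):
    """Return the name of the page region containing a gaze point."""
    for name, x0, y0, x1, y1 in REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "other"

def interest_map(gaze_points):
    """Count gaze samples per region -- a crude attention 'heat map'."""
    return Counter(region_of(x, y) for x, y in gaze_points)
```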

Sounds great, right?

You get recommendations or service that fits your state of mind, while the company manages its resources in a cost-effective way.

But… there is always a but.

The bit that I am not comfortable with is that data about emotions, when collected anonymously, are considered neither personal nor sensitive data; and, therefore, such data are not covered by data protection regulations. Data about my emotions can be collected passively, for instance, while I am at the bus stop, when I enter a store, or when I make a phone call… not to mention the myriad of smartphone applications which have access to my phone’s microphone and camera, or the smart home devices which record and transmit what I say (for instance, this story about Google Home Mini).

There is a very fine line between an interaction being relevant and useful… and just being plain creepy. And I worry that emotion analysis for real-time interactions may be just over that line. Or am I just being a Luddite?

PS: If you want to learn more about the use of technology to read emotions, check the report ‘Empathic Media: The Rise of Emotion AI’, by Professor Andrew McStay (available here).