
Can AI understand human emotions?

Artificial intelligence – it’s been making headlines throughout 2017, from how brands are using it to what we might see in the future. But one of the biggest questions being asked is whether AI will be able to read human emotion, and if so, what the implications are.

The industry has carried out numerous studies, and the outcomes are mostly positive. As consumers use AI more and more in their everyday lives, they are not only starting to accept these systems but to trust them. A study by our very own hosts Mindshare found that one in four people are more willing to trust a customer service chatbot with their sensitive information than a human, for fear of being judged.

So, what happens when an AI system can do more than just answer our most sensitive questions? What will happen if it can display one of the hardest human qualities to replicate – empathy? Will we as consumers be willing to be comforted by an artificial intelligence system? Will we accept it as a shoulder to cry on? The questions are endless, and there’s only one way to find out – test the theory.

At Huddle 2017 we aim to see how consumers react to an AI system that can gauge their emotions simply by reading their facial expressions.

We recently hosted a hackathon where we asked participants to build innovative technology. The teams had 24 hours and plenty of Red Bull, and the results were amazing. From insights delivered by Alexa to cross-device tracking powered by beacons, we were blown away by the ideas the participating teams presented. The standout creation, however, was an AI system built to read facial expressions and reward people based on what it sees, which we invite you to interact with at this year’s Huddle.

We introduce you to ‘Candy-MatiQ’ – a fantasy candy land operated by machines that should put a smile on your face at this year’s Huddle.