Massachusetts-based Affectiva is working on this type of "socio-emotive A.I.," and PCMag met the company's director of market development, Jim Deal, at Unity Technologies' Unite 2016 conference recently.

"Our CEO and co-founder, Dr. Rana el Kaliouby, always had a deep interest in building emotionally aware machines. After she got her PhD at Cambridge University, she did post-doc work with Dr. Rosalind Picard at MIT Media Lab and the two of them spun Affectiva out of that," Deal explained.

Kaliouby is part of Dr. Picard's "Affective Computing" movement, which got its start in the late '90s, Deal said. Today, Affectiva's tech creates "classifiers," or face and emotion algorithms, from an "emotion data repository of almost 4.7 million faces analyzed, and 50 billion emotion data points recorded."

This means Affectiva is constantly measuring and analyzing what is known about human expression, categorizing it into specific emotions (sadness, happiness, anxiety, joy, and so on) in order to track not what we say but how we really feel about something: the limbic brain's knee-jerk reaction rather than the conditioned, socially moderated response.

In a demo of the software, Deal walked in front of a large screen running Affectiva in the background and positioned himself squarely in the camera's machine-vision tracker. Instantly, his face was mapped by hundreds of tiny white dots that shifted as his expression changed, tracking 15 different muscular actions at 14 frames per second and translating the data into pixels to compare and analyze against the database.
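The pipeline described above—per-frame muscular "actions" scored and matched against a large emotion database—can be sketched in miniature. Everything here is illustrative: the action-unit names, the reference profiles, and the nearest-match scoring are assumptions for the sketch, not Affectiva's actual classifiers or API.

```python
# Minimal sketch: classify an emotion from facial action-unit intensities.
# Reference profiles (0.0-1.0 per action unit) stand in for the kind of
# aggregate data a real emotion repository would provide; all values are
# made up for illustration.
EMOTION_PROFILES = {
    "joy":      {"cheek_raise": 0.9, "lip_corner_pull": 0.9, "brow_furrow": 0.0},
    "sadness":  {"cheek_raise": 0.1, "lip_corner_pull": 0.0, "brow_furrow": 0.7},
    "surprise": {"cheek_raise": 0.2, "lip_corner_pull": 0.2, "brow_furrow": 0.0},
}

def classify(frame_actions: dict) -> str:
    """Return the emotion whose reference profile is closest (by squared
    distance) to the action-unit intensities observed in one video frame."""
    def distance(profile):
        return sum((profile[k] - frame_actions.get(k, 0.0)) ** 2 for k in profile)
    return min(EMOTION_PROFILES, key=lambda e: distance(EMOTION_PROFILES[e]))

# One frame's observed intensities; the nearest profile wins.
frame = {"cheek_raise": 0.8, "lip_corner_pull": 0.85, "brow_furrow": 0.05}
print(classify(frame))  # → joy
```

A production system would replace the handful of profiles with a trained classifier over millions of labeled faces, but the shape of the problem—continuous muscular measurements matched against learned emotion categories—is the same.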

Why is this useful? Because what we say and what we do are two different things, and machine intelligence needs to learn this in order to be predictively accurate.

Companies know that humans want to be liked, so they don't always tell the truth about a game, movie, or other commercially available product when asked. They might say it's great, but they're not actually going to hand over any cash to buy the next upgrade. But the company takes them at their word and invests millions in development, only to have the product flop.

Deal became keenly aware of this problem during his time at Microsoft and as founder of Airtight Games, his own games software company.

"Sometimes it was clear, during research trials, that feedback is skewed. You're making games, they're just happy to be there, so they didn't want to be negative and we get a false reading," he said.

Unsurprisingly, companies want to cut to the truth, which is why Affectiva's investors include not just the National Science Foundation and Kleiner Perkins Caufield & Byers but also advertising giant WPP.

It has other uses, too. Affectiva's emotion-recognition SDK is being embedded into games (via a Unity3D plugin and other platforms) and, as a result, gameplay can morph to address a player's psychological responses. (Are you scared of snakes? Guess what's slinking over the horizon to psych you out?) Hollywood is also using it to tweak trailers to ensure sold-out crowds on opening weekend.
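The gameplay-morphing idea above can be sketched with a simple feedback loop: read an emotion score each tick, then tune the next encounter. The function names and thresholds below are hypothetical stand-ins, not the actual API of Affectiva's SDK or its Unity plugin (which is C#-based).

```python
import random

def get_fear_score() -> float:
    """Stub standing in for an emotion-SDK reading in [0.0, 1.0];
    a real integration would poll the SDK's per-frame metrics instead."""
    return random.random()

def choose_encounter(fear: float) -> str:
    """Escalate or ease off based on the player's measured fear."""
    if fear < 0.3:
        return "spawn_snake"       # player is calm: turn up the scares
    elif fear < 0.7:
        return "ambient_hissing"   # keep the tension simmering
    else:
        return "quiet_corridor"    # player is overwhelmed: give a breather

# A few ticks of the loop with fixed scores, for illustration.
for fear in (0.1, 0.5, 0.9):
    print(fear, "->", choose_encounter(fear))
```

The design point is that the game reads the player's involuntary reaction rather than waiting for explicit input, which is exactly the say/feel gap the article describes.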

The other reason companies like facial-expression analysis A.I. is that it's simple to use and doesn't require gelling up electrodes or plugging directly into your brain (like the brain-machine interfaces at Hughes Laboratory and the DARPA-funded work out of UCSD that we've reported on previously).

So what's next for the company? "The future goal of Affectiva is to continue to evolve software and devices that respond to humans with EQ," explained Deal. "Rather than just IQ, or rapid calculations. Already our devices are trying to anticipate our needs in a way we're not already aware of, being predictive in nature. What we're doing, using a simple webcam, is getting real data to back up what people are saying about how they really feel."

With companies like Affectiva, soon your digital device won't just ask why you look sad; it might also suggest how to manage your moods. More helpfully, future embodied A.I. devices running Affectiva or similar software might zoom off to the kitchen to make coffee, or at least instruct the IoT system to open the blinds and let the sunshine pour in.
