Over the weekend, The Australian revealed that Facebook's ad targeting tools may be able to prey on teen insecurities. It's the sort of story that weds traditional advertising's worst tendencies with the creepy omnipotence of its modern form in a way that makes one wonder if Facebook is actively cribbing ideas from episodes of Black Mirror.

To hear some marketing industry figures tell it, though, ads targeted on the basis of perceived emotion are not only harmless, they're actually a benevolent alternative to those steered by more impersonal forms of data.

"It humanizes the consumer and it brings to the forefront behaviors and attitudes," says Alanna Gombert, general manager of trade group Interactive Advertising Bureau's tech lab. "In some ways, it's the right thing to do. It equates to a person and not a targeting item."

Gombert adds the caveat that this only holds when customers are aware of what is happening (which they weren't in Facebook's case).

Obvious conflicting interests aside, the difference in opinion here shows how easy it is for ubiquitous modern ad targeting techniques to veer into unacceptable territory. The line between what's considered beyond the pale and what's everyday targeting is infinitesimally thin.

Consumers who don't take a hard-line stance towards data privacy might appreciate seeing ads relevant to their hobbies or age, for instance. But add race to the equation — as Facebook was caught doing last fall — and advertising can quickly turn discriminatory.

The same could go for emotion-targeting. Sure, some consumers might not necessarily mind an ad that reinforces their upbeat mood. But hitting emotional underage users with ads when they feel "worthless," "stressed" or "defeated" — as Facebook reportedly studied in Australia — smacks of the type of emotional manipulation of which advertisers have always been accused.

"It's something you have to be careful with," says Michel Wedel, a consumer science professor at University of Maryland's business school whose research deals with emotional reactions to ads. "Targeting people who are underage is something marketers have to do with great care in general, and that holds even more true for using emotions and vulnerable states."

Facebook issued a rare mea culpa after leaked documents revealed the project on Sunday. But a spokesperson denied in a statement that the research was ever actually used for ad targeting.

"Facebook does not offer tools to target people based on their emotional state," the spokesperson said. "The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook."

That doesn't necessarily mean Facebook won't continue to look for ways to play to its users' emotions — or that advertisers can't easily find makeshift ways to do so with Facebook's hyper-specific suite of targeting tools.

The rise of mood marketing

Advertisers have obviously always relied on emotion to make their pitches resonate, but one of the first examples of online targeting in this respect was a company called Mindset Media, according to Gombert.

That startup, which Google bought along with messaging platform Meebo in 2012, aimed to build personality and psychographic profiles of consumers with which brands could steer ads.

Since then, various tech heavyweights have flirted with the practice. Apple locked down a patent to explore the idea in 2014; Spotify announced it would start adjusting ads to playlist moods the next year; and eBay launched a mood-based ad tool the year after that. More recently, GIF platform Tenor has started pitching advertisers on being able to put relevant content next to real-time emotions.

A company called Mediabrix — rebranded to Receptiv as of last month — saw breakneck growth with a model that tailors in-app ads to how the outcomes of a mobile game might affect your emotions.

For the most part, though, the technology failed to gain widespread traction, according to Gombert. Defining mood is still too much of a soft science for advertisers who like hard numbers.

"It's very qualitative, and qualitative is hard to measure," she says. "Because it's hard to measure and hard to equate and hard to think about. People default to the 'normal' targeting methods: age, gender, inclination to shop."

That's not to say advertisers aren't interested. Or that progress isn't still being made.

"The measurement of emotions is becoming easier and easier," Wedel says. "It's not unlikely that facial expressions in pictures can be analyzed to determine what emotion a person has at a particular time — similar to what we did with front-facing cameras [in experiments]."

Facebook feels

Meanwhile, as Google was gobbling up Mindset Media, Facebook was in the midst of a creepy social experiment meant to test whether it could control the mood of its users through their News Feeds.

For one week at the start of 2012, the social network adjusted its algorithms in order to ply nearly 700,000 users with content that triggered happy or sad feelings. It then recorded how the tweaks affected what the people subsequently posted.

Facebook users were of course unaware at the time that the company's terms and conditions had rendered them lab rats. The controversial study would remain a secret until it was published two years later.

"Emotional states can be transferred to others via emotional contagion," its authors concluded, "leading people to experience the same emotions without their awareness."

In other words, the tampering had worked.

As far as we know, that sort of research has never actually been translated into targeting tools. Attributes related to mood or emotion are not among the 98 disconcertingly detailed data points the company has said it uses to steer ads.

Facebook does admit that it routinely studies user reactions and emotions by way of its "Compassion Research Team," a group of engineers, neurologists, psychologists, and sociologists ostensibly tasked with keeping interactions on the platform as humane and civil as possible.

Some of the research collected by the team was funneled into the "Reactions" feature Facebook launched last year — a palette of emoji alternatives to "likes" based on what data show were the most common emotions expressed on the site.

Advertisers can't use "Reactions" information to direct ads, but Facebook does supply them with a more detailed read-out than the average user.

A thin line

The fact that Facebook doesn't explicitly allow for emotional targeting may ultimately be irrelevant. The platform offers enough nuance through its ultra-specific metrics that working backwards from research findings and jerry-rigging a proxy category from existing options may not be all that hard.

Say you wanted to address vulnerable teens, as Facebook did in its research. Setting an age range is easy enough.

Beyond that, there are plenty of anonymized factors Facebook collects through a combination of third-party store data, web tracking, and actual profile information that might be triangulated to create a window into mental state.
