A leaked document reveals Facebook knows when teens feel 'useless,' but the company denies serving targeted ads based on emotions

The Australian (paywall) got its hands on a 23-page Facebook document, dated 2017, marked “Confidential: Internal Only,” and authored by two Australian Facebook executives, Andy Sinn and David Fernandez. While no screenshots were included, the report allegedly explained how Facebook could analyze posts, photos and interactions to help determine the emotional states of 6.4 million “high schoolers,” “tertiary” (college) students and “young Australians and New Zealanders ... in the workforce.”

Snippets from the report include how Facebook can determine when young people are interested in “looking good and body confidence” or “working out and losing weight.” While that’s handy for targeting ads, Facebook’s algorithms can also determine in real time “moments when young people need a confidence boost.” According to the report, the document claims that “Monday-Thursday is about building confidence” and “anticipatory emotions,” while the weekend is for “broadcasting achievements” and “reflective emotions.”

The document also reportedly claims that by monitoring posts, Facebook can estimate when teenagers are feeling “worthless,” “useless,” “defeated,” “stupid,” “overwhelmed,” “insecure,” “stressed,” “anxious,” “nervous” or like “a failure.”


The second most powerful executive at the company, Sheryl Sandberg, says experiments were ‘poorly communicated’

The experiment, revealed by a scientific paper published in the March issue of the Proceedings of the National Academy of Sciences, hid "a small percentage" of emotional words from people's news feeds, without their knowledge, to test what effect that had on the statuses or "likes" that they then posted or reacted to.

“This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” said Sandberg, Facebook’s chief operating officer, while in New Delhi. “And for that communication we apologise. We never meant to upset you.”

Facebook’s first public comment on the experiments came as the social network attempted to woo Indian advertisers as part of its efforts to tailor adverts to users outside the US. The aim of the government-sponsored study was to see whether positive or negative words in messages would lead to positive or negative content in status updates.

The company's researchers concluded, after tweaking the content of people's "news feeds," that there was "emotional contagion" across the social network, by which people who saw one emotion being expressed would themselves express similar emotions.

Facebook has created software that would allow a third party to suppress posts from appearing in news feeds in specific geographic regions, current and former Facebook staff told the New York Times. Facebook has not offered the tool to China, and may never do so, according to the NYT.