Facebook Built an AI That’s a World-Class Expert on Porn

One thing you won’t be able to use Facebook’s new WhatsApp cryptocurrency for is buying or selling revenge porn on the social media platform. Facebook, the company revealed today, just made an AI that’s an expert on porn.

It’s like U.S. Supreme Court Justice Potter Stewart, who famously said in Jacobellis v. Ohio (1964) that his threshold test for pornographic obscenity was “I know it when I see it.”

Amazingly, Facebook’s artificially intelligent algorithm is wired up to respond to the same visual cues in photographs that sexually arouse human viewers.


Facebook Announces AI to Combat Revenge Porn

As Facebook pursues a vision of being the platform you trust for privacy, Mark Zuckerberg wants you to know the platform has your sexy pictures on lockdown. | Source: AFP PHOTO / JOSH EDELSON

With the ability to detect nude and “near-nude” intimate photos, the robo-content cop can help stop malicious users from posting or selling people’s intimate selfies.

Such images are often posted without the knowledge or permission of the person pictured, whether because the poster doesn’t care or with deliberate intent to embarrass the victim.

But Facebook claims its new revenge porn and graphic content dragnet is sophisticated enough to block not only nude photographs but also any content that a person would recognize as serving a prurient interest. The company is keen to drive that point home:

“Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram.”

Facebook Wants You to Upload Your Nude Selfies to Get the Image File Blacklisted

The strangest part of Facebook’s revenge porn blocker is that the Silicon Valley giant is seriously asking users to preemptively upload image files of their nude selfies so Facebook can auto-block them. What could possibly go wrong? It’s like an Onion headline, but it’s CNBC:

“Users worried an inappropriate image might appear on Facebook’s platforms are asked to send an intimate image via Messenger, a preventive measure designed to flag the images before they’re shared.”

Who at Facebook is Reviewing Nude Photos?

As part of the program, Facebook users upload pictures to a “secure, one-time upload link,” which will then be reviewed by a “handful of specially trained members of our Community Operations Safety Team,” according to Facebook.
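Facebook has said the program stores a digital fingerprint (a hash) of the submitted image rather than the image itself, then blocks future uploads that match. Here is a minimal sketch of that hash-and-blacklist flow; the class and function names are hypothetical, and SHA-256 is a stand-in for the perceptual hashing (e.g., PhotoDNA-style) a production system would need to catch resized or re-encoded copies:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Exact-match fingerprint for illustration. Real systems use
    # perceptual hashes that survive resizing and re-encoding;
    # SHA-256 only matches byte-identical files.
    return hashlib.sha256(image_bytes).hexdigest()

class RevengePornBlocklist:
    """Hypothetical blacklist: stores hashes, never the images."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register(self, image_bytes: bytes) -> None:
        # After hashing, the image itself can be discarded.
        self._hashes.add(fingerprint(image_bytes))

    def is_blocked(self, image_bytes: bytes) -> bool:
        # Checked at upload time, before the post goes live.
        return fingerprint(image_bytes) in self._hashes

blocklist = RevengePornBlocklist()
blocklist.register(b"victim-submitted image bytes")
print(blocklist.is_blocked(b"victim-submitted image bytes"))  # True
print(blocklist.is_blocked(b"some other image"))              # False
```

The design point worth noting is that only the hash needs to persist, which is why Facebook can claim the original photo is deleted after review.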

Who are the people working in Facebook’s nude selfie reviewer department, and did they need prior experience at the NSA to get an interview?

In all seriousness, Facebook needs to make it clear whether or not human eyes are reviewing these nude selfies that get preemptively uploaded, because it sounds like they are.

About The Author

Grew up reading Isaac Asimov, J.R.R. Tolkien, The Bible, Ayn Rand, John Locke, and Robert Heinlein while listening to conservative talk radio, reading used economics textbooks, and reading through most mainstream political newspapers and magazines.