Finally, a neural net with the sensibilities of a 13-year-old boy.

To understand how image recognition algorithms “see,” companies like Google can force their software to create images based on its training, rather than simply sort them. It’s the same technique that gave us those bizarre images of dogs created by Google’s artificial neural network Deep Dream. The images made sense once you learned that the software’s training data skewed heavily canine. If all you’d ever seen were dogs, you’d probably see dogs everywhere, too. Clouds become dogs. Trees become dogs. Everything is, essentially, a dog.

Now I want you to imagine another world. One where, instead of dogs, a neural network was trained on images of breasts and genitalia.

This is the worldview of Yahoo’s NSFW image filtering software, which was created to screen objectionable content from its websites, a major problem that many companies are striving to better automate with AI. The algorithm sorts images into one of two buckets: Safe for Work (SFW) or Not Safe for Work (NSFW). Then, as Prosthetic Knowledge first spotted, a PhD student at UC Davis named Gabriel Goh flipped the switch into generation mode, coaxing the classifier into producing images of its own. What the software created depicted each image type, SFW and NSFW, with uncanny clarity. The SFW images look like green mountains mixed with rippling water, topped with feathers, maybe, from various animal species. It’s like a scrambled vision of Animal Planet. Strangely soothing. Peaceful. Vaguely rated G.
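The trick behind that "generation mode" is usually activation maximization: freeze the trained network's weights, start from noise, and nudge the pixels themselves uphill until the model's score for a chosen class climbs. Goh worked with Yahoo's actual deep convolutional model; the sketch below is only a toy illustration of the same idea, using a single made-up logistic unit in place of the real network (all names and numbers here are hypothetical).

```python
# Toy sketch of "generation mode" (activation maximization).
# Instead of classifying an image, we hold the classifier's weights
# fixed and run gradient ascent on the pixels so the predicted
# "NSFW" score climbs. This is NOT Goh's code or Yahoo's model --
# just a minimal stand-in for the technique.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "classifier": one logistic unit over a flattened 8x8 image.
# Yahoo's real filter is a deep convolutional network.
weights = rng.normal(size=64)
bias = 0.0

def nsfw_score(pixels):
    """Probability the toy model assigns to its 'NSFW' class."""
    return 1.0 / (1.0 + np.exp(-(pixels @ weights + bias)))

# Start from near-noise, the way the dreamed images do.
image = rng.normal(scale=0.01, size=64)

for _ in range(200):
    p = nsfw_score(image)
    # Gradient of the sigmoid score with respect to the pixels.
    grad = p * (1.0 - p) * weights
    image += 0.5 * grad            # ascend: make the image score higher
    image = np.clip(image, -1, 1)  # keep pixels in a valid range

print(f"final score: {nsfw_score(image):.3f}")
```

In a real deep network the same loop runs through backpropagation, and the pixels that emerge reflect whatever features the training data burned in, which is exactly why a model trained on explicit imagery dreams in genitalia.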

And then there are the NSFW images. They hit you like psychedelic uber genitalia, where the only color is flesh-colored, where labia, testes, and penises sprout forth from unknown origins like perverse Rorschach tests. It’s like the terrified vision of some non-corporeal alien intelligence who has heard about sex but only has a loose understanding of how all the parts come together to make the mechanics work. Or, maybe, a diagram of the average Redditor depicting the G-spot.

The difference between SFW and NSFW becomes most compelling when Goh runs the same images through each opposing setting and shows the results side by side. If you thought Shark Week was scary, wait until you read the most literal interpretation of Moby Dick ever penned by an algorithm. If you have ever likened the beauty of the Grand Canyon to a woman’s bosom, let me assure you, the two natural wonders should never be mixed. And if you find yourself lost in the hot desert, DO NOT VENTURE INTO THE CAVE. IT IS NOT WHAT YOU THINK IT IS.

Yet while this image set may look like laughable surrealist smut, “If anything, it shows [Yahoo] is doing its job,” says Goh of the sophisticated representations needed to generate such salaciousness. “What’s interesting about this network is that skin color doesn’t play a massive role in the network’s judgement. It doesn’t seem to be using any simple heuristic. Take the porny water towers, for example. There’s no color there even resembling skin, just the general gestalt of male genitalia. The convolutional net is recognizing higher-level concepts which make sense. And I think it’s doing absolutely the right thing by classifying the towers as NSFW.”

Thanks for keeping our workplaces safe from inappropriate orifices, Yahoo. But I think we should all ask: what monster hast thou unleashed on the world to do so?