Facebook says it needs your explicit photos to combat revenge porn

Facebook's image-matching technology uses the digital footprint of a photo to prevent future uploads of the image.

Travis M. Andrews, The Washington Post

Would you voluntarily send Facebook nude photos of yourself? The company is insisting it needs them — for your own protection.

Let's say you have a spiteful ex who decides to embarrass you by posting a nude photo taken in private. Facebook says if you send the photo to the company first, it will make sure it never shows up on its site.

But can you trust Facebook? The company says it won't store the photos; instead, it will create a digital footprint of each image so that its image-matching technology can block any future upload of a copy of the photograph.
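Facebook's actual matching technology is proprietary, so the sketch below is only an illustration of the general idea behind such a "digital footprint": reduce an image to a short perceptual fingerprint that survives re-encoding, then compare fingerprints instead of storing the image itself. The average-hash scheme and the toy pixel grids here are assumptions for demonstration, not Facebook's method.

```python
def average_hash(pixels):
    """Compute a simple average hash from a grayscale pixel grid.

    `pixels` is a list of rows of brightness values (0-255), assumed
    already downscaled to a small fixed size. Each bit of the hash
    records whether a pixel is brighter than the image's average.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 2x2 "images": the second is a slightly re-encoded copy of the
# first, the third is an unrelated picture.
original = [[200, 50], [60, 210]]
reencoded = [[198, 52], [61, 209]]
different = [[10, 240], [230, 20]]

h_orig = average_hash(original)

print(hamming_distance(h_orig, average_hash(reencoded)))   # 0 -> match, block upload
print(hamming_distance(h_orig, average_hash(different)))   # 4 -> unrelated image
```

The key property is that only the short bit string needs to be retained to catch re-uploads, which is why Facebook can claim it keeps the fingerprint rather than the photo itself.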

The one caveat is that the original image file needs to be uploaded, The Verge reported.

That's where the system can backfire, according to digital forensics expert Lesley Carhart, who said it's not that simple to completely delete a digital photograph.

"Yes, they're not storing a copy, but the image is still being transmitted and processed, leaving forensic evidence in memory and potentially on disk," Carhart told Motherboard. "My specialty is digital forensics and I literally recover deleted images from computer systems all day — off disk and out of system memory. It's not trivial to destroy all trace of files, including metadata and thumbnails."

Facebook is piloting the program in Australia in partnership with the country's Office of the eSafety Commissioner, a government agency dedicated to online safety. Next, it'll be tested in the United States, Britain and Canada, the Times of London reported.

"It would be like sending yourself your image in email, but obviously this is a much safer, secure end-to-end way of sending the image without sending it through the ether," Australia's eSafety commissioner, Julie Inman Grant, told the Australian Broadcasting Corporation. "They're not storing the image, they're storing the link and using artificial intelligence and other photo-matching technologies."

Carrie Goldberg, a New York-based lawyer who specializes in sexual privacy, told the Guardian she is "delighted" with the initiative and thinks it can help fight revenge porn.

"With its billions of users, Facebook is one place where many offenders aggress because they can maximize the harm by broadcasting the nonconsensual porn to those most close to the victim," she said. "So this is impactful."

Revenge porn isn't uncommon in the United States. Four percent of internet users have fallen victim to it, and 10 percent of women under 30 have had someone threaten to post explicit photos of them online against their will, according to a 2016 study by Data & Society.

Earlier this year, for example, Facebook shut down a private group called Marines United, in which more than 30,000 members, many of them active-duty Marines, solicited and shared nude photos of their female colleagues.

"We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly," Grant told the Australian Broadcasting Corporation.

The pilot program is a more advanced version of the system Facebook announced in April, in which the social media network would use image-matching technology to identify and block the uploading and sharing of photographs that had previously been reported and removed from the site.

The difference is that for that system to work, the photo needed to already have been uploaded to Facebook. In the new system, Facebook says, users upload the photos themselves as a preventive measure.