The growing scourge of “revenge porn,” which involves the distribution of sexually explicit material without the consent of all parties concerned, has prompted a number of countries to legislate against such activities. Thirty-five U.S. states now have revenge porn laws, as do the U.K., Germany, and Israel.

While laws play an important part in curbing revenge porn attacks, preventing the distribution of such material is obviously the first priority, which is why technology companies have been increasingly working on new tools to help put a stop to the practice. Google has rules and processes in place to remove revenge porn imagery, as do Microsoft and blogging platform Medium, among others.


Facebook is known to have been working on specific revenge porn tools for a while, and the company updated its community standards guidelines back in 2012 to address revenge porn specifically. But today Facebook has offered its first glimpse of new tools to combat the sharing of explicit personal imagery, features that were developed “in partnership with safety experts.”

Moving forward, if you see an “intimate” photo shared on Facebook, Messenger, or Instagram that appears as though it may have been uploaded without permission, you can report it using the Report option in the drop-down menu, accessed via the downward arrow next to the post.

“Specially trained representatives from our Community Operations team review the image and remove it if it violates our Community Standards,” explained Antigone Davis, Facebook’s head of global safety, in a blog post announcing the new tools. “In most cases, we will also disable the account for sharing intimate images without permission. We offer an appeals process if someone believes an image was taken down in error.”

Facebook has courted controversy in the past for the way its algorithms decide what content is pushed to the masses, so it’s heartening to know that in revenge porn instances the company will be using humans to assess the situation. However, it has also landed in trouble when using human editors, perhaps most notably when it removed the Pulitzer Prize-winning “napalm girl” picture and banned the writer who shared it.

Facebook says it won’t be relying exclusively on humans when revenge porn reports are filed. Indeed, once it has been decided that a photo shouldn’t be shared across the company’s various platforms, Facebook will use photo-matching smarts to prevent the image from being reposted. “If someone tries to share the image after it’s been reported and removed, we will alert them that it violates our policies and that we have stopped their attempt to share it,” added Davis.
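Facebook hasn’t published the details of its photo-matching technology, but systems of this kind are commonly built on perceptual hashing: a reported image is reduced to a compact fingerprint, and new uploads are compared against a blocklist of fingerprints, tolerating minor edits and re-encoding. The sketch below is a minimal, purely illustrative “average hash” in Python, not Facebook’s actual system; the 8x8 grayscale grids stand in for images that a real pipeline would first decode and downscale.

```python
# Illustrative perceptual "average hash" -- NOT Facebook's actual system.
# Images are represented as 8x8 grids of grayscale values (0-255) for
# simplicity; a real pipeline would decode and downscale the upload first.

def average_hash(pixels):
    """Return a 64-bit fingerprint: each bit is 1 if the corresponding
    pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count the bits that differ between two fingerprints."""
    return bin(h1 ^ h2).count("1")

def is_match(h1, h2, threshold=5):
    """Treat two images as 'the same' if their fingerprints differ in
    only a few bits, which survives re-encoding and small edits."""
    return hamming_distance(h1, h2) <= threshold

# A reported image, a lightly altered re-upload, and an unrelated image.
original = [[255 if (r + c) % 2 else 0 for c in range(8)] for r in range(8)]
reposted = [row[:] for row in original]
reposted[0][0] = 200  # small edit: one pixel changed
unrelated = [[10] * 8 for _ in range(8)]

print(is_match(average_hash(original), average_hash(reposted)))   # True
print(is_match(average_hash(original), average_hash(unrelated)))  # False
```

Because matching is done on fingerprints rather than the raw files, the service only needs to store hashes of reported images to block repeat uploads, which is why a trivially re-saved or slightly cropped copy can still be caught.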

Additionally, Facebook says that it’s teaming up with a number of safety organizations that will offer extra resources and support to revenge porn victims.