“When this content, often referred to as ‘revenge porn,’ is reported to us, we can now prevent it from being shared on Facebook, Messenger and Instagram,” said Facebook’s Head of Global Safety Antigone Davis, in a Wednesday release. “This is part of our ongoing effort to help build a safe community on and off Facebook.”

The new process for reporting and removing revenge porn begins with individual users, who flag an image that “looks like it was shared without permission.” “Specially trained representatives” from Facebook’s Community team then review the flagged post and can disable the account that shared it. The owner of a disabled account can appeal if they believe an error was made.

Once Facebook is aware that a sexual image may have been shared without permission, it will use artificial intelligence and image-recognition technology to block others from reposting it, CEO Mark Zuckerberg said Wednesday.

Offending images are kept for a period of time, Facebook said on Wednesday. After this period, the company relies on the image’s “hash” — a kind of virtual fingerprint — to identify and block reposts. Facebook cannot reconstruct the photo from a hash, it said.
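The idea of matching by hash rather than by stored image can be sketched in a few lines. Facebook has not disclosed its actual algorithm; real photo-matching systems typically use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below (a hypothetical stand-in) matches only byte-identical files:

```python
import hashlib

# Hypothetical repository of fingerprints for flagged images.
# Only the hash is stored, not the image itself, so the original
# photo cannot be reconstructed from what is kept.
blocked_hashes = set()

def image_hash(image_bytes: bytes) -> str:
    """Return a one-way 'fingerprint' of the image bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Record the fingerprint of an image confirmed as non-consensual."""
    blocked_hashes.add(image_hash(image_bytes))

def is_blocked(image_bytes: bytes) -> bool:
    """Check an uploaded image against the fingerprint repository."""
    return image_hash(image_bytes) in blocked_hashes
```

This illustrates why the system works in one direction: the hash identifies a repost of the same image, but nothing in the stored fingerprint reveals the picture's content.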

Revenge porn evolved with the web

Debate has spread over the term “revenge porn,” hinging mainly on whether the label “obscures the reality of the harm” caused when sexual images of an individual are distributed without permission. Some, like professor Clare McGlynn of Durham Law School in England, prefer the term “image-based sexual abuse.”

Non-consensual porn has been part of mainstream culture for decades. According to multiple timelines, 1980 was the first time naked photos of a woman were published without her permission in the U.S. — in Hustler magazine. The images had reportedly been stolen.

The genre evolved as the internet did. In the 2000s, porn sites and message boards dedicated solely to the content emerged. Some published subjects’ full names and social profiles and encouraged viewers to mock, taunt, or harass the subject — ensuring that current and prospective employers, family members, and others could find the posts via search engines.

As some of these sites shut down — the most infamous of them, IsAnyoneUp?, finally closed in April 2012 — non-consensual porn increasingly infiltrated social media platforms like Facebook. As recently as March, Facebook was criticized for what some saw as a sluggish response to allegations of shared nude and sexual images.

A recent study conducted by the Cyber Civil Rights Initiative found that 93 percent of those in the U.S. whose intimate images have been shared without their permission experience “significant emotional distress.”

An overwhelming 90 percent of non-consensual porn subjects are women, the study found; 70 percent of those are between 18 and 30. And most of the offending material was posted by an ex-boyfriend.

Facebook’s gesture to curb revenge porn was celebrated by cyber civil rights advocates on Wednesday, some of whom had worked with the company to develop the new tool. The platform is the first major social network to implement this kind of technology. Since Facebook is a tech leader, the move might signal more change to come.

But questions still abound: How many employees at Facebook have access to the images? The privacy-conscious might also worry about Facebook, a private company, keeping a repository of images flagged as revenge porn for its matching technology. How long are the offending images retained? Facebook has not disclosed this information.

Meanwhile, state governments are also working to combat non-consensual porn — 35 U.S. states and Washington, D.C. have laws against it. But as Wired notes, enforcement is often lacking, and U.S. lawmakers are “slow to act.”

Considering this, it’s likely more private tech companies will implement new tools to combat revenge porn in the meantime.