Carrie Goldberg is working on a lawsuit that could change the internet forever. Goldberg—an attorney who has made her name helping victims of cyber harassment—has already helped criminalize revenge porn in New York state. She argues that we do not take sexual privacy rights seriously enough and that, “if somebody is injured, the person or entity responsible must pay.”

I caught up with Goldberg to discuss her new book, Nobody’s Victim: Fighting Psychos, Stalkers, Pervs, and Trolls. We talked about why it took so long to criminalize revenge porn, what other cyber crimes might be falling through the cracks of our legal system, and why she wants to eradicate a piece of legislation that created the internet as we know it. This interview has been lightly edited for clarity.

You worked on the New York bill criminalizing revenge porn, which passed in February. Why did it take so long and why isn’t there a federal law like this? Who are the people lobbying against these bills?

For a very long time, it was civil libertarian organizations and so-called internet freedom fighters—the ACLU, the Electronic Frontier Foundation—who felt that peddling somebody’s naked images was a vital part of free speech. They disseminated a lot of propaganda that did a lot of harm, basically saying that revenge porn laws would chill speech on the internet. So there was reluctance, because for a lot of people these are really trusted organizations.

Or it’s just a lack of interest. Mainly women are the victims of these crimes, and we still have a lot of states with mostly older, male lawmakers in control and they don’t necessarily want to use their political power to be fighting for sexual privacy. On a federal level, there is the SHIELD Act that was introduced earlier this spring. It is largely a bipartisan issue but it’s very difficult to get laws moving right now. There are so many urgent issues and there are a lot of different political concerns that people have, like climate change and gun control, that aren’t about sexual privacy.

Revenge porn does seem to be a novel situation where our existing laws might not fit. Are there other examples like this?

Revenge porn definitely falls through the cracks of our laws. We’re seeing it again with deepfakes. As those cases move forward, it’s going to be very difficult to show where in the legal system they lie. I think with deepfakes they’ve already created something that is outside the boundary of most revenge porn laws—though possibly we could say it’s defamation because it’s a manipulation of truth to make it look like somebody did pornography who didn’t.


In other cases, it’s important to understand the conduct as opposed to the technology. When a victim reports a crime, they’ll say “I’ve been the victim of cyber harassment,” and a cop might think, “She’s being called a bitch on Twitter—there’s nothing we can do about that.” But she might be getting specific threats via text message and Facebook from multiple people threatening to have her raped. That’s not “cyber harassment”; that’s stalking. That’s conduct for which criminal codes already exist.

You’ve been vocal in criticizing Section 230, a provision of a law that protects websites from liability. Why is that?

The biggest barrier with the law that I deal with is Section 230 of the Communications Decency Act, which immunizes companies for content provided by a third party. It immunizes an entire industry—not just an entire industry but the most omnipotent, omniscient, data-rich, wealthy industry in the history of the universe. If you can never be sued for anything bad or negligent, you don’t have an incentive to build basic safety features into the products to prevent people from being harmed by them.

You’re working on a major Section 230 case, Herrick v. Grindr. How is the law falling short there?

Our client, Matthew Herrick, was being impersonated by his ex-boyfriend on the gay dating app Grindr. His ex posted pictures of Matthew in the profile and then would communicate via direct message with strangers, pretending to be Matthew and setting up sex dates, giving them the locations and using Grindr’s geolocating technology to send men to Matthew’s home and to his job. Over the course of several months, more than 1,200 strangers came to Matthew believing that they had a sex date. They came in person.

This was the weaponization of a product and Matthew reported it to the police 14 times. He had a restraining order against his ex-boyfriend, but none of that would stop him from being impersonated. Grindr really held the keys to the kingdom. Grindr knew, but their lawyers told us they didn’t have the ability to “exclude and identify” a user, which was to us outrageous. We then came up with a pretty novel legal theory of suing them under product liability. The courts said that Grindr was protected by Section 230 and everything could be chalked up to content created by his ex. A couple weeks ago, we petitioned the Supreme Court to hear the case.

How would you amend Section 230 to prevent these types of situations?

I don’t think it should be amended. I think it should be eradicated. Everyone is saying, “this will completely erode free speech.” How? Anybody who sues will still have to say that the issue was caused by the tech company and would have to prove harm. That’s an extraordinary burden.