Closing Online Spaces Won’t End Trafficking

One of the most egregious problems with FOSTA and SESTA is the difficulty of determining whether a given posting online was created in aid of sex trafficking. Even if you can assess that a given posting is an advertisement for sex work—which can be far from obvious—how can a platform determine whether force or coercion played a role? Under SESTA, that uncertainty would force platforms to err on the side of censorship.

SESTA supporters consistently underestimate this difficulty, even suggesting it should be trivial for web platforms to build bots that remove posts in aid of sex trafficking but keep everything else up. That’s simply not true: automated filters can be useful as an aid to transparent, human moderation, but when they’re given the final say over who can and can’t speak online, innocent users are invariably pushed offline.

The House Judiciary Committee appears to have attempted to sidestep this problem, but it’s potentially created a larger problem in the process. That’s because the new version of FOSTA isn’t primarily a sex trafficking bill; it’s a prostitution bill. This bill would expand federal prostitution law such that online platforms would have to take down any posts that could potentially be in support of any sex work, regardless of whether there’s any indication of force or coercion, or whether minors were involved.

The bill includes increased penalties if a court finds that the offense constituted a violation of federal sex trafficking law, or that a platform facilitated prostitution of five or more people. As Professor Eric Goldman points out in his excellent analysis of the bill, the threshold of five prostitutes would implicate nearly any online platform that facilitates prostitution. If a prosecutor could convince a judge that a platform had had the “intent” to facilitate prostitution, then those enhanced penalties would be on the table.

It’s easy to see the effect that those extreme penalties would have on online speech. The bill would push platforms to become more restrictive in their treatment of sexual speech, out of fear of criminal liability if a court found that they’d had the intent to facilitate prostitution. Ironically, such measures would make it more difficult for law enforcement to find and stop traffickers.

Section 230 Is Still Not Broken

Some supporters of SESTA and FOSTA wrongly claim that Section 230 (the law protecting online platforms from some types of liability for their users’ speech) prevents any civil lawsuits against online intermediaries for user-created material that they host. That’s not true. Fair Housing Council of San Fernando Valley v. Roommates.com set a standard for when a platform loses Section 230 immunity in civil litigation—when the intermediary has contributed to the illegal nature of the content. As the Ninth Circuit said: “A website helps to develop unlawful content, and thus falls within the exception to Section 230, if it contributes materially to the alleged illegality of the conduct.”

We think the authors of this new version of FOSTA attempted to acknowledge the Roommates.com line of cases that discuss when a platform will lose Section 230 immunity against a civil claim. However, courts assume that Congress doesn’t write superfluous language. With that in mind, the new FOSTA can be read to authorize civil claims against platforms for user-generated content beyond what existing case law has allowed. The bill would allow civil suits against platforms that were responsible for “the creation or development of all or part of the information or content provided through any interactive computer service.”

That distinction between contributing to part of the content and materially contributing to the illegal nature of the content is an extremely important one. The former could describe routine tasks that online community managers perform every day. It’s dangerous to pass a bill that could create civil liability for the everyday work of running a discussion board or other online platform. The liability would be too high to stay in business, particularly for nonprofit and community-based platforms.

Bottom Line: SESTA and FOSTA Are the Wrong Approach

With this new version of FOSTA, House Judiciary Committee Chair Bob Goodlatte and his colleagues on the Committee have clearly attempted to narrow the types of platforms that would be liable for third-party content that reflects sex trafficking. But a less bad bill is not the same thing as a good bill. Like SESTA, the proposed new FOSTA bill would result in platforms becoming more restrictive in how they manage their online communities. And like SESTA, it would do nothing to fight sex traffickers.

Supporting bills like FOSTA and SESTA might help members of Congress score political points with their constituents, but Congress must do better. Congress should urgently pursue real solutions that help find and apprehend sex traffickers, not measures that create more online censorship.
