EFF to Court: Remedy For Bad Content Moderation Isn’t to Give Government More Power to Control Speech

We’ve taken Internet service companies and platforms like Facebook, Twitter, and YouTube to task for bad content moderation practices that remove speech and silence voices that deserve to be heard. We’ve catalogued their awful decisions. We’ve written about their ambiguous policies, inconsistent enforcement, and failure to appreciate the human rights implications of their actions. We’re part of an effort to devise a human rights framework for removing or downgrading content and accounts from their sites, and we are urging all platforms to adopt it as part of their voluntary internal governance. Just last week, we joined more than 80 international human rights groups in demanding that Facebook clearly explain how much content it removes, both rightly and wrongly, and provide all users with a fair and timely method to appeal removals and get their content back up.

These efforts have thus far been directed at urging the platforms to adopt voluntary practices rather than calling for them to be imposed by governments through law. Given the long history of governments using their power to regulate speech to promote their own propaganda, manipulate the public discourse, and censor disfavored speech, we are very reluctant to hand the U.S. government a role in controlling the speech that appears on the Internet via private platforms. This is already a problem in other countries.

We recently filed an amicus brief in a case currently before the United States Court of Appeals for the Ninth Circuit addressing the proper role, if any, U.S. courts and other branches of the U.S. government should play in content moderation decisions. In the case, Prager University claimed that YouTube violated its First Amendment rights by excluding its channel from Restricted Mode, thereby making it inaccessible to the “small subset of users, such as libraries, schools, and public institutions, who choose to have a more limited viewing experience on YouTube.” Because YouTube is a private company, and because the First Amendment only restricts government action, Prager University needs to establish that YouTube is a “state actor,” that is, in this context, a private entity functioning as the state.

We do not shy away from challenging existing law when appropriate—indeed, that’s our job as impact litigators.

But in this case, we believe that existing law serves Internet users and human rights best. Existing law—whereby platforms are not constitutionally compelled to publish any users’ speech—allows for both unmoderated and moderated platforms. Even positive and desired moderation would be difficult and burdensome for platforms should they be deemed state actors. This is explained in detail in our brief.

As we told the court, YouTube’s restriction of Prager University’s videos, like many of its content moderation decisions, was faulty in many ways, and Prager University is rightfully concerned about how the platform enforced its content rules. And YouTube is not alone: Facebook, Twitter, and others have made, and will continue to make, wrong decisions to take down content, and we will continue to call them out for it.

But the answer to bad content moderation isn’t to empower the government to enforce moderation practices. Rather, the answer, as we told the court, is for platforms to adopt moderation frameworks that are consistent with human rights, with clear takedown rules, fair and transparent removal processes, and mechanisms for users to appeal takedown decisions. Our brief thus concludes with a discussion of the Santa Clara Principles, a set of minimum standards we helped craft for content moderation practices that provide meaningful due process to affected speakers and better ensure that the enforcement of content guidelines is fair, unbiased, proportional, and respectful of users’ rights.
