Intermediary Liability

Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affect innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.

In the United States, the core doctrines of Section 230 of the Communications Decency Act and Section 512 of the Digital Millennium Copyright Act have allowed these online intermediary platforms and their users’ content to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.

To contribute to this important policy debate, CIS studies international approaches to intermediary obligations concerning users’ copyright infringement, defamation, hate speech, and other grounds for liability, immunity, or safe harbor; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that protect innovation, freedom of expression, privacy, and other user rights.

Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability Fellow with Stanford CIS. Giancarlo is a qualified attorney with a doctoral degree (S.J.D.) in intellectual property law from Duke University Law School. Additionally, he holds an LL.M. with emphasis in intellectual property law from Duke Law School, an LL.M. in information technology and telecommunications law from Strathclyde University in Glasgow, and a law degree from Università Cattolica in Milan.

Luiz Fernando Marrey Moncau is a Non-Residential Fellow at the Stanford Center for Internet and Society. He was previously the Intermediary Liability Fellow at Stanford CIS. He was also head of the Center for Technology and Society (CTS) at the law school of the Getulio Vargas Foundation in Rio de Janeiro (FGV DIREITO RIO), where he coordinated and conducted research on freedom of expression, intellectual property, Internet regulation, consumer rights and telecommunications regulation.

Jennifer Granick fights for civil liberties in an age of massive surveillance and powerful digital technology. As the new surveillance and cybersecurity counsel with the ACLU's Speech, Privacy and Technology Project, she litigates, speaks, and writes about privacy, security, technology, and constitutional rights.

The Fourth Circuit has issued its decision in BMG v. Cox. In case you haven’t been following the ins and outs of the suit, BMG sued Cox in 2014 alleging that the broadband provider was secondarily liable for its subscribers’ infringing file-sharing activity. In 2015, the trial court held that Cox was ineligible as a matter of law for the safe harbor in section 512(a) of the DMCA because it had failed to reasonably implement a policy for terminating the accounts of repeat infringers, as required by section 512(i). In 2016, a jury returned a $25M verdict for BMG, finding Cox liable for willful contributory infringement but not for vicarious infringement. Following the trial, Cox appealed both the safe harbor eligibility determination and the court’s jury instructions concerning the elements of contributory infringement. In a mixed result for Cox, the Fourth Circuit last week affirmed the court’s holding that Cox was ineligible for safe harbor, but remanded the case for retrial because the judge’s instructions to the jury understated the intent requirement for contributory infringement in a way that could have affected the jury’s verdict.

This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant for intermediary liability and content regulation issues generally, not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources and is Creative Commons licensed.

This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.

Popularity doesn't equal truth. And yet Facebook's recent proposal to rank the trustworthiness of news sources based on popularity is loosely equating truth with popularity. In so doing, Facebook may be putting form over function.

Last Friday, the Justice Department charged 13 Russians with attempting to subvert the 2016 U.S. presidential elections. The case presented by Special Counsel Robert Mueller laid out an elaborate scheme of information operations, carried out primarily via the social media websites Facebook, Instagram, and Twitter. Through the Internet Research Agency, a so-called “troll factory” in St. Petersburg.

These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the “Right to Be Forgotten.” CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party – the EU-wide umbrella group of data protection regulators established under the 1995 Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.

“Trust and safety and content moderation is a field that, if you don’t have gender and ethnic and other [types of diversity], you can’t do it,” Feerst said. “Because you don’t have enough perspective to generate the cultural competencies, and, most importantly, you don’t know what you don’t know.”

“Russian activity is the canary in the coal mine for a much bigger problem,” said Ben Scott, an adviser to the State Department under former Secretary of State Hillary Clinton for internet freedom, cybersecurity and the role of social media in public diplomacy. “The tools the Russians are using are built for advertisers, which means they are available to anyone.”

“We were always careful to condition our optimism and our advocacy that this technology was potentially a double-edged sword,” said Mr. Scott, who is a senior adviser to the Open Technology Institute at New America. “But I guess we didn’t expect them to hit home quite as hard as it did.”

Policy experts also question how the bill would actually work. Daphne Keller of the Stanford Center for Internet and Society pointed to the challenges of determining whether an ad buyer is a foreign entity, particularly if buyers rely on outside vendors to purchase ads.

“Nobody knows how to figure out who counts as Russian,” she said. “It seems extremely easy to hide your identity.”

In the summer of 1956, several key figures in what would become known as the field of "artificial intelligence" met at Dartmouth College to brainstorm about the future of the synthetic mind. Artificial intelligence, broadly defined, has since become a part of everyday life.

President Trump has blocked Twitter followers on his personal feed, raising questions, and a lawsuit, about First Amendment rights on social media. An expert on free speech in the online world says the case has wide implications for public figures on all forms of social media.

"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "

Rebecca Tushnet, professor at Georgetown University Law School, and Andrea Matwyshyn, Professor of Law at Northeastern University, discuss one lawsuit against Google, Facebook and Twitter, which was brought by the families of the victims of the Pulse nightclub shooting in Orlando, and another suit against Google for unlawfully censoring its workers. They speak with June Grasso on Bloomberg Radio’s "Bloomberg Law."