Intermediary Liability


Whether and when communications platforms like Google, Twitter, and Facebook are liable for their users' online activities is one of the key factors affecting innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users' undesirable online content, whether to suppress dissent or to address hate speech, privacy violations, and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users' online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.

In the United States, the core doctrines of Section 230 of the Communications Decency Act and Section 512 of the Digital Millennium Copyright Act have allowed user-generated content on these online intermediary platforms to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.

To contribute to this important policy debate, CIS studies international approaches to intermediary liability, immunity, and safe harbors concerning users' copyright infringement, defamation, hate speech, and other unlawful activity; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that protect innovation, freedom of expression, privacy, and other user rights.

Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability Fellow with Stanford CIS. Giancarlo is a qualified attorney with a doctoral degree (S.J.D.) in intellectual property law from Duke University Law School. Additionally, he holds an LL.M. with emphasis in intellectual property law from Duke Law School, an LL.M. in information technology and telecommunications law from Strathclyde University in Glasgow, and a law degree from Università Cattolica in Milan.

Luiz Fernando Marrey Moncau is a Non-Residential Fellow at the Stanford Center for Internet and Society. He was previously the Intermediary Liability Fellow at Stanford CIS. He was also head of the Center for Technology and Society (CTS) at the law school of the Getulio Vargas Foundation in Rio de Janeiro (FGV DIREITO RIO), where he coordinated and conducted research on freedom of expression, intellectual property, Internet regulation, consumer rights and telecommunications regulation.

Jennifer Granick fights for civil liberties in an age of massive surveillance and powerful digital technology. As the new surveillance and cybersecurity counsel with the ACLU's Speech, Privacy and Technology Project, she litigates, speaks, and writes about privacy, security, technology, and constitutional rights.

Canada's Office of the Privacy Commissioner has concluded that an existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), gives individuals legal power to make individual websites take down information. This goes well beyond the rights recognized by the European Court of Justice in its “right to be forgotten” case, and raises important questions.

Should Canada adopt its own version of the “right to be forgotten”? The Office of the Privacy Commissioner of Canada (OPC) recently concluded, in a Draft Position Paper, that such a right actually exists already. According to the OPC, Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) gives individuals legal power to make search engines like Google de-list search results about them, and to make individual websites take down information. In a Comment filed last week, I argued that this interpretation of PIPEDA will create far more problems than it solves.

The Fourth Circuit has issued its decision in BMG v. Cox. In case you haven't been following the ins and outs of the suit, BMG sued Cox in 2014, alleging that the broadband provider was secondarily liable for its subscribers' infringing file-sharing activity. In 2015, the trial court held that Cox was ineligible as a matter of law for the safe harbor in Section 512(a) of the DMCA because it had failed to reasonably implement a policy for terminating the accounts of repeat infringers, as required by Section 512(i). In 2016, a jury returned a $25M verdict for BMG, finding Cox liable for willful contributory infringement but not for vicarious infringement. Following the trial, Cox appealed both the safe harbor eligibility determination and the court's jury instructions concerning the elements of contributory infringement. In a mixed result for Cox, the Fourth Circuit last week affirmed the trial court's holding that Cox was ineligible for the safe harbor, but remanded the case for retrial because the judge's instructions to the jury understated the intent requirement for contributory infringement in a way that could have affected the jury's verdict.


If you paid attention to Mark Zuckerberg’s testimony before Congress last month, you might have gotten the impression that the internet consists entirely of titanic, California-based companies like Twitter, Facebook and Google. Congress is right to call these companies to account for outsize harms like disclosing personal data about many millions of users. But it is very wrong to act as though these companies are representative of the whole internet.

This Comment was filed in response to the Canadian Office of the Privacy Commissioner's Draft Online Reputation Position Paper. As summarized here, the Comment discusses the shortcomings of PIPEDA and data protection statutes generally as sources of law regulating online information and expression.


“If you walked up to the average person on the street in the U.S. and asked them about GDPR, they’d probably say, ‘Is that a hockey team?’” said Albert Gidari, director of privacy at the Center for Internet & Society at Stanford Law School, on Thursday. Gidari said many people don’t seem too concerned about privacy issues.

“I think people believe the benefits (of technology) outweigh the risk to their privacy,” he said.

Malkia Cyril, a Black Lives Matter activist in Oakland, Calif., who is also the executive director for the Center for Media Justice, was among a coalition of more than 70 civil rights groups that pressured Facebook in 2017 to fix its “racially biased” content moderation system. Among the changes the coalition sought was an appeals process for posts that are taken down.


When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused -- well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford’s Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly-crafted legal responses?

Ads are the lifeblood of the web -- but the legal challenges have never been greater. On May 25, Europe's privacy regime is overhauled for the first time in 20 years, and publishers, advertisers and ad tech companies alike are confused about what it all means. Struan Robertson, a product counsel working in Google's ads business for the past seven years, gives his perspective on the legal challenges facing the industry.

Content moderation is such a complex and laborious undertaking that, all things considered, it's amazing that it works at all, and as well as it does. Moderation is resource intensive and relentless; it requires making difficult and often untenable distinctions; it is wholly unclear what the standards should be, especially on a global scale; and one failure can incur enough public outrage to overshadow a million quiet successes.

Vinton G. Cerf is one of the founding fathers of the internet, and on Wednesday, February 28th, he will be on Canada 2020’s stage for an exclusive event.

Tickets are free and open to the public, but available in limited quantities. Click below to secure yours.

Best known as the co-designer of the TCP/IP protocols and the architecture of the modern Internet, Vint will join us in Ottawa to talk about online citizenship, the right to be forgotten, and the state of the modern internet.

Privacy and free speech aren't fundamentally opposed, but they do have a tendency to come into conflict — and recent developments in Europe surrounding the right to be forgotten have brought this conflict into focus. This week, we're joined by Daphne Keller of Stanford's Center for Internet and Society to discuss the collision between these two important principles.