Intermediary Liability

Whether and when communications platforms like Google, Twitter and Facebook are liable for their users’ online activities is one of the key factors that affect innovation and free speech. Most creative expression today takes place over communications networks owned by private companies. Governments around the world increasingly press intermediaries to block their users’ undesirable online content in order to suppress dissent, hate speech, privacy violations and the like. One form of pressure is to make communications intermediaries legally responsible for what their users do and say. Liability regimes that put platform companies at legal risk for users’ online activity are a form of censorship-by-proxy, and thereby imperil both free expression and innovation, even as governments seek to resolve very real policy problems.

In the United States, the core doctrines of Section 230 of the Communications Decency Act and Section 512 of the Digital Millennium Copyright Act have allowed user-generated content on these online intermediary platforms to flourish. But immunities and safe harbors for intermediaries are under threat in the U.S. and globally as governments seek to deputize intermediaries to assist in law enforcement.

To contribute to this important policy debate, CIS studies international approaches to intermediary obligations concerning users’ copyright infringement, defamation, hate speech, and other grounds for vicarious liability, immunity, or safe harbor; publishes a repository of information on international liability regimes; and works with global platforms and free expression groups to advocate for policies that protect innovation, freedom of expression, privacy, and other user rights.

Joan Barata is an international expert in freedom of expression, freedom of information, and media regulation. As a scholar, he has spoken and conducted extensive research in these areas, working and collaborating with various universities and academic centers from Asia to Africa and the Americas, authoring papers, articles, and books, and addressing specialized parliamentary committees.

Annemarie Bridy is a Professor of Law at the University of Idaho. She is also an Affiliated Fellow at the Yale Law School Information Society Project and a former Visiting Associate Research Scholar at the Princeton University Center for Information Technology Policy. Professor Bridy specializes in intellectual property and information law, with specific attention to the impact of new technologies on existing legal frameworks for the protection of intellectual property and the enforcement of intellectual property rights.

Giancarlo F. Frosio is a Non-Residential Fellow at the Center for Internet and Society at Stanford Law School. Previously he was the Intermediary Liability Fellow with Stanford CIS. He is also a Senior Lecturer and Researcher at the Center for International Intellectual Property Studies (CEIPI) at Strasbourg University. Giancarlo also serves as Affiliate Faculty at Harvard CopyrightX and Faculty Associate of the Nexa Research Center for Internet and Society in Turin.

This discussion, excerpted from my Who Do You Sue article, very briefly reviews the implications of what I call “must-carry” arguments – claims that operators of major Internet platforms should be held to the same First Amendment standards as the government, and prevented from using their Terms of Service or Community Guidelines to prohibit lawful speech.

Lawmakers today are increasingly focused on their options for regulating the content we see on online platforms. I described several ambitious regulatory models for doing that in my recent paper, Who Do You Sue? State and Platform Hybrid Power Over Online Speech. This blog post excerpts that discussion, and sketches out potential legal regimes to address major platforms’ function as de facto gatekeepers of online speech and information.

The new EU Audiovisual Media Services Directive (AVMSD) has been officially adopted and published. It is now time for Member States to start the process of incorporating its provisions into their respective legal and institutional frameworks.

Lately, politicians and news sources have been repeating a persistent myth about, of all things, technology law. The myth concerns a provision of the 1996 Communications Decency Act, generally known as Section 230 or CDA 230.

In recent years, lawmakers around the world have proposed a lot of new intermediary liability (IL) laws. Many have been miscalibrated – risking serious collateral damage without necessarily using the best means to advance lawmakers’ goals. That shouldn’t be a surprise. IL isn’t like tax law or farm subsidies. Lawmakers, particularly in the United States, haven’t thought much about IL in decades.

Tighter regulation of social media and other online services is now under discussion in several European countries, as well as in the UK, where the government has released a white paper outlining its proposed approach to tackling online harm.

The Internet was going to set us all free. At least, that is what U.S. policy makers, pundits, and scholars believed in the 2000s. The Internet would undermine authoritarian rulers by reducing the government’s stranglehold on debate, helping oppressed people realize how much they all hated their government, and simply making it easier and cheaper to organize protests.

"Some cyberlaw experts fear a ruling against Grindr will put the creativity of the internet as we know it at risk. They say that requiring platforms to more closely monitor users would give an advantage to tech giants like Facebook, Twitter, and Google while hindering smaller startups with niche audiences, including Grindr. It would be more expensive to start new businesses online because of the cost of hiring watchdogs, said Jennifer Granick, surveillance and cybersecurity counsel at the American Civil Liberties Union."

“For a reform of this scope and magnitude, it’s only expected that several months will pass before enforcement comes into focus,” said Omer Tene, VP and chief knowledge officer at the International Association of Privacy Professionals. “2018 wasn’t even a full year for GDPR.”

“Ultimately, regulators and courts will have to decide what is the right balance between individuals’ privacy concerns and businesses’ interest to pursue data-driven innovation,” said Omer Tene, VP and chief knowledge officer at the International Association of Privacy Professionals.

Yana Welinder from Wikimedia Foundation will join our conversation and present emerging intermediary liability issues for online platforms. In particular, we will discuss thorny issues arising in the context of Wikimedia's activities, where the global, peer-produced nature of Wikipedia can test the limits of the tension between freedom of expression and third-party claims.

Professor Eric Goldman from Santa Clara Law School will join our conversation and present developments in US internet intermediary law. In particular, we will discuss Section 230 of the Communications Decency Act, intermediary liability, and freedom of expression. We will consider recent questions surfacing before U.S. courts, such as whether Twitter should be liable for terrorist propaganda.

What, if anything, should we do about extremist content on the Internet? What is the role of Internet companies in promoting free expression and privacy around the world? How should we manage data requests from law enforcement and intelligence agencies around the world, when countries have different privacy protections and different laws?

The question of what responsibility Internet platforms should bear for content posted by their users has been debated around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech, choices that carry implications for freedom of expression, online harms, competition, and innovation.

The latest in the EU's string of internet regulatory efforts has a new target: terrorist propaganda. Just as with past regulations, the proposed rules seem onerous and insane, creating huge liability for internet platforms that fail to do the impossible.

Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?