Blog

Attached to this post are PowerPoint slides introducing intermediary liability basics. This particular deck comes from a great CIDE program in Mexico City. It is descended from others I’ve used over the years teaching at Stanford and Berkeley, presenting at conferences, and training junior lawyers at Google. Ancestral decks that evolved into this one go back to at least 2012. (Which might explain why I struggle with fonts whenever I update them.)

The Fourth Circuit has issued its decision in BMG v. Cox. In case you haven’t been following the ins and outs of the suit, BMG sued Cox in 2014 alleging that the broadband provider was secondarily liable for its subscribers’ infringing file-sharing activity. In 2015, the trial court held that Cox was ineligible as a matter of law for the safe harbor in section 512(a) of the DMCA because it had failed to reasonably implement a policy for terminating the accounts of repeat infringers, as required by section 512(i). In 2016, a jury returned a $25M verdict for BMG, finding Cox liable for willful contributory infringement but not for vicarious infringement. Following the trial, Cox appealed both the safe harbor eligibility determination and the court’s jury instructions concerning the elements of contributory infringement. In a mixed result for Cox, the Fourth Circuit last week affirmed the court’s holding that Cox was ineligible for safe harbor, but remanded the case for retrial because the judge’s instructions to the jury understated the intent requirement for contributory infringement in a way that could have affected the jury’s verdict.

This piece is excerpted from the Law, Borders, and Speech Conference Proceedings Volume, where it appears as an appendix. The terminology it explains is relevant to intermediary liability and content regulation issues generally, not only issues that arise in the jurisdiction or conflict-of-law context. The full conference Proceedings Volume contains other relevant resources and is Creative Commons licensed.

This panel considered issues of national jurisdiction in relation to Internet platforms’ voluntary content removal policies. These policies, typically set forth in Community Guidelines (CGs) or similar documents, prohibit content based on the platforms’ own rules or values—regardless of whether the content violates any law.

Popularity doesn't equal truth. And yet Facebook's recent proposal to rank the trustworthiness of news sources based on popularity loosely equates the two. In so doing, Facebook may be putting form over function.

The topic of how well the tool of black letter law works in the Internet law setting is of course huge, and associated with obvious definitional challenges. To point to but one: how ought we define “black letter law” in our present legal culture, where legal rules necessarily must take account of the technical reality in which they operate? Indeed, given Wikipedia’s definition of “black letter laws” as “the well-established technical legal rules that are no longer subject to reasonable dispute,” one may legitimately question whether we can speak of any real black letter law within our field of enquiry. Fortunately, however, the panel was asked to approach only the more concrete topic identified in the description above.

Without a doubt, human rights law provides an important framework for the discussion of cross-border speech regulation. The International Covenant on Civil and Political Rights (ICCPR) in Article 19 clearly states the right to express opinions and ideas “regardless of frontiers” and the Internet is a particularly relevant tool and platform for the exercise of this right, both in its individual and social dimensions. There was a common underlying basic agreement among the different panelists as to the need to include a human rights perspective in content removal discussions, whether judicial, regulatory or legislative.

This panel addressed the right to be forgotten (RTBF) from a global perspective, presenting points of view from relevant stakeholders and academic researchers from different regions. As established in the Court of Justice of the European Union’s 2014 Google Spain case, this is a right under data protection law for individuals to request that search engines de-list specified results appearing in response to a search for the individual’s name.[1] While search engines may decline to de-list results based on public interest considerations, the RTBF is still far broader than de-listing or removal rights in many countries, including the United States. This is especially the case since de-listing can also be requested for information that was lawfully published online.

On January 17, the Minnesota Supreme Court issued its opinion in State v. Diamond. It affirmed the appellate court’s holding that compelling a defendant to provide a fingerprint to unlock a seized cellphone (for which police had a warrant) did not violate the Fifth Amendment privilege against self-incrimination.