Daphne Keller is the Director of Intermediary Liability at Stanford's Center for Internet and Society. Her work focuses on platform regulation and Internet users' rights. She has published both academically and in the popular press; testified and participated in legislative processes; and taught and lectured extensively. Her recent work focuses on legal protections for users' free expression rights when state and private power intersect, particularly through platforms' enforcement of Terms of Service or use of algorithmic ranking and recommendations. Until 2015 Daphne was Associate General Counsel for Google, where she had primary responsibility for the company's search products. She worked on groundbreaking intermediary liability litigation and legislation around the world, and counseled on both overall product development and individual content takedown decisions.

This discussion, excerpted from my Who Do You Sue article, very briefly reviews the implications of what I call “must-carry” arguments – claims that operators of major Internet platforms should be held to the same First Amendment standards as the government, and prevented from using their Terms of Service or Community Guidelines to prohibit lawful speech.

Lawmakers today are increasingly focused on their options for regulating the content we see on online platforms. I described several ambitious regulatory models for doing that in my recent paper, Who Do You Sue? State and Platform Hybrid Power Over Online Speech. This blog post excerpts that discussion, and sketches out potential legal regimes to address major platforms’ function as de facto gatekeepers of online speech and information.


Lately, politicians and news sources have been repeating a persistent myth about, of all things, technology law. The myth concerns a provision of the 1996 Communications Decency Act, generally known as Section 230 or CDA 230.

In recent years, lawmakers around the world have proposed a lot of new intermediary liability (IL) laws. Many have been miscalibrated – risking serious collateral damage without necessarily using the best means to advance lawmakers’ goals. That shouldn’t be a surprise. IL isn’t like tax law or farm subsidies. Lawmakers, particularly in the United States, haven’t thought much about IL in decades.

This essay closely examines the effect on free-expression rights when platforms such as Facebook or YouTube silence their users’ speech. The first part describes the often messy blend of government and private power behind many content removals, and discusses how the combination undermines users’ rights to challenge state action. The second part explores the legal minefield for users—or potentially, legislators—claiming a right to speak on major platforms.

On Tuesday, in a courtroom in Luxembourg, the Court of Justice of the European Union is to consider whether Google must enforce the “right to be forgotten” — which requires search engines to erase search results based on European law — everywhere in the world.

Policymakers increasingly ask Internet platforms like Facebook to “take responsibility” for material posted by their users. Mark Zuckerberg and other tech leaders seem willing to do so. That is in part a good development. Platforms are uniquely positioned to reduce harmful content online. But deputizing them to police users’ speech in the modern public square can also have serious unintended consequences. This piece reviews existing laws and current pressures to expand intermediaries’ liability for user-generated content.


"Daphne Keller, a former Google lawyer now at Stanford’s Center for Internet and Society, agreed that the “knowingly” language is problematic. “It creates this incentive to bury your head in the sand and not try to find bad content,” she said."

"In a recent paper, Daphne Keller, director of Intermediary Liability at the Stanford Center for Internet and Society, points out that whether and how content hosts—such as social media companies—must honor RTBF requests under the GDPR is unclear.

"Policy experts also question how the bill would actually work. Daphne Keller of the Stanford Center for Internet and Society pointed to the challenges of determining whether an ad buyer is a foreign entity, particularly if buyers rely on outside vendors to purchase ads.

“Nobody knows how to figure out who counts as Russian,” she said. “It seems extremely easy to hide your identity.”"

"Daphne Keller of the Stanford Center for Internet and Society says that the new law could push some platforms and publishers to crack down on a wide variety of speech, to avoid the threat of lawsuits. It would give them “a reason to err on the side of removing internet users’ speech in response to any controversy,” she says, “and in response to false or mistaken allegations, which are often levied against online speech.”"

"“When platforms don’t know what to do, the legally over-cautious response is to go way overboard on taking things down just in case they’re illegal,” Daphne Keller, Director of Intermediary Liability at Stanford University’s Center for Internet and Society, told BuzzFeed News. “My worst case scenario legislation would be some vague obligation for platforms to make sure that users don’t do bad things.”"


Stanford CIS brings together scholars, academics, legislators, students, programmers, security researchers, and scientists to study the interaction of new technologies and the law, and to examine how the synergy between the two can either promote or harm public goods like free speech, innovation, privacy, public commons, diversity, and scientific inquiry.


The question of what responsibility Internet platforms should bear for content posted by their users has been debated around the world, as politicians, regulators, and the broader public navigate policy choices to combat harmful speech, choices with implications for freedom of expression, online harms, competition, and innovation.

Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?

""Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,' " says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.' "

Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.