Daphne Keller studies the ways that Internet content platforms, and the laws governing them, shape information access and other rights of ordinary Internet users. As the Director of Intermediary Liability at the Stanford Center for Internet and Society, she has written and spoken widely about the Right to Be Forgotten, copyright notice-and-takedown systems, cross-border content removal orders, platforms' own discretionary content-removal decisions, and more. She has testified on these topics before legislatures, courts, and regulatory bodies around the world. In her previous role as Associate General Counsel at Google, Daphne worked on cases including Viacom, Perfect 10, Equustek, Mosley, and Metropolitan Schools, and was the primary counsel for products ranging from Web Search to the Chrome browser. Daphne has taught Internet law at Stanford, Berkeley, and Duke law schools. She is a graduate of Yale Law School and Brown University, and mother to some awesome kids in San Francisco.

Public demands for internet platforms to intervene more aggressively in online content are steadily mounting. Calls for companies like YouTube and Facebook to fight problems ranging from "fake news" to virulent misogyny to online radicalization seem to make daily headlines. British Prime Minister Theresa May echoed the politically prevailing sentiment in Europe when she urged platforms to "go further and faster" in removing prohibited content, including through the use of automated filters.

Europe's new General Data Protection Regulation (GDPR) goes into force today, after two years of preparation. Meanwhile, in the US, a remarkable number of people are suggesting that we should adopt something like the GDPR. What would that actually mean, and what policy trade-offs does it entail?

Canada's Office of the Privacy Commissioner has concluded that an existing law, the Personal Information Protection and Electronic Documents Act (PIPEDA), gives individuals legal power to make individual websites take down information. This goes well beyond the rights recognized by the European Court of Justice in its "right to be forgotten" case, and raises important questions.

These comments address the issue of transparency under the GDPR, as that topic arises in the context of Internet intermediaries and the "Right to Be Forgotten." CIS Intermediary Liability Director Daphne Keller filed them in response to a public call for comments from the Article 29 Working Party, the EU-wide umbrella group of data protection regulators established under the 1995 Directive, soon to be succeeded by the European Data Protection Board established under the GDPR.

This Stanford Center for Internet and Society White Paper uses proposed US legislation, SESTA, as a starting point for an overview of Intermediary Liability models and their consequences. It draws on law and experience from both the US and countries that have adopted different models, and recommends specific improvements for SESTA and similar proposed legislation.

Most observers cheered when the neo-Nazi Daily Stormer was booted from YouTube, CloudFlare, and other platforms around the Internet. At the same time, the site’s disappearance stirred anxiety about Internet companies’ power over online speech. It starkly illustrated how online speech can live or die at the discretion of private companies. The modern public square is in private hands.

According to Daphne Keller, a director at the Center for Internet and Society at Stanford's school of law, outing those anonymous defendants might be the only way Miltenberg can get the case heard. It's likely that Google, which was not named in the suit, and Donegan as the document's creator will be immunized by federal statute and could get the case dismissed, Keller said.

"It will set governments' expectations about how they can use their leverage over internet platforms to effectively enforce their own laws globally," said Daphne Keller, who studies platforms' legal responsibilities at the Stanford Center for Internet and Society and previously was Google's associate general counsel.

"Users are calling on online platforms to provide a moral code," says Daphne Keller, director of the intermediary liability project at Stanford's Center for Internet and Society. "But we'll never agree on what should come down. Whatever the rules, they'll fail." Humans and technical filters alike, according to Keller, will continue to make "grievous errors."

We don't have nearly enough information to see the big picture and know what speech platforms are taking down. For the most part, we only find out when the speakers themselves learn that their posts or accounts have disappeared and choose to call public attention to it. But the idea that platforms' rules are biased, and that this undermines democracy, isn't new, and it isn't unique to conservatives.

Internet platforms like Facebook and Twitter play an ever-increasing role in our lives, and mediate our personal and public communications. What laws govern their choices about our speech? Come discuss the law of platforms and online free expression with CIS Intermediary Liability Director Daphne Keller.

When you give sites and services information about yourself, where does it go? Who else will get hold of it, and what will they use it for? The recent revelations about Cambridge Analytica's acquisition of data about tens of millions of Facebook users without their knowledge or consent have prompted renewed interest in how data about us gets shared, sold, used, and misused, well beyond what we ever expected. Join us for a SLATA/CIS lunchtime conversation with three experts from Stanford's Center for Internet and Society as we discuss the legal and policy implications of the Cambridge Analytica scandal and responses from Congress and courts. How can we prevent this from happening again? What new problems might we create through poorly crafted legal responses?


Cybersecurity is increasingly a major concern of modern life, coloring everything from the way we vote to the way we drive to the way our health care records are stored. Yet online security is beset by threats from nation-states and terrorists and organized crime, and our favorite social media sites are drowning in conspiracy theories and disinformation. How do we reset the internet and reestablish control over our own information and digital society?

"Half the time it's, 'Oh no, Facebook didn't take something down, and we think that's terrible; they should have taken it down,'" says Daphne Keller, a law professor at Stanford University. "And the other half of the time is, 'Oh no! Facebook took something down and we wish they hadn't.'"

Full episode of "Bloomberg West." Guests include Daphne Keller, director of intermediary liability at the Center for Internet and Society at Stanford Law School, David Kirkpatrick, Techonomy's chief executive officer, Radu Rusu, chief executive officer and co-founder of Fyusion, Crawford Del Prete, IDC's chief research officer, and Daniel Apai, assistant professor at The University of Arizona.