Custodians

I’m thrilled to say that my new book, Custodians of the Internet, is now available for purchase from Yale University Press, and your favorite book retailer. Those of you who know me know that I’ve been working on this book for a long time, and have cared about the issues it addresses for a while now. So I’m particularly excited that it is now no longer mine, but yours if you want it. I hope it’ll be of some value to those of you who are interested in interrogating and transforming the information landscape in which we find ourselves.

By way of introduction, I thought I would explain the book’s title, particularly my choice of the word “custodians.” This title came unnervingly late in the writing process, and after many, many conversations with my extremely patient friend and colleague Dylan Mulvin. “Custodians of the Internet” captured, better than many, many alternatives, the aspirations of social media platforms, the position they find themselves in, and my notion for how they should move forward.

moderators are the web’s “custodians,” quietly cleaning up the mess: The book begins with a quote from one of my earliest interviews, with a member of YouTube’s content policy team. As they put it, “In the ideal world, I think that our job in terms of a moderating function would be really to be able to just turn the lights on and off and sweep the floors . . . but there are always the edge cases, that are gray.” The image invoked is a custodian in the janitorial sense, doing the simple, mundane, and uncontroversial task of sweeping the floors. In this turn of phrase, content moderation is offered up as simple maintenance: it is not imagined to be difficult to know what needs scrubbing, and the process is routine. There is labor involved, but it is largely invisible, just as actual janitorial staff are often instructed to “disappear,” working at night or with as little intrusion as possible. Yet even then, years before Gamergate or ISIS beheadings or white nationalists or fake news, it was clear that moderation is not so simple.

platforms have taken “custody” of the Internet: Content moderation at the major platforms matters because those platforms have achieved such prominence in the intervening years. As I was writing the book, one news item in 2015 stuck with me: in a survey on people’s new media use, more people said that they used Facebook than said they used the Internet. Facebook, which by then had become one of the most popular online destinations in the world and had expanded to the mobile environment, did not “seem” like the Internet anymore. Rather than being part of the Internet, it had somehow surpassed it. This was not true, of course; Facebook and the other major platforms had in fact woven themselves deeper into the Internet, by distributing cookies, offering secure login mechanisms for other sites and platforms, expanding advertising networks, collecting reams of user data from third-party sites, and even exploring Internet architecture projects. In both the perception of users and in material ways, Facebook and the major social media platforms have taken “custody” of the Internet. This should change our calculus as to whether platform moderation is or is not “censorship,” and the responsibilities platforms bear when they decide what to remove and who to exclude.

platforms should be better “custodians,” committed guardians of our struggles over value: In the book, I propose that these responsibilities have expanded. Users have become more acutely aware of both the harms they encounter on these platforms and the costs of being wronged by content moderation decisions. What’s more, social media platforms have become the place where a variety of speech coalitions do battle: activists, trolls, white nationalists, advertisers, abusers, even the President. And the implications of content moderation have expanded, from individual concerns to public ones. If a platform fails to moderate, everyone can be affected, even those who aren’t party to the circulation of the offensive, the fraudulent, or the hateful, including those who aren’t on social media at all.

What would it mean for platforms to play host not just to our content, but to our best intentions? The major platforms I discuss here have, for years, tried to position themselves as open and impartial conduits of information, defenders of their users’ right to speak, and legally shielded from any obligations for how they police their sites. As most platform managers see it, moderation should be theirs to do, conducted on their own terms, on our behalf, and behind the scenes. But that arrangement is crumbling, as critics begin to examine the responsibilities social media platforms have to the public they serve.

In the book, I propose that platforms become “custodians” of the public discourse they facilitate — not in the janitorial sense, but something more akin to legal guardianship. The custodian, given charge over a property, a company, a person, or a valuable resource, does not take it for their own or impose their will over it; they accept responsibility for ensuring that it is governed properly. This is akin to Jack Balkin’s suggestion that platforms act as “information fiduciaries,” with a greater obligation to protect our data. But I don’t just mean that platforms should be custodians of our content; platforms should be custodians of the deliberative process we all must engage in, that makes us a functioning public. Users need to be more accountable for making the hard decisions about what does and does not belong; platforms could facilitate that deliberation, and then faithfully enact the conclusions users reach. Safeguarding public discourse requires ensuring that it is governed by those to whom it belongs, making sure it survives, that its value is sustained in a fair and equitable way. Platforms could be not the police of our reckless chatter, but the trusted agents of our own interest in forming more democratic publics.

If you end up reading the book, you have my gratitude. And I’m eager to hear from anyone who has thoughts, comments, praise, criticism, and suggestions. You can find me on Twitter at @TarletonG.


One thought on “Custodians”

This generates some interesting theoretical challenges in social science terms: by foregrounding individual custodians (a much-loved Sci Fi term), you perhaps leave structural questions to one side, and suggest individuals may be able to do more than they can. In our work on Cyber Racism and Community Resilience (Palgrave 2017) we argue that it is the deep political economy of social media that opens them up to being a superb arena for online racism, almost irrespective of who is trying to moderate the flow of harm. I am looking forward to reading the text.

About Us

The Social Media Collective (SMC) is a network of social science and humanistic researchers, part of the Microsoft Research labs in New England and New York. It includes full-time researchers, postdocs, interns, and visitors. Begun in 2009, the initiative is now led by Nancy Baym, danah boyd, Kate Crawford, Tarleton Gillespie, and Mary Gray. Our primary purpose is to provide rich contextual understanding of the social and cultural dynamics that underpin social media technologies. We use a variety of methodologies and span multiple disciplines.
