William Shatner, Reddit, And The Complications Of "Free Speech" On The Internet

Yakkin’
About The Internet is an ongoing
series by Whitney Phillips and Kate Miltner. Whitney recently
completed her PhD dissertation on trolls; Kate wrote her Master's
thesis on LOLCats—yep! Up for discussion today:
William Shatner’s visit to Reddit, community moderation, and
the complexities of “free speech” on the internet.

Whitney: Two weeks ago, William Shatner
tweeted with Chris Hadfield, an astronaut stationed at the
International Space Station. This resulted in the ENTIRE INTERNET
BEING WON by Shatner, at least according to
this Reddit thread. Apparently the 81-year-old Shatner got wind
of the thread, and promptly created an account. He then proceeded
to spend the next few days feeling out the platform and openly
criticizing its most characteristic elements: Reddit's
karma system, wherein points are given or deducted based on
community feedback, and AMA ("Ask Me Anything") threads, which
give celebrities and other notables an opportunity to interact
with fans. Regarding the karma system,
Shatner expressed outright confusion (“isn’t the system basically
broken?” he
asked), and regarding AMAs, Shatner wondered whether they were
meaningful for anyone but the people who happened to be sitting in
front of the computer as the conversation unfolded. And anyway, he
asked, “don’t I do that daily on Twitter?”

Shatner then tackled a much meatier problem: Reddit's moderation
policies. Or lack thereof, as he
lamented. Those policies are inextricably tied to the aforementioned
karma system, in which "good" comments are rewarded with karma
points and increased visibility while "bad" comments are
downvoted and essentially run out of town. Shatner was
appalled by what often passes as “good” commentary on Reddit
(“good,” here, translating to “most popular/most upvoted,” not
necessarily “positive”), namely rampant racism, sexism and
homophobia. “The fact that someone could come here, debase and
degrade people based on race, religion, ethnicity or sexual
preference because they ‘have a right’ to do so without worry of
any kind of moderation is sending the wrong message, in my humble
opinion,” he
wrote.

Kate: Seriously though, good for Shatner. Judging from the comments
that followed, he said what a lot of people have been saying or
wanted to say, which is that this sort of noxious speech and content is
not acceptable, and that it shouldn’t be tolerated by the
moderators.

Whitney: My interest in Shatner's comments is twofold.
The first is good old-fashioned Schadenfreude, because how do you like that
nerd apple, Reddit (a breakdown of my feelings about the site
can be found here).
The second is much more reflective, since Shatner's
argument—which takes for granted that Reddit as a whole (meaning
all its mods and admins) is responsible for, as they say, the
shit Reddit
says—dredges up a number of larger questions, particularly the
ideal relationship between platform user(s) and platform
moderator(s). What is the ideal relationship? Do users have a
“right” to “free speech” on privately owned platforms, as many
Redditors insist between rape jokes? Do platforms have a
responsibility to shut that sort of content down before it ingrains
itself in the site culture?

Kate: Ah, responsibility. What a loaded word! Okay, so
first of all, if we’re going to talk about the site culture, we
need to look at its roots. Reddit is a platform that is largely
based in the libertarian ethos of the early web. Free Speech At All
Costs is a fundamental precept of that ethos, and for better or
worse, I think that’s what’s fueling a lot of these claims of FREE
SPEECH!!11. Just to provide a little bit of context: in 1996, John
Perry Barlow (one of the founders of the Electronic Frontier Foundation) wrote
A
Declaration of the Independence of Cyberspace. In it, he
declared:

We are creating a world where anyone, anywhere may express his
or her beliefs, no matter how singular, without fear of being
coerced into silence or conformity… In our world, all the
sentiments and expressions of humanity, from the debasing to the
angelic, are parts of a seamless whole, the global conversation of
bits. We cannot separate the air that chokes from the air upon
which wings beat.

So, if you come at it from this perspective, you cannot separate
“good” speech from “bad” speech, and as such, both are protected
(or should be).

The thing is, as much as some Redditors may want to claim
otherwise, Reddit is not The Internet. Reddit is a privately owned
platform that can decide what sort of user-generated content will
or will not be tolerated. Legally, Condé Nast (who owns Reddit) can
do whatever they want to control what is posted on the site (which
seems like not much, because pageviews, probably). The question
that Shatner’s comments raise is whether or not they
should.

Whitney: And yet the “free speech” issue lingers (scare
quotes used to differentiate the legal sense of the term from the
cartoon internet sense of the term), which is best summarized by
the concurrent assertions that “you are not the boss of me” and
“don’t tell me what to do.” This card is most frequently played by
those who think they should be able to do or say whatever they want
whenever they want, often at the expense of women, gays and
lesbians, and people of color, because… well because free speech
(I’m looking at you, Men’s Rights-types). In response to Shatner’s
posts, many Redditors either directly reiterated this position, or
argued a
slightly more nuanced version of the same thing, namely that
imposing some external morality police (for example by assigning
paid moderators to each specific subreddit) would undermine the
very spirit of the site, which is based on, you guessed it, “free
speech.”

To be fair, as several Reddit users pointed out, Reddit is
already subject to some on-site moderation. Reddit is a
self-moderating community; community members police their own
borders through upvoting (which, again, is designed to reward the
good and punish the bad, at least what passes as “good” and “bad”
on that particular subreddit). Furthermore, volunteer moderators,
who are also members of the community, can intervene when other
users cross whatever behavioral or ethical line is deemed
acceptable/desirable by the subreddit (which is often determined
through karmic upvoting/downvoting—what the community accepts is
the content the community consistently upvotes). In other words,
Reddit is designed to self-regulate; so long as individual
subreddits are allowed to decide what’s appropriate for themselves,
we should be good.

Kate: Right, but what subreddits often decide is
“appropriate” is the type of content that a lot of other people
find offensive, and Reddit’s boundaries are porous—the site is set
up for community-sourced discovery. We talked about this issue of
competing norms and mores last time around with our public
shaming discussion; what some people find acceptable (or
entertaining) can be completely offensive to others. Taste and
opinion aside, you’re also dealing with a medium where it’s
relatively easy to misinterpret context, or intent, or any number
of things that give content meaning within a particular
community.

As redditor Morbert
said in the comments, “Reddit isn’t a single community. It is a
variety of communities, for better or for worse.” That makes things
incredibly tricky. The truth is, there are some racist, homophobic,
misogynist jerks out there who think that there is nothing funnier
than a rape joke (ugh). These people are going to congregate
somewhere—and I hate to say it, but that is their right—and I mean
that in a constitutional sense: it is totally the legal right of
gross bigots to hang out on a message board and make disgusting
jokes about anyone who is not white and male, because that is what
our laws allow.

The other thing I wanted to bring up is why we are even talking
about this in the first place. I mean, who cares whether or not
Reddit is tolerant of this sort of stuff except for people who hang
out on Reddit? Why is this even a story to begin with? Well, it’s a
story because Reddit is an influential platform—influential enough
that President Obama’s campaign staff thought it would behoove him
to do an AMA. So this matters because one of the
most influential and highly-trafficked sites on the internet is
also a site that hosts a lot of content that demeans and insults
the
majority of the US population.

Whitney: That Reddit has been plagued with, let's call
them, "behavioral issues" isn't surprising; that tendency is built
into the design of the platform, and into the overall ethos of the
site. This is not to say that all Reddit users are Violentacrez clones, or that all
subreddits are gross—many users are extremely thoughtful (you can
see that in one of Shatner’s follow-up
threads; scroll down to see a handful of Redditors grappling
with many of these same issues). Take these comments, for
example:

As a girl on reddit I get really upset and disheartened about
the amount of sexist bull I see on here. It’s not just sexist crap,
it’s down right hypocritical. One day you’ll see an article on the
front page about men protesting rape, and the comments will be all
about how they would never commit a rape and are super anti rape.
Until someone goes in there and posts about their being raped. They
get called liars, told they put themselves in that situation and so
on. I had one guy tell me I wasn’t raped because I gave up
protesting, fighting back and saying no. He said persistence
doesn’t equal rape.

And from
BottleRocket2012 (in response to the claim that Reddit isn’t a
single community, and that for better or worse it comprises
many smaller communities):

I think this is the frequent reddit response. But the reality is
all the racist “humor” that makes the front page is upvoted by the
community at large and these millions of people aren’t rotated
every day. The other reality is everyone reading William Shatner’s
posts thinks he is talking about other people. I mean “OP is a
faggot” is a hilarious meme and he isn’t getting the inside joke.
And besides a gay person said “I’m gay and I find this hilarious”
so now in the mind of a redditor this isn’t hating. No racist ever
thinks they are, everyone creates a wall of bullshit to believe
what he is doing is ok.

And smaller subreddits—a subreddit devoted to the staggering
variety of topics covered by other subreddits can be found here—are much less likely
to be overrun by violent sexism (except for subreddits devoted to
violent sexism, oh, for example, /r/beatingwomen, which
has nearly 34K subscribers).

That said, the site as a whole is undergirded by a basic kind of
libertarian permissiveness. In a perfect world, one that has
achieved gender, racial and sexual equality, and in which all
voices are equally represented, this sort of permissiveness might
be enough to ensure a stable, healthy, self-regulating platform.
But this is not a perfect world; you can’t hand a bunch of racists,
misogynists and homophobes the keys to the castle and then
reasonably expect them to deny entry to other racists, misogynists
and homophobes (for a visual representation of this idea, consider
the following
infographic from Modern Primate). So what to do? The obvious
answer is to take away the keys and hand them to someone who
doesn’t stand to benefit from all that permissiveness. Someone who
couldn’t give two shits about the karma they stand to win or lose.
In other words, you start moderating. And not just moderating, but
culling the very worst offenders. Extreme, maybe, but so is
/r/beatingwomen. (And yes, I am fully aware of the practical
complications of iron-clad moderation policies, namely that lots of
moderation requires lots of paid employee labor, and furthermore
that said labor is often actively thwarted by those with mayhem in
their hearts. I am also aware that ban-happiness flirts with a
whole new set of problems, usually having to do with the mod's
personal bias.) Still, I present to the jury the basic failings of
Reddit's current moderation model, and suggest that what they're
doing now doesn't work, and merely opens the door for all
kinds of abuse.

Kate: Yes, ugh. God. And /r/rapingwomen and
/r/killingwomen. Technically, all of those fit the definition of
hate
speech, which means that they go from offensive to (borderline?
technically?) illegal, which is another issue, really. The majority
of racist and sexist content/commentary on Reddit—the commentary
that Shatner was referencing— isn’t that extreme (thank god). And
this is where I get squirmy about culling—because it’s all so
relative and context-dependent (“I said ‘OP is a faggot’
sarcastically to point out its innate homophobia and call everyone
else out on their bigotry, look at my previous trail of comments”).
I’ve
already said this elsewhere but: who moderates whom is a major
issue. Who gets to decide what is acceptable and what is offensive?
That is such a slippery slope—you cull (or dare I say censor) one
thing, and then where does it stop? The road to hell is paved with
good intentions, etc., etc.

The other thing is that having this stuff out in the open might
not entirely be a bad thing, as upsetting as it might be (just
stick with me for a second). There are a growing number of people
out there who think that we’re in a
post-racial, post-gender, post-whatever world, and that racism
and sexism aren't as problematic as they used to
be (AHAHAHA, HA HA HA HA). The more that blatantly
prejudicial/bigoted/hateful expression is pushed to the margins,
the easier it will be for certain people to be like, “What do you
mean, racism and sexism are problems? Oh, THOSE crackpots on weird
site no one has heard of? Whatever, they’re just a minority. CHECK
MAH SOCIAL PROGRESS.” I’d like to point out that you and I wouldn’t
be talking about this right now if these comments were being
published on I'mARacist.com; we are only talking about it because
it's on Reddit.

As you’ve noted previously, shaming (or in this context,
moderating) ignorant people isn’t going to change their fundamental
beliefs. They’ll just end up taking their isht elsewhere—and that
may clean up the tone and content on Reddit, and create a filter
bubble for offensive content on major platforms, but it won't
eliminate the
underlying problem. It is absolutely essential that we (as a
society, as individuals, as academics, as people who publish their
opinions on websites) keep talking about this, frequently and
publicly. Otherwise, these beliefs (which are not going away
anytime soon) will become (further) silently institutionalized,
which is arguably more difficult to combat.

Whitney: Yes, if you give a mouse a cookie, he’ll want
you to ban the word Christmas from all public-school functions
(it’s actually not a bad idea). The problem I’ve always had with
that argument—if we start censoring some of the things, what will
stop us from censoring ALL of the things??—is that it essentially
plays on a person’s fear of being silenced, not their sense
of basic human decency. In short: this person is being censored for
their beliefs. You don’t want to be censored for YOUR beliefs, do
you?? Then you better defend with your life other Redditors’ right
(which isn’t actually their right, as they’re posting to a
privately owned website) to post incendiary, unnecessary,
completely unproductive bile all day, because “free speech.”

In other words, the argument that selective censorship can only
lead us down a path to fascism often does little more than lull
everyone else into complicity, and therefore functions as
preemptive self-censorship. You are encouraged to hold your tongue
when you see something upsetting, because maybe next time you’ll be
the one whose speech is under the microscope. This is a problem,
because some people need to be told to SHUT UP, particularly when
their speech interferes with their audience’s basic human
right—what should be a basic human right—not to be constantly
inundated with violently racist, sexist, homophobic, pedophilic or
otherwise ignorant bullshit every time they go online. On Reddit,
there are ways of shutting the most egregious content down; but in
order for that to happen, some people (ahem, white dudes) have to
be willing to acknowledge that the “free speech” to which they so
desperately cling actually costs quite a bit, a point with which
Reddit’s managers and investors would also have to make peace.
Because banning bigots would mean less traffic, and less traffic
would mean less money. And wouldn’t that be a shame. Which is not—I
repeat, is not—an argument against offensiveness generally.
Nor is it an argument against all forms of dissent or discomfort,
both of which can be quite generative. This is an argument against
what is already dead cultural weight. Nobody benefits from keeping
it around, except maybe the websites themselves. But even then,
it’s not so much “benefit” as “profit.”

Kate: I’m not arguing against solid moderation policies
on Reddit or anywhere else. Those are private sites that can
dictate the tone of discourse however they see fit. However, if
we’re talking about people shutting their mouths in the larger
sense, I don’t know if I agree. Speech—and who is listened to—is
often about power and access. If restrictions on speech are put in
place—even with the aim of helping marginalized groups—I worry that
they will end up backfiring. Those who are used to having power are
awfully good at figuring out ways to
circumvent things to ensure it’s business as usual.

A few months ago, social media scholar danah boyd wrote an excellent blog
post about the nature of freedom of expression in a
cross-national, online context. She was discussing the uproar over
The
Innocence of Muslims and the
racist MTA campaign by the American Freedom Defense Initiative.
Different case studies, same issues. She wrapped up the whole
problem pretty neatly, so I’m just going to end it with a quote
from her:

I think that we need to start having a serious conversation
about what freedom of speech means in a networked world where
jurisdictions blur, norms collide, and contexts collapse. This
isn’t going to be worked out by enacting global laws nor is it
going to be easily solved through technology. This is, above all
else, a social issue that has scaled to new levels, creating
serious socio-cultural governance questions. How do we understand
the boundaries and freedoms of expression in a networked world?

In the Summer of 2012, Whitney Phillips received her PhD in
English (Folklore/Digital Culture emphasis) from the University of
Oregon. Her dissertation, titled “This is Why We Can’t Have Nice
Things: The Origins, Evolution and Cultural Embeddedness of
Online Trolling," pulls from cultural, media and internet studies,
and approaches the subject of trolling ethnographically. She writes
about internet/culture here and here and
is currently a lecturer at NYU.

Kate Miltner is the Research Assistant for the Social Media Collective at
Microsoft Research New England. She received her MSc from the
London School of Economics after writing her dissertation on
LOLCats, something for which she has been mocked mercilessly in the
comments sections of Gawker, The Huffington Post, Mashable, Time
Magazine, and the Independent. She has also written about internet
culture for The Guardian and The Atlantic. You can find out more
about her at katemiltner.com.