We’ve Seen This Movie Before

Last Thursday, Huffman said (via Reddit post, of course) that the site would bury but not ban “content that violates a common sense of decency.” The most common response to that was: What about your commitment to free speech?

Nonprofit forums can also collapse from trolling. The distributed system of message boards called Usenet foundered in the late ’90s largely because some of the unmoderated “newsgroups” became ungovernable.

“What would happen in those groups was lots of trolling, lots of abusive material published,” said Purdue University computer-science professor Gene Spafford, a pioneering Usenet organizer. “The majority of the people who wanted useful distribution mechanisms left, because the trolls just polluted it.”

Rules on the Books Aren’t Enough


Reddit reminds me of Usenet in good and bad ways. Like newsgroups at their best, it can be a never-ending source of help, humor, and humanity. It has informed my coverage and given me useful feedback.

Reddit has its rules, but the company puts much of the responsibility for policing users on the moderators of individual subreddits. And if those moderators are themselves bigots (there’s a fascinating study of Reddit’s spectrum of attitudes), you have a nasty feedback loop.

Reddit, like some other online forums, does offer one incentive for civility that Usenet lacked: Your account can be anonymous, but anybody can see the numeric “karma” score you earn when other Redditors rate your input.

That basic level of accountability — more than a ban on anonymity — is what can steer an online community in the right direction. But such technical decisions are often delayed until it’s damage-control time.

Journalist Sarah Jeong makes this point in her just-released book The Internet of Garbage, comparing the massive investment companies have made in fighting spam with the little they’ve done to combat trolling. Often, it’s left to outsourced employees to process abuse reports.

How Do We Fix This?

Outsourcing alone won’t do the job. You need committed humans who feel ownership of the rules and who show up in the forums — which can cost money. As entrepreneur and writer Anil Dash pithily summarized it in a post four years ago, “If your website’s full of assholes, it’s your fault.”

And there’s no one-size-fits-all comment-moderation system ready for Reddit or anybody else to install. As Jeong told me, “What works for Facebook may not work for Twitter, may not work for Reddit, and so on.”

(FYI: At Yahoo Tech, writers and editors can promote or demote comments, and those we reply to also move up.)

But some of the people most experienced with Web forums, like Dash, remain convinced this is a solvable problem with settled principles.

My friend Esther Schindler, a prolific Redditor who has been active in online communities since running a CompuServe forum in 1990, compared the moderator’s job to two core tasks of running a bar: “The barkeep makes everyone feel welcome; the bouncer ensures that people play by the rules.”

That enforcement should be visible and understandable. “If the moderator steps in publicly (but never meanly), the community will (a) recognize that this is a safe place and (b) feel empowered to apply the rules when you aren’t around.”

Greg Barber, digital news projects editor at the Washington Post, made the same point, saying that clear, consistent moderation can convey a site’s culture to its members. “The best communities I’ve seen at the Post are ones that have evolved organically, with community members playing a part in setting and maintaining the tone.”

Barber also works on the Coral Project, an effort by the Post, the New York Times, and Knight-Mozilla OpenNews to develop better online-community systems. They recently launched a Tumblr blog highlighting insightful feedback from readers across the Web.

But it may take years of further work before this blog’s title stops instilling an instant sense of dread in many of you: “Do Read the Comments.”