How a team of psychologists and scientists at Riot Games is unlocking the secret to eliminating abuse within an online video game.

By Simon Parkin on September 14, 2015

Like many online spaces, League of Legends, the most widely played online video game in the world today, is a breeding ground for abusive language and behavior. Fostered by anonymity and amplified within the heated crucible of a competitive team sport, this conduct has been such a problem for its maker, Riot Games, that the company now employs a dedicated team of scientists and designers to find ways to improve interactions between the game’s players.

During the past few years the team has experimented with a raft of systems and techniques, backed by machine learning, that are designed to monitor communication between players, punish negative behavior, and reward positive behavior. The results have been startling, says Jeffrey Lin, lead designer of social systems at Riot Games. The software has monitored several million cases of suspected abusive behavior. Ninety-two percent of players who have been caught using abusive language against others have not reoffended. Lin, who is a cognitive neuroscientist, believes that the team’s techniques can be applied outside the video-game context. He thinks Riot may have created something of an antidote for online toxicity, regardless of where it occurs.

I'm sure this technology could be turned to more nefarious uses as well. But it's a small step in the right direction for some web communities that are experiencing major civility issues. In the case of "all inclusive" communities like League of Legends or Facebook, bigger is better - and anything driving away even a small number of potential users is bad for business. So the incentive is certainly there to do something to reform or expel the bad apples.

But there's an easier solution too. If more communities on the web could just be more like here... it would already be "problem solved."

I suspect you hit the nail on the head there by use of the word "communities". I think it was the Hawthorne workgroup experiments at Western Electric's Hawthorne Works in the 1920s and '30s that showed pretty conclusively that a workgroup community will tend to generate its own collective ethos - i.e., the characteristic spirit of the community as manifested in its attitudes and aspirations. One of the conclusions was that, whilst one can dictate formal rules, the group will find its own vector and rules regardless, and may even informally overturn any formal rule set that it does not agree with.

To avoid this, @mouser's relatively laissez-faire approach, coupled with a strict diversion of irrelevant (to the forum) or flameworthy discussions to the private Basement area, probably ensures that the forum stays controlled and the community cannot hijack it and alter the ethos. The forum thus retains an air of being very much "@mouser's front room", and people seem to respect that. It seems to be a fairly straightforward approach to maintaining a public focus on what matters to the forum. And because the forum does not necessarily condone or agree with what goes on in the Basement, the robots.txt file blocks crawlers from it, so the Basement content can never appear in search results or the Wayback Machine and is thus effectively expunged from public view.
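For anyone unfamiliar with the mechanism: blocking a board from crawlers takes only a couple of lines in the site's robots.txt. A minimal sketch (the path here is purely hypothetical - I don't know what the Basement's actual URL looks like):

```
# Hypothetical robots.txt entry for hiding a private board from crawlers.
# "/basement/" is an assumed path, not the forum's real one.
User-agent: *        # applies to all compliant crawlers
Disallow: /basement/ # do not crawl or index anything under this path
```

Worth noting that robots.txt is purely advisory - well-behaved crawlers like Googlebot and the Internet Archive's honor it, but it is not access control. The real gating on a private board still comes from the forum software requiring a logged-in member account.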

Any problems or issues arising in the public forum otherwise seem to be managed adequately by admins or forum members putting people straight.

The result is arguably a relatively stable, well-focused forum, and at a personal level the donation aspect probably tends to give DCF members a subconscious feeling of individual buy-in and commitment to the community. It seems to work.