When we hear the word "software," most of us think of things like Word, PowerPoint, or Photoshop -- tools for individual users. These tools treat the computer as a box, a self-contained environment in which the user does things. Much of the current literature and practice of software design -- feature requirements, UI design, usability testing -- targets the individual user, functioning in isolation.

And yet, when we poll users about what they actually do with their computers, some form of social interaction always tops the list -- conversation, collaboration, playing games, and so on. The practice of software design is shot through with computer-as-box assumptions, while our actual behavior is closer to computer-as-door, treating the device as an entrance to a social space.

We have grown quite adept at designing interfaces and interactions between people and machines, but our social tools -- the software users actually use most often -- remain badly misfit to their task. Social interactions are far more complex and unpredictable than human/computer interaction, and that unpredictability defeats classic user-centric design. As a result, tools used daily by tens of millions are either ignored as design challenges, or treated as if the only possible site of improvement is the user-to-tool interface.

The design gap between computer-as-box and computer-as-door persists because of a diminished conception of the user. The users of a piece of social software are not just a collection of individuals, but a group. Individual users take on roles that only make sense in groups: leader, follower, peacemaker, process nazi, and so on. There are also behaviors that can only occur in groups, from consensus building to social climbing. And yet, despite these obvious differences between personal and social behaviors, we have very little design practice that treats the group as an entity to be designed for.

There is enormous value to be gotten in closing that gap, and it doesn't require complicated new tools. It just requires new ways of looking at old problems. Indeed, much of the most important work in social software has been technically simple but socially complex....

Many popular websites such as Wikipedia offer little differentiation between the experience and tools available to registered users and to anonymous visitors. Many active users of such websites may, therefore, be virtually unknown to their hosts. As most party hosts know, it’s usually the people who just “show up” -- the ones you don’t know -- who create the biggest headaches.

Six Steps to Better Online Community Through Membership

1. Know Thy Users
2. Simple Registration Is Not a Burden
3. Segment Your Registration System
4. To Verify a User’s Email Address or Not?
5. Provide a Rating or Reputation System
6. Keep the Communication Flowing

Membership systems are no panacea, and they won’t stop a person who is committed to disrupting your site. But they do offer an important stepping stone to connect a website’s community to a real person, and that person to their actions. Such a registration process also works to limit the disinhibitory effects of online behavior—or at least some of the more negative ones—and creates a subtle but important psychological difference between an anonymous visitor and a known community member. A person who is invested in a community through a membership system is less likely to abuse it.
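In software terms, the distinction the passage draws between an anonymous visitor and a known member is a small amount of state attached to each actor: an account that actions can be tied to, plus a reputation score other members can adjust. The following is a minimal sketch of that idea; the class name, field names, and reputation threshold are illustrative assumptions, not any particular site's schema.

```python
# Minimal sketch of a membership record that ties actions to a known
# account. Field names and the reputation threshold are illustrative
# assumptions, not any particular site's design.

from dataclasses import dataclass, field

@dataclass
class Member:
    username: str
    email: str
    verified: bool = False      # set after an email round-trip, if the site uses one
    reputation: int = 0         # adjusted by ratings from other members
    actions: list = field(default_factory=list)

    def record(self, action: str) -> None:
        """Every action is attributed to the account, not to 'anonymous'."""
        self.actions.append(action)

    def can_post(self, threshold: int = 0) -> bool:
        """Anonymous visitors never reach this check; members below the
        reputation threshold can be rate-limited or sent to moderation."""
        return self.verified and self.reputation >= threshold

m = Member("alice", "alice@example.com")
m.record("comment: hello")
print(m.can_post())   # False until the email address is verified
m.verified = True
print(m.can_post())   # True
```

Even this trivial gate creates the psychological shift the passage describes: a member's history is visible and attributable, so misbehavior has a cost that an anonymous visitor never pays.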

Sociologists have long known that communes and other cooperative groups usually collapse into bickering and disband if they do not have clear methods of punishing members who become selfish or exploitative.

Now an experiment by a team of German economists has found one reason punishment is so important: Groups that allow it can be more profitable than those that do not.

Given a choice, most people playing an investment game created by the researchers initially decided to join a group that did not penalize its members. But almost all of them quickly switched to a punitive community when they saw that the change could profit them personally.

The study, appearing today in the journal Science, suggests that groups with few rules attract many exploitative people who quickly undermine cooperation. By contrast, communities that allow punishment, and in which power is distributed equally, are more likely to draw people who, even at their own cost, are willing to stand up to miscreants.

An expert not involved in the study, Elinor Ostrom, co-director of the Workshop in Political Theory and Policy Analysis at Indiana University, said it helped clarify the conditions under which people will penalize others to promote cooperation.

"I am very pleased to see this experiment being done and published so prominently," Dr. Ostrom said, "because we still have many puzzles to solve when it comes to the effect of punishment on behavior."

Dr. Ostrom has done fieldwork with cooperatives around the world and said she often asked other researchers and students whether they knew of any long-lasting communal group that did not employ a system of punishment. "No one can give me an example," she said.

In the experiment, investigators at the University of Erfurt in Germany enrolled 84 students in the investment game and gave them 20 tokens apiece to start. In each round of the game, every participant decided whether to hold on to the tokens or invest some of them in a fund whose guaranteed profit was distributed equally among all members of the group, including the "free riders" who sat on their money. Because the profit was determined by a multiple of the tokens invested, each participant who contributed to the fund enjoyed less of a return than if the free riders had done so as well.

The tokens could be redeemed for real money at the end of the experiment.

About two-thirds of the students initially chose to play in a group that did not permit punishment. In the other group, the students had the option in each round of penalizing other players; it cost one token to dock another player three tokens. All participants could see who was contributing what as the game progressed, and could choose to switch groups before each round.
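The rules described above are a standard public-goods game with costly punishment, and its incentives are easy to see in code. This is a sketch, not the researchers' actual protocol: the fund multiplier (1.6, a common choice in such experiments) and the players' strategies are illustrative assumptions; the cost-one-token-to-dock-three rule is taken from the article.

```python
# Sketch of one round of a public-goods game with costly punishment.
# The multiplier (1.6) is an illustrative assumption; the 1-token cost
# and 3-token penalty follow the rules described in the article.

def play_round(contributions, multiplier=1.6):
    """Each player invests some tokens; the pooled fund is multiplied
    and split equally among everyone, free riders included."""
    n = len(contributions)
    share = sum(contributions) * multiplier / n
    # Payoff change: your equal share of the fund minus what you put in.
    return [share - c for c in contributions]

def punish(payoffs, punisher, target, cost=1, penalty=3):
    """Paying `cost` tokens docks the target `penalty` tokens."""
    payoffs = list(payoffs)
    payoffs[punisher] -= cost
    payoffs[target] -= penalty
    return payoffs

# Three cooperators investing 20 tokens each, and one free rider:
payoffs = play_round([20, 20, 20, 0])
# Pool = 60 * 1.6 = 96, split four ways = 24 each.
# Cooperators net 24 - 20 = 4; the free rider nets a full 24.
print(payoffs)  # [4.0, 4.0, 4.0, 24.0]

# One cooperator pays a token to sanction the free rider:
print(punish(payoffs, punisher=0, target=3))  # [3.0, 4.0, 4.0, 21.0]
```

The numbers show both halves of the dilemma: if everyone had invested, each player would have netted 12 tokens instead of 4, but any individual does best by free-riding, which is why the option to sanction matters.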

By the fifth round, about half of those who began the study in the no-penalty group had switched to the punitive one. A smaller number of students migrated in the other direction, but by Round 20 most had come back and the punishment-free community was a virtual ghost town.

"The bottom line of the paper is that when you have people with shared standards, and some who have the moral courage to sanction others, informally, then this kind of society manages very successfully," said the study's senior author, Bettina Rockenbach, who was joined in the research by Bernd Irlenbusch, now at the London School of Economics, and Ozgur Gurek.

Switching groups frequently prompted remarkable behavioral changes in the students. Many of those who had been free riders in the laissez-faire group eagerly began penalizing other selfish players upon switching. Dr. Rockenbach compared these people to heavy smokers who insist on their right to light up, until they quit. "Then they become the most militant of the antismokers," she said.

Being exploited appeared to cause deep frustration and anger in most students, she said.

Other experts said the results were an important demonstration of how self-interest can trump people's aversion to punitive norms, at least in the laboratory. Out in the world, they said, it is not usually so clear who is free-riding, or even whether a given group is encouraging cooperative behavior in most people.

"The mystery, if there is one, is how these institutions evolve in the first place," Duncan J. Watts, a sociologist at Columbia, wrote in an e-mail message, "i.e., before it is apparent to anyone that they can resolve the problem of cooperation."

It's an emerging rule of thumb that suggests that if you get a group of 100 people online then one will create content, 10 will "interact" with it (commenting or offering improvements) and the other 89 will just view it.

It's a meme that emerges strongly in statistics from YouTube, which in just 18 months has gone from zero to 60% of all online video viewing.

The numbers are revealing: each day there are 100 million downloads and 65,000 uploads - which as Antony Mayfield (at http://open.typepad.com/open) points out, is 1,538 downloads per upload - and 20m unique users per month.
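Mayfield's figure is simple division over the quoted daily totals, and checks out:

```python
# Ratio implied by the quoted daily YouTube figures.
daily_downloads = 100_000_000
daily_uploads = 65_000

downloads_per_upload = daily_downloads / daily_uploads
print(round(downloads_per_upload))  # 1538, matching Mayfield's figure
```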

That puts the "creator to consumer" ratio at just 0.5%, but it's early days yet; not everyone has discovered YouTube (and it does make downloading much easier than uploading, because any web page can host a YouTube link).

Consider, too, some statistics from that other community content generation project, Wikipedia: 50% of all Wikipedia article edits are done by 0.7% of users, and more than 70% of all articles have been written by just 1.8% of all users, according to the Church of the Customer blog (http://customerevangelists.typepad.com/blog/).

Earlier metrics garnered from community sites suggested that about 80% of content was produced by 20% of the users, but the growing number of data points is creating a clearer picture of how Web 2.0 groups need to think. For instance, a site that demands too much interaction and content generation from users will see nine out of 10 people just pass by.

Bradley Horowitz of Yahoo points out that much the same applies at Yahoo: in Yahoo Groups, the discussion lists, "1% of the user population might start a group; 10% of the user population might participate actively, and actually author content, whether starting a thread or responding to a thread-in-progress; 100% of the user population benefits from the activities of the above groups," he noted on his blog (www.elatable.com/blog/?p=5) in February.

So what's the conclusion? Only that you shouldn't expect too much online. Certainly, to echo Field of Dreams, if you build it, they will come. The trouble, as in real life, is finding the builders.

The most important cause of trolling is distance. People will say things in anonymous forums that they'd never dare say to someone's face, just as they'll do things in cars that they'd never do as pedestrians—like tailgate people, or honk at them, or cut them off.

Trolling tends to be particularly bad in forums related to computers, and I think that's due to the kind of people you find there. Most of them (myself included) are more comfortable dealing with abstract ideas than with people. Hackers can be abrupt even in person. Put them on an anonymous forum, and the problem gets worse.

The third cause of trolling is incompetence. If you disagree with something, it's easier to say "you suck" than to figure out and explain exactly what you disagree with. You're also safe that way from refutation. In this respect trolling is a lot like graffiti. Graffiti happens at the intersection of ambition and incompetence: people want to make their mark on the world, but have no other way to do it than literally making a mark on the world.

The final contributing factor is the culture of the forum. Trolls are like children (many are children) in that they're capable of a wide range of behavior depending on what they think will be tolerated. In a place where rudeness isn't tolerated, most can be polite. But vice versa as well.