The moment you identify yourself as a gamer, you become the subject of stereotypes. You’re apathetic, you have a short temper, and you don’t bathe as often as a normal human being should. The word gamer, like any other label, immediately chucks your identity out the window and lumps you in with a group whose identity has been established in part by the gaming industry. The other part comes from the opposition to gaming -- the parents, politicians, and average people who believe games do more harm than good. The opposition I hear most, though, is that video games will ruin players’ social skills.

When I was a kid, I worried what that could mean for the future of my gaming endeavors. My parents were my primary source of income, and so my primary source of new virtual material. If my mother got wind of my deteriorating social skills, it could spell the end of the digital adventures I enjoyed. And what did this mean for other gamers? If games were to blame for boring conversation, other parents would find out and discerning gaming adults would be close behind. It might not just be my end; it might be the end of gaming, period.

As I grew older and played more games, I started to see how the industry strove to combat this obvious problem. More games were designed with complex multiplayer components, allowing players to interact and compete with one another in the same room or remotely. Games were proving themselves to be social experiences, but it wasn’t enough.

As much as the industry tried to provide outlets for a gamer’s social interests, gamers wanted more. While developers began to offer official forums and fansites for their games, gamers were producing their own sites, often more advanced than the developers’ pages. Strategy games had pages dedicated to user-generated maps and custom skins for individual units. First-person shooters had sites for uploading screenshots of your latest headshot, and the gaming industry caught on.

In 2001, Bungie Studios released “Halo,” perhaps the most successful first-person shooter of all time -- that is, until the sequels. Bungie had its own site for players to connect, but the fan base was much bigger than the studio had ever anticipated. Sites like halo.bungie.org and High Impact Halo let players share their tips, tricks and screenshots, just as fans of thousands of earlier games had. The Halo community was clamoring for more; LAN parties and local tournaments weren’t nearly enough.

So Bungie pioneered the best Xbox Live system on the market with “Halo 2.” The game made a few graphical improvements over the original, but the real news was the multiplayer system. Players could connect in matches of up to 16 people and, harnessing the voice capabilities of Xbox Live, communicate and compete. I know, this is all old news, right? As a technological development, yes, but as a player-driven movement, maybe not. “Halo 2” let players view individual stats from every game they played, but it still wasn’t enough. Players created sites that pulled the stats from the official pages for more advanced diagnostics. Teams could fine-tune their strategies, try to lower their death counts, and on and on. It’s not what this allowed players to do that is so interesting, but again, that the players were driving it.

The final “Halo” release took details even further. Bungie developed tools for players to capture screenshots and videos in-game and share them on the spot. Players could compare pictures without getting up and going to a computer. Players were ranked not only by skill, but by duration of play. Yet as much as players wanted to socialize, they were doing so behind a two-way mirror: anonymity limited the extent to which players could know each other.

Developers again saw the patterns apparent in the forums and usergroups all over the web and began creating ways for players to harness their persistent identities in-game. Players are rewarded for longevity, keeping the same name as a gamertag, username, forum profile, whatever. Player interaction continues to become more and more human, mimicking aspects of typical social relationships. Like knowing someone’s name and face, players identify one another by voice, level of achievement, character customization, and so forth.

So what happens when the games allow players to undo their own identities? In another of the most popular games in the world, “World of Warcraft,” players have recently been given the ability to change character names, albeit at a cost. Plenty of players were thrilled by the change. It meant they didn’t have to live with ‘lolipwndu’ for the rest of their gaming experience. It also meant that players who violated the social code of conduct within the game could run free. Empty your guild bank and you’re just $10 away from a fresh start with a lot of gold.

Players have long reacted negatively to attempts at anonymity within the game. Players posting on the official WoW forums are required to post under their account name, but may do so with any character, including a low-level ‘alt’ created just for forum abuse. The most active threads regularly include statements like “stop hiding behind your alt,” or “post on your main, pussy.” The players who value the virtual experience most revile those who long for anonymity.

So where can the games of the future take us? Developers listened to the early desires of the community and developed social systems so advanced that players began to take on persistent identities. There are still ways for a player to undo that identity -- rename the character, or change their gamertag -- but the hardest of the hardcore despise the ability and are probably among those who dump the most time, energy and, of course, money into a game. Can developers afford to cater to the needs of the casual gamer, instead? Are games destined to be a virtual existence, a recreation of “Second Life” with a new face? Where is the middle ground? With each new game comes a new virtual society, and though the base of gamers is always expanding, many citizens of these new worlds are players from another game, for better or worse. And like any society, the gaming world has its share of pariahs. So how do we keep gaming social without becoming overbearing?

The government of our virtual worlds has long been in the hands of the developing community, but as these societies continue to evolve, players treat the worlds more and more like the systems they imitate. Pay-to-play games like “World of Warcraft” have scads of forum posts full of sentiments like, “I pay my $15 a month, I should be able to play how I want.” The fee is treated like a tax the players are willing to pay for free access to anything within the world. The same is true of the Xbox Live service. For the most part, players believe these capitalism-driven entertainment machines should be functioning like a democracy, taking player input as the basis for changing the game. Gone are the days of “Don’t like it? Don’t play it.” Gamers expect their games to evolve.

In other ways, the gamer-developer relationship is more like a government than we think. Instead of revolution, gamers riot and boycott in the virtual world. When things go poorly, players stop playing, or as has been done in several MMOs, attempt to use as much bandwidth as possible in an effort to crash the servers. A developer only has one course of action, which we’ve seen play out across the history books: exile. Players get suspended and more players get angry until something is fixed and the world is back to normal. These are harrowing times indeed.

As virtual society continues to collide with physical society, developers and gamers alike will have to find new ways of dealing with those collisions. While there may not be “governments” for these virtual worlds now, what will the future of gaming hold? That, as they say, remains to be seen.