Compulsion

The following blog post, unless otherwise noted, was written by a member of Gamasutra’s community.
The thoughts and opinions expressed are those of the writer and not Gamasutra or its parent company.

“Ding!”

“Grats!”

Is a really common thing to hear in any online game. The expression originated in EverQuest, where, upon leveling up, a player would hear something like a 40-foot-tall bronze gong being torn in half by another giant gong, with that sound run through someone’s gravelly track & field loudspeaker. It’s not exactly my favorite sound in the world, but then I never got to hear it after days or weeks of grinding, as the harsh audio reward for my long labors.

Players would broadcast their glee by telling nearby players and guildmates, “Ding!”

To great adulation and congratulation. It became such a cultural artifact, such a hallowed ceremony for the first big player organizations, that they brought the tradition into the games they migrated to later: Warcraft, Galaxies, Dark Age of Camelot, City of Heroes, EverQuest 2. The Ding! has since been repurposed not just for levels, but for picking up coveted weapons, learning new skills, sometimes even a chance to practice rough fights. Lately I’ve even heard people who don’t play many games using it after getting promoted at work, moving to a nicer apartment, or getting some hot guy’s telephone number.

“Ding!”

A great deal of the ding’s power rests securely in its ability to inspire pride. It’s a part of fun. In fact, a lot of our language for fun – in this book – relates back to the ding.

Many of the game design languages built specifically for developers, for instance by designers Dan Cook or Benjamin Ellinger, relate strongly back to musical notation. They take specific kinds of fun (or engagement, if you prefer) and specific rewards, then paint them in terms of very subtle dings and very powerful ones. In this way, structuring rewards is central to design; it’s as key as grammar in developing a writing style, or as vital as pacing in cinematography. Especially in games whose systems ask us to perform actions we’ve never tried before, and to navigate systems that might work in stark contrast to the other realities we’ve lived. We often need to use the rewards, the dings, to teach a player simply how to walk around.

It’s inevitable that the artistic structuring of reward would have some crossover with behavioral conditioning. This is especially tricky ground to walk, because the implications of behavioral conditioning are chilling.

“We want to know why men behave as they do,” wrote the father of radical behaviorism, Burrhus Frederic Skinner. “Any condition or event which can be shown to have an effect upon behavior must be taken into account. By discovering and analyzing these causes we can predict behavior; to the extent that we can manipulate them, we can control behavior.”

B.F. Skinner is best known for placing a pigeon, or rat, into a box. Inside these Skinner Boxes he would eventually place a food dispenser, a signal light, an electrified floor, and a push button. By mixing the dispensation of food, signals, and punishing jolts of electricity, he demonstrated that it was possible to train pigeons and rats into any number of behaviors. Pigeons were trained to be more effective than early computer algorithms in guiding missiles towards enemy ships. The only reason they weren’t ultimately used in wartime was that the notion seemed laughable to officers.

In discussing his training of pigeons, and un-training of them, Skinner noted that they remembered what they’d been conditioned to do "… as long as six years after the response had last been reinforced. Six years is about half the normal life span of the pigeon." Skinner’s primary interest isn’t just placing exacting routines into pigeons. He wants to control the behavior of human beings.

We start this process with basic shaping. A boxed pigeon may wander to the left, wander to the right, but once he tilts his head slightly towards the button? We drop a food pellet. He dallies a bit longer, then takes a step towards it? Another food pellet. He taps the button? Maybe more than one pellet. Like a nice mini-jackpot. He taps it again? We give him pellets for every press, and so shape the behavior that we want. Though we can’t keep going indefinitely; eventually the pigeon gets full (or maybe we run out of food pellets).

Thankfully, we don’t need to reward him for every press. So enters the science, in Skinner’s book Science and Human Behavior. He believes that there are ways to structure rewards so as to simply keep pigeons, or human beings, pressing the proverbial button. Skinner doesn’t believe in free will. In point of fact, he writes that “Man’s power appears to have increased out of all proportion to his wisdom.” He holds that it’s the duty of better men to shape their lessers to “productive” ends, else they’ll never, ever, be satisfied.

So after shaping, we use Interval Reinforcement, reinforcing a behavior after a certain amount of time, a certain interval. “We are less likely to see friends or acquaintances with whom we only occasionally have a good time,” writes Skinner, “and we are less likely to write to a correspondent who seldom answers. The experimental results are precise enough to suggest that in general the organism gives back a certain number of responses for each response reinforced.”

Rewards based on the passage of time will never quite be so effective at encouraging behavior as Ratio Reinforcement, where rewards come in response to some behavior. This could be a Fixed Ratio, as when “the schedule of reinforcement depends upon the behavior of the organism itself—when, for example, we reinforce every fiftieth response.” He writes that this is essentially getting paid on commission.

What’s much more effective than every fiftieth press – if our goal is getting a pigeon or player to tap furiously on buttons – is Variable Ratio Reinforcement, where the pigeon doesn’t know quite how many responses it will take: maybe somewhere around fifty, maybe around a thousand. Skinner writes that the “… pigeon may respond as rapidly as five times per second and maintain this rate for many hours.” He continues, “The efficacy of such schedules in generating high rates has long been known to the proprietors of gambling establishments. Slot machines, roulette wheels, dice cages, horse races, and so on pay off on a schedule of variable-ratio reinforcement….The long-term net gain or loss is almost irrelevant in accounting for the effectiveness of this schedule.”

We can also combine schedules, “so that reinforcement is determined both by the passage of time and by the number of unreinforced responses emitted. In such a case, if the organism is responding rapidly, it responds many times before being reinforced, but if it is responding slowly, only a few responses occur before the next reinforcement.” When rats and pigeons are kept in physical boxes, tapping on buttons, it's the ratio schedules, especially variable ratio schedules, that get them tapping in the vain expectation of something. They go from tapping around 50 times – without reward – to thousands. Since they never know when the next food pellet is coming, they keep pressing the button.
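These schedules are simple enough to sketch in code. The following Python toy is mine, not Skinner’s (the ratio of fifty, press counts, and seed are all arbitrary illustrations): it contrasts a fixed-ratio schedule, where exactly every fiftieth press pays, with a variable-ratio schedule, where each press pays with probability 1/50 – the same average payout, but the player can never know how close the next reward is.

```python
import random

def fixed_ratio(n):
    """Reward exactly every n-th response; the payout is predictable."""
    count = 0
    def press():
        nonlocal count
        count += 1
        return count % n == 0
    return press

def variable_ratio(mean_n, rng=None):
    """Reward each response with probability 1/mean_n: the average
    cost per reward is still mean_n presses, but any press might pay."""
    rng = rng or random.Random(42)  # arbitrary seed, for repeatability
    def press():
        return rng.random() < 1.0 / mean_n
    return press

fr, vr = fixed_ratio(50), variable_ratio(50)
fr_rewards = sum(fr() for _ in range(10_000))
vr_rewards = sum(vr() for _ in range(10_000))

print(fr_rewards)  # exactly 200 rewards: 10,000 presses / 50
print(vr_rewards)  # about 200, but the gaps between rewards vary wildly
```

The totals come out nearly identical; what differs is the uncertainty between rewards, which is the whole point of the variable schedule.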

Human beings may or may not need better rewards than food pellets, but our rewards can be so much more subjective. Shiny best-sword-in-game may fit the bill. Perhaps it's shiny best-shield-in-game, or shiny best-horse-in-game. Warcraft provides its players no dearth of variety for their (often game-actualizing) pellets. Last I left Warcraft, one friend was killing foxes on the 1/10,000 chance they'd yield a fox kit. Maybe the momma fox is pregnant? They don’t explain it. Anyway, probably fewer than twenty foxes were out in the world at any given time. When I last logged out he'd been at it awhile, with no dice. But if he got one of those kits? Boy howdy it'd be worth some gold.
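A 1-in-10,000 drop like that fox kit follows a geometric distribution, and the math is bleaker than intuition suggests. A quick sketch (the drop rate comes from the anecdote above; the kill counts are just illustrative):

```python
# Chance of still having *no* fox kit after k kills, at a 1/10,000 drop rate.
p = 1 / 10_000

for kills in (1_000, 10_000, 30_000):
    none_yet = (1 - p) ** kills
    print(f"{kills:>6} kills: {none_yet:.1%} chance of nothing")

# Roughly: 90.5% after 1,000 kills, 36.8% after 10,000, 5.0% after 30,000.
# The schedule is memoryless: after 9,999 empty foxes, the next fox is
# still exactly 1/10,000 -- which is precisely what keeps players grinding.
```

Even after ten thousand foxes – the “expected” cost of one kit – better than a third of players would still have nothing to show for it.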

And there is, always, gold. Skinner noted that with physical internal needs, say food, we can eventually get full. With sex there’s a refractory period. At the very least a Gatorade break. That’s why, when these “better” men went to improve life for their “lessers,” they’d also use Generalized Reinforcers. “The commonest example is money. It is the generalized reinforcer par excellence because, although 'money won't buy everything,' it can be exchanged for primary reinforcers of great variety…the exchange value of money is more obvious than that of attention, approval, affection, or even submissiveness.”

Oddly enough, in near every online world in which I’ve played, near every monster is carrying currency. How did I get seventy-five pieces of silver from that Embittered Dire Wolf? I really, truly, don’t need to know. I’m just glad to have it. I’ll need it for something, down the line, I’m sure.

I’m conditioned.

It's probably not that every big-ticket game designer spends all her time crayoning inside the lines of a B.F. Skinner coloring book. It's no huge surprise when game experiences wind up looking like blatant Skinner Boxes, even with a designer who knows what passion looks like. Not necessarily because she picked up a psychology degree, but because she lived it. Because, whether or not she has any inkling as to why, her imagination thrives, writhes, and lives within the language of systems.

But there is a difference. Even in our simple language of fun, we had eleven unique ways in which these rewards might speak to a player. I hope that the languages on community, place, challenge, and art provide more. I hope they spark insights on the artful crafting of rewards. When the depth of our world is reduced to the cold, hard, simple Ding!, it’s not about the designer providing something unique. The word “compelling” doesn’t even seem to fit, despite being so tied to its sister “compulsion.” At the point of compulsion, designers are being lazy. When their only goal is retention, they deserve less consideration than a sidewalk con artist peddling fixed shell games. This is why compulsion, especially when used to knowingly and willfully create grinds, is so far beneath fun.

B.F. Skinner was a man for whom culture and the arts were naught but the ringing of some dog’s food bell. "Literature, art, and entertainment,” he wrote, “are contrived reinforcers. Whether the public buys books, tickets to performances, and works of art depends upon whether those books, plays, concerts, or pictures are reinforcing."

Which is offensive. To Skinner it’s all about the science, “only the observable in behavior.” Which is nice in theory, but nobody can accurately observe everything. It’s almost painfully clear, the fissure between Skinner and the vitality and soul of the arts. Science without philosophy is like sex without condoms, and in rejecting free will B.F. notoriously infects his theories with a demented megalomania. Rejecting the philosophies that enrich and color our world gives the impression of a cold interior in the boy who once wanted to write fiction. Who abandoned it out of the belief that he had nothing to say. Skinner once remarked, “I do not admire myself as a person. My successes do not override my shortcomings.”

Gamers have been drunk on the deluge of sights and sounds and feeling, because we were the ones seeing something new come to life. But when it works, and more than that you uncover a new piece of vocabulary – so painstakingly carved – in the language of experience? Suddenly new dimensions of understanding, human understanding, become available. Your world is the richer, and all it took was play. That’s not the product of dumbfuck conditioning, it’s a hallmark of art. The two couldn’t be more different.

Novel systems are the height of game design, but they're unreliable at best. Expensive at best. Rather than capture something essential about the processes of human experience, rather than draw us in with the magic of the unknown – so painstakingly carved – many developers have turned to the more fiscally-responsible twisting of behavioral, motivational, even positive psychology. Some of you designers are my friends. Hear me when I say,