I fully intend to stay away from commenting on the whole Activision/Bungie thing, but the graphic on this article is too awesome not to link to. (We'll see if Swami Pachter's prediction here turns out to be as Dr. Panglossy as his last.) Besides, Ryan Geddes at IGN has a more nuanced perspective than I do on this issue.

Russ Pitts' Editor's Note at The Escapist this week links to several interesting takes on the question of whether gaming's glory days are long gone.

Tuesday, April 27, 2010

Welcome to the first entry of a recurring series here on Infinite Lag, Tale of the Tape. Here I'll pit two games against each other, lay out their merits and shortcomings as I see 'em, and judge a victor - the game that better fulfills my Arbitrarily Selected Criteria for Success™. To keep things relatively fair, I'll try to pick games of a similar era, genre, and/or style. Oh, and with any luck, in the process I'll work toward some larger point about the genre or era or whatever. Yeah, we'll see how that goes.

And with that rousing intro...

LADIES AND GENTLEMEN!

In our inaugural battle, we have a slight twist: although Assassin's Creed II and Borderlands are very different games, they both have multiple DLC packs, the first entries of which I've finally gotten around to playing. Since it wouldn't make much sense to make an apples-to-apples comparison here, I'm going to judge these DLC packs with a slightly different slant: which DLC does a better job of adding to the core game experience?

Your Arbitrarily Selected Criteria™ for this bout are:

Feel - how well does the DLC stay true to the spirit and universe of the core game?

Scope - how much new content is there to explore or acquire?

Value - for what it costs, how much value does this DLC add to the core game?

I'll take on the first ASC™ today, and deal with the next two in subsequent posts. And now, without further ado...

IN THIS CORNER...

The Battle of Forli, which I snagged for a cool $4 when it released in late January but didn't play until last week, fills in one of the missing "corrupted memory" slots in Assassin's Creed II, Chapter 12. In this chapter, Ezio returns to the city of Forli and its surrounding wetlands to defend the fiery Catarina Sforza's turf from invaders and to protect a Piece of Eden. Oh, and Niccolo Machiavelli is hanging out there, too. Yeah, I don't know why either.

IN THE OTHER CORNER...

I waited until The Zombie Island of Dr. Ned went on sale this month to buy it for $5, at 50% off. This pack adds several new maps (the aforementioned "Zombie Island") to Pandora, sending your hero on a mission to discover the source of the zombie outbreak at this Jakobs outpost and put an end to it.

ROUND 1: Feel

The central conceit of Battle of Forli, that it's a recently-recovered "missing memory," fits perfectly within the narrative structure of Assassin's Creed II. Since we visited Forli in the core game, there's nothing new here in terms of the art style or environs, but the subplot about chasing after the stolen Piece of Eden does advance the overarching storyline toward its conclusion. No real surprises here: it's more ACII. That's not a bad thing.

The main problem with the feel of Battle of Forli is that it's far too reliant on two of the clunkier aspects of the game: combat with multiple enemies and escort missions. ACII was at its worst when the player was forced to resort to button-mashing to fend off a lot of enemies at once, and that's prominent here. Keeping AI allies alive in escort missions - the bane of many a frustrated gamer - constitutes a significant portion of Battle of Forli. Let me say here, too, that Catarina is a terrific character in the cutscenes, but man, is she generic and dumb in combat. Machiavelli too. I spent half my time in the castle defense sequence figuring out where one of them had run off to so I could make sure some guard didn't run them through. I should add that Machiavelli was a wasted character, as unlike Leonardo Da Vinci in the core game, he doesn't have a distinct personality and his interaction with Ezio doesn't illuminate or exploit what players already know about the historical figure. I would have rather spent my time in Forli chasing down assassination targets (preferably not on a timer, as Battle has you do at one point) and getting opportunities to try out a variety of my maxed-out skills in different contexts. As it was, all I had time or space to do was hack away.

I admit to a little trepidation upon first taking my level 49 Hunter to Zombie Island; I'd read middling reviews of Dr. Ned, as some players felt it went too far afield from the look and tone of Borderlands. I was pleasantly surprised, though. Anyone who's played the core game knows that story isn't exactly a strong point, but this B-movie-inspired schlockfest exudes the tongue-in-cheek goofiness the core game is known for. The public service announcements from the disembodied voice of the Jakobs PR rep and the ironically self-aware audio logs from Dr. Ned himself had me chuckling throughout. That said, the quests - particularly the "Braaaaaains!" sequence - can be tedious collect-a-thons, just as in the core game, but thanks to the funky atmosphere I didn't mind them as much here. The whole experience feels very much like a drive-in horror movie, except in the Borderlands universe. In this sense, I think Dr. Ned actually surpasses the core game in projecting a consistent tone, even though that tone is ridiculous.

There are certainly some irregularities about Dr. Ned when compared to the core game. Yes, the color palette is quite different, trading the often bleached beige shades of the dusty hills for a sickly green tint that openly mimics the Frankenstein monster's skin. It takes some getting used to, but I thought it brought some interesting variety to the game. The enemies are repetitive, but that's no different from the core game. You'll face two types of zombie, carrion birds, werewolves, the bizarre giant Tankensteins, and a few different bosses, including a pumpkin-headed monster. This weirdo, and the final boss, seemed like the biggest stretches in terms of the game universe, but of course Dr. Ned is all about playing up the inherent absurdity of that universe.

The thing I like about Borderlands, and Dr. Ned in particular, is that unlike many other fine game franchises - Assassin's Creed, Mass Effect, Gears of War - it doesn't even come close to taking itself seriously. It's always reminding you that it is a game, not an immersive interactive story. I think that gives Borderlands a good bit of leeway to take some hefty risks in terms of its tone, and it pays off in this DLC.

Monday, April 26, 2010

I'm pretty sure it's a marketing requirement these days that every piece of copy written about every game franchise from God of War to Pokemon has to describe the story, setting, characters, and/or gameplay as "epic" at least once.

That's all as it should be; what would marketing copy be if it weren't always hyperbolizing?

But for crying out loud, people, stop feeding into it. Not every goddamn Halo 3 headshot is "epic." That one YouTube video of the kid faceplanting off the trampoline is not "epic." The party you and your bros had last night where Donnie did like 15 Jagerbombs and passed out on the porch and one of the bros drew hairy balls on his forehead was not "epic."

This is what the word "epic" means. The term does not apply to Donnie's balls-adorned forehead unless one or more of the following conditions obtain:

1) Said balls are 50 feet tall.

2) You are describing a lengthy poetic composition about the heroic bro who drew said balls.

3) Said heroic bro had to accomplish Herculean feats of bravery in order to draw said balls.

As an English major, I'm sad to see this once-meaningful word diluted to this level of douchebaggery. As a gamer, I'm just burned out. There's only so much room in my mind for grandeur and awe, and space is at a premium right now.

Wednesday, April 21, 2010

This past weekend, Mrs. JPG and I drove to upstate New York for a friend's wedding. We stayed at a "casino resort," where we felt severely underdressed, as at no time was either of us wearing any of the following items of apparel:

camouflage baseball hat

NASCAR-branded suede jacket

cellphone clipped to the waistband of sweatpants

Despite our sartorial inadequacy, we managed to have a great time celebrating and catching up with old friends, dancing poorly, and losing only a modest percentage of our discretionary income.

On our way to the car Sunday morning we walked by one of the gaming areas, where the pleasant harmony of chiming slot machines was inexplicably drowned out by some hideously generic pop music blaring from the ceiling speakers. No plinking coin sounds, no faux-glockenspiel ascending melodies, no resounding jackpot klaxons. I remarked on this curious fact to Mrs. JPG, who obligingly feigned interest. I love that girl.

Still, it seemed odd that the casino wouldn't want the sound of the machines - which is a major subliminal draw - to prevail. As a musician, I've always loved walking around the slots just soaking in the harmony of the machines. (The 50-second "hidden track" 4:18 into Radiohead's "Motion Picture Soundtrack", which replicates this sound, is sublime to me.) Given the depth of planning and understanding of human psychology that goes into casino design, the choice to obscure the machines' sounds with piped-in Top 40 crap was doubly bizarre.

This got me thinking about the way sounds in games condition us psychologically. I'm talking about both "external" sounds (e.g., soundtrack cues) and those that exist within the game world (e.g., an approaching enemy's battle cry). There are a couple of interesting applications I've noticed:

1. The reward sound. This sound indicates the accomplishment of a task or conquering of an obstacle. The most obvious example is the "Achievement Unlocked" bleep-bloop on the 360, which is, I guess, a sort of meta-cue. Somehow this brief, very recognizable sound signals satisfaction for millions of gamers and keeps them on the hunt for further rewards. In some cases, like in World of Warcraft, the sound itself becomes a synonym for an accomplishment in discussion ("I dinged level 60"). "Victory music," heard after winning a battle or clearing a level (see: the Final Fantasy or Dragon Quest series) also fits here. The guys at Giant Bomb refer to this as a "Victory Gong" or "Win Chime."

2. The failure/try-again sound. This sound or motif, which often plays when the player dies, reinforces the failure to overcome an obstacle the player has just witnessed on screen, and presumably provides motivation for the player to try again. The famous blooping descending riff in Pac-Man or the "chum-chum-chum" motif when Shepard dies in Mass Effect are good examples. This type of cue is seldom a sound that occurs within the game world.

3. The warning sound. This signals the player that danger is approaching and that s/he should prepare. A perfect example would be the ominous three-note riff that plays in Left 4 Dead when a zombie horde is near. This game, in particular, uses an audio cue to powerful effect: tension spikes as soon as the player hears the notes, before the enemies even appear on screen.

4. The ready sound. This sound confirms that a task has been completed and the player can now proceed to another task. Think of newly-created marines saying "Gimme somethin' to shoot!" as they emerge from the barracks in StarCraft. It can also signal that a certain ability is ready to be used: the cooldown period for a charging skill is complete, or a weapon is reloaded. The dialogue cues in Gears of War when the player succeeds or fails at an active reload attempt are an example.

5. The acquisition sound. This is an audio cue to confirm what the player has just seen on screen when the character acquires an object. Collecting coins in Super Mario Bros. is the classic example. The "shick-shack" sound you hear when picking up ammo clips in every FPS since Wolfenstein 3D is another. In some cases, without the combination of visual and audio cues, the player may not realize that s/he has actually picked up the item. This type of cue is related to the reward sound, but occurs much more frequently.

6. The damage sound. This sound, which is often combined with a visual cue, indicates that the player has sustained damage or has inflicted damage on an enemy. The "shwip" sound coinciding with a flash of red in games like the Legend of Zelda or Dragon Quest series indicates the enemy has taken a hit. In an FPS this typically manifests as grunts of pain from the main character, often accompanied by tactile feedback from a rumbling controller and/or flashes of red or blood spatters to indicate the direction damage is coming from.

7. The instruction sound. This type of cue reinforces a visual instruction the game is giving the player, usually outside of the immersion of the game world. In Double Dragon, a "ding-ding-ding" sound accompanies the flashing hand pointing in the direction the player is supposed to go when a group of enemies has been defeated. The Mortal Kombat series uses a musical riff at the end of a fight (along with the famous declaration "FINISH HIM!") to prompt the player to perform a Fatality; the music vamps for a few seconds to remind the player s/he only has a limited time window to perform the move.

I'm sure there are plenty of other types of audio cues I'm leaving out, but these are the ones that struck me as having a particularly effective Pavlovian slant. Interestingly, I find it difficult to play most games with the sound off or while playing other music; I need these cues to help me process my progress through the game. Like the harmonies of ringing slot machines, I guess I'm just subliminally tuned in.
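For the programming-inclined, most of these cues boil down to a simple event-to-sound mapping under the hood. Here's a minimal, purely illustrative sketch of that idea - every name below is invented for this example, not taken from any real engine:

```python
from enum import Enum, auto
from typing import Optional

class Cue(Enum):
    """The seven cue types described above."""
    REWARD = auto()       # achievement unlocked, victory gong
    FAILURE = auto()      # death jingle, try-again motif
    WARNING = auto()      # horde incoming
    READY = auto()        # unit trained, cooldown finished
    ACQUISITION = auto()  # coin collected, ammo picked up
    DAMAGE = auto()       # hit taken or dealt
    INSTRUCTION = auto()  # "go this way," "FINISH HIM!"

# Map in-game events to the cue that should play for each.
EVENT_CUES = {
    "achievement_unlocked": Cue.REWARD,
    "player_died": Cue.FAILURE,
    "horde_approaching": Cue.WARNING,
    "cooldown_complete": Cue.READY,
    "item_picked_up": Cue.ACQUISITION,
    "enemy_hit": Cue.DAMAGE,
    "objective_marker": Cue.INSTRUCTION,
}

def cue_for(event: str) -> Optional[Cue]:
    """Look up which audio cue an event should trigger, if any."""
    return EVENT_CUES.get(event)
```

The Pavlovian part is that the mapping stays consistent: the same event always triggers the same sound, so the player learns the association without ever thinking about it.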

Tuesday, April 20, 2010

(And yes, I realize that for a blog predicated on the idea of not being topical, I'm doing a lousy job so far of being late to the party.)

Fair warning: I had fully intended to remain comfortably mum on the roiling tempest that is Ebertgate 2010, but I just can't hold my tongue any longer.

Not because videogames truly ARE art, and must be defended as life-changing works of beauty and grandeur. Come on. In my experience, debating whether something is "art" is about as productive as a monkey trying to hump a football, and much less entertaining.

Not because I need to voice my righteous indignation at Ebert's dismissive view of videogames and how they affect us, or to point out the various ways in which Ebert misunderstands the games he discusses. Others have already done that. A lot of others. Years ago, even.

"I may be too old to 'get' video games," Ebert tweeted yesterday, "but I may be too well-read." Yowch.

For decades, Ebert has enjoyed a well-deserved reputation for curmudgeonliness. As a budding curmudgeon myself, I consider the man a role model. Not for nothing is he an internationally-recognized icon of the entertainment world. Sure, there's the whole "legendary movie critic" thing, but let's be honest here: it's really about the snark. Dude is a scientist of snark.

Which is why the sentiment above finally ended my silence. The thing is, while it might be considered witty by some - and by "some" I mean Ebert's fellow self-described "ancients" - it commits the cardinal sin that pushes snark across the line from "playful" to "just mean": it's not funny. Or remotely true. It's even more bizarre considering he wrote this just hours before. Feels suspiciously like baiting to me.

(In another tweet, Ebert marvels at how his videogame article could have garnered nearly 2,000 comments from gamers, while a piece he wrote on the health care debate went largely ignored by that same virtuous and vocal audience. Um. I'll let that question answer itself.)

Of course, I could be wrong. I sincerely hope Ebert's latest round of tweets is just good-natured ribbing from an old goat who knows he's pushing your buttons. In fact, I'm beginning to wonder if the problem is that we're so intent on defending our passion and tearing down stereotypes that we're not getting the joke.

Ebert's made the same argument about videogames before, so he knew what he was getting into when he reopened this can of worms. He also knows that no one can win in a debate about "art" - but that provoking the debate, especially among gamer circles, would turn a lot of heads.

But please, let's not ask the "was the whole flap manufactured?" question. Somehow, Ebert doesn't strike me as the type to pull stunts for page views. His piece itself is relatively innocuous; it's not advocating for a government ban on videogames or making grandiose claims about the blinking lights rotting our children's brains. I don't think he has any agenda other than stating an opinion in his usual forceful way.

Look, Roger Ebert is a good sport who has developed a very thick skin over the years. The man fought cancer and lost his ability to eat and to talk and is still happier than you. He's not worried about your internet rage.

So by all means, keep trying to prove him wrong. But please, for all our sakes, do it with a sense of humor, will you?

Wednesday, April 14, 2010

The May issue of Game Informer arrived in my mailbox yesterday, and was a good deal more thought-provoking than one might expect from this cover.

GI's op-ed feature this month was a rant by Scott Jones, formerly of Crispy Gamer, ostensibly about age bias in games journalism. His piece isn't online, but it's certainly worth a read if you have a copy of the mag.

The unceremonious sacking of Crispy Gamer's - sorry for the terrible pun - seasoned editorial staff earlier this year has already been documented and discussed. The short version: a site devoted to mature, intelligent game analysis and discussion, seemingly on its way to profitability, crashes spectacularly when upper management decides to pull the plug. Cue immediate Twitter uproar.

Jones' op-ed attributes the demise of his site only partly to ageism - he and his veteran team are replaced by a cheap 20-year-old intern - but more to apathy among the target audience, who seemingly wanted CG's product but didn't follow through by generating traffic. He doesn't reveal his age at industry events, he says, for fear he'll be dismissed as a relic. He ends his article with a call to action: if you gamers want this kind of mature journalism, do your part. Show up to our site.

Look, I have a tremendous amount of respect for what Jones and writers like him are doing. That's why I love sites like Gamers With Jobs and Gamasutra. If we're going to overcome the stigma video games and gamers continue to endure in popular culture - a stigma that is sometimes well-earned, judging by the imbecilic Spike VGA Awards shows or any given multiplayer session of Modern Warfare 2 on Xbox Live - we need sharp, insightful writing that gives the medium the critical rigor and respect it deserves. We need places for gamers who are not mouth-breathing 14-year-old homophobes to gather and form communities. We need veteran writers who have the pedigree, experience, and talent to turn a critical eye to this increasingly visible (and profitable) industry.

I don't know - and the former editors themselves might not know either - what really caused management to clean house. I don't know how Crispy Gamer's business model worked, or what their marketing strategy was, or what other behind-the-scenes factors led to the layoffs. The company could have had obscene overhead costs, or the suits at the top could have just been heartless dicks. But it's not like this is the first time, especially in this industry, that veteran staff have been kicked to the curb for cheap labor. And arguing that the gamer audience's fickleness and the game journalism industry's instability are somehow due to age bias seems to conflate one issue with another.

The onus is on any media outlet to identify, target, and engage its audience. Given the success and longevity of sites like Gamasutra and GameCritics (where Jones himself is a critic emeritus), I find it hard to believe the audience isn't there or isn't dedicated. True, mature gamers are a small and difficult segment to reach effectively - and it's probably more difficult to monetize a site geared toward rational adults than teenagers with disposable income and severe attention deficit problems. But people don't just show up to your site because you post great content. Many of the people I've heard talk about Crispy Gamer say the first time they heard of it was when the layoff scandal broke. That was true for me also. That's unfortunate - CG would have been right up my alley - but I think that's more a failure of marketing than of audience loyalty. Sorry, but I'm not to blame for not seeking you out.

But come find me, give me a reason to stick around, and I'll happily give you all the page views and clickthroughs you need.

Tuesday, April 13, 2010

While working on an article about morality systems in games yesterday, I came to the odd realization that I am apparently an unswervingly virtuous individual. This was especially odd given that I consider modesty and humility chief among my many, many virtues.

It's true: I can't make my character evil. I just...have to be the good guy.

Whether it's subconscious compensation for latent post-Catholic guilt, or a more esoteric drive to see myself, via my avatar, as a beacon of righteousness: when given a moral choice, I just can't let my characters embrace the Dark Side.

The rampant killing of virtual human beings and destruction of pixellated property isn't the problem. Like most of us, I've been slaughtering sprites since I was in grade school. And it's not like I haven't committed mass murder many times over in probably 50-75% of the games I own. Assassin's Creed II, whose very title makes clear that the professional extinguishing of life is the player's goal, was one of my favorite games of the past year. Even if Wolfenstein 3D hadn't thoroughly desensitized me, I got over any remaining squeamishness about video game killing when I first chopped a dude in half with Kung Lao's razor hat.

Besides, in most games, there's some kind of loose conceit that justifies, or at least explains, the casual homicide of enemies, human and non-human. I'm a scrappy survivor forced to slaughter tens of thousands of zombies? I consider that a public service. I'm a Sith apprentice rooting out the Rebellion for Lord Vader? You and your Ewok friends can enjoy a lightsaber to the face. I'm the protagonist in any GTA game? Well, you get the idea.

No, what truly trips me up is the games that presume to give you a choice, the games that have a morality system that tracks your progress toward angelic goodness or pure malevolence. The article I'm working on examines these morality systems in more detail. Suffice it to say, my issue isn't that these systems are generally simplistic black-and-white affairs - although that's a problem, too - but that for whatever reason, I just can't bring myself to go down that bad boy path.
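(If you're curious just how simplistic these black-and-white systems tend to be: most of them amount to a single running number with a couple of thresholds. A toy sketch - all names, values, and thresholds invented here, not drawn from any actual game:

```python
class MoralityMeter:
    """A bare-bones alignment tracker of the kind these games use."""

    def __init__(self) -> None:
        self.score = 0  # positive = good, negative = evil

    def record_choice(self, delta: int) -> None:
        """Shift alignment: spare the villager (+10), kick the puppy (-10)."""
        self.score += delta

    @property
    def alignment(self) -> str:
        """Bucket the running score into the familiar three labels."""
        if self.score >= 50:
            return "Paragon"
        if self.score <= -50:
            return "Renegade"
        return "Neutral"
```

One number, three buckets. Small wonder the moral "choices" feel binary.)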

Maybe it's just a matter of removing myself from the game world, of taking the whole experience less seriously. The first thing many of my gamer friends do at the character creation screen - after naming their avatar Sir DeezNutz Taintface - is distort him to look like Ugly Shepard. Me? I don't even want to give my guy a handlebar mustache.

I failed in Fallout 3. I failed in both Mass Effects. I'm failing miserably right now in Dragon Age. But I promised myself, after finishing the brilliant Mass Effect 2 in a whirlwind couple of weeks, that I had to force myself to be evil. I had to go back for a second playthrough, this time as a Renegade. I owed it to myself to see how the experience would change, what new story avenues and character relationships would emerge.

Sunday, April 11, 2010

The fact that it's April 2010 and I'm just now starting a gaming blog - how quaint! - should tell you something about how perennially one step behind I am.

At the risk of projecting a certain Get Off My Lawn curmudgeonliness, let me confess to a tiny bit of pride in that fact.

I've never bought the latest gadget on release day. It takes me at least three months to come round to buying a new release game, and only then if it's on sale. For crying out loud, I just got a Steam account a month ago.

In the race to keep up, I'm going to lose. Fortunately, I've never been as interested in keeping up as I am in digesting what's happened.

That's what I hope Infinite Lag will become - a venue to examine the trends, ideas, design concepts, politics, and contradictions that shape gaming and our passion for it. And to do so in a way that's not necessarily obsessed with what's on the horizon, but rather how what has happened has impacted what's happening. Think of it as the gaming industry equivalent of the Slow Movement.