Real time invades blurb ‘burb

Post houses adapt vidgame tech for film

Techies have long predicted that the kind of videogame technology used to render animation that runs in real time would soon surface in other arenas of animation production. Yet the so-called “machinima” clips created with game technology seem largely confined to YouTube, while professional-quality examples have appeared only sporadically.

But one project from an ad agency and others commissioned by the National Hockey League suggest the time for real time may be at hand.

Los Angeles-based ad agency the Ant Farm has been adapting game animation processes to create promotional material for its clients, including a Jeep campaign set in Activision’s “Call of Duty” virtual battlefields. The Ant Farm’s approach involves culling game elements and fashioning shots that evoke the feeling of game play.

Executive creative director Rob Troy says his team looks at the assignment as a live shoot in a location that exists within the game environment. “There’s an airfield with chaos going on and ‘guys’ running around,” he says. “We’ve just ‘directed’ them to run in specific directions. It’s like working with digital actors. Their paths are changed, and so are their vehicles. Then it’s shot over and over again in order to get the correct coverage we need.”

Troy notes that the process isn’t truly a real-time rendering approach — at least not yet. “It’s moving a little slower than real time: It’s not happening at 60 frames per second,” he says. “Basically, it’s like we’re shooting with an overcrank.”

As someone who’s been creating videogame trailers for a dozen years, Troy is happy that actual game elements now look sophisticated enough to pass muster in commercial advertising, and he feels the days of promoting videogames themselves with slick, pre-rendered cinematics may be ebbing. “If you were advertising a theatrical movie, you would show images from the film, and games are no different,” he says.

A more unexpected application of game technology for animation has been seen in recent months on the Jumbotrons in arenas hosting National Hockey League games.

The NHL asked Stan Lee’s POW! Entertainment to design superhero characters representing each of the league’s 30 teams, as part of what the NHL called its “Guardian Project.” The assignment to animate the characters went to Vicon House of Moves, well known for its expertise in motion-capture. But recording movement was only half the battle; turning that data into bigscreen animation was the toughest part of the challenge.

With just a few months to deliver a seven-minute piece featuring all 30 teams’ Guardians, House of Moves chose Epic Games’ Unreal Engine to render the clip. The approach enabled the company to light and render scenes interactively, speeding up the process considerably.

“Our client wanted to execute this in multiple arenas,” House of Moves production veep Brian Rausch says. “The Guardians needed to show up on giant Jumbotrons as well as in less sophisticated venues with different video requirements.”

The company originally considered using a traditional rendering pipeline, but quickly realized that if it had to re-render everything to fit different formats, there was no way the job could be finished in time.

“So we turned to game technology,” Rausch says. “The Unreal Engine gave us the flexibility to tell our story but not (have to) spend thousands of hours of machine time re-rendering. We were able to do larger frame rates and higher resolution without slowing things down.”

By running characters through a real-time renderer, House of Moves was able to produce a frame every 32 seconds. By comparison, it can take a computer hours to crunch the numbers and render a single CG-laden frame from a feature film.
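House of Moves hasn’t published its frame counts, but the reported per-frame times make the savings easy to ballpark. A minimal back-of-envelope sketch, assuming 24 fps playback and a hypothetical three-hour offline render per frame (only the 32-second engine figure comes from the article):

```python
# Rough comparison of render pipelines for the seven-minute Guardians piece.
# The 32-seconds-per-frame figure is reported in the article; the 24 fps
# playback rate and the 3-hour offline render per frame are illustrative
# assumptions, not reported numbers.

FPS = 24                        # assumed playback frame rate
DURATION_S = 7 * 60             # seven-minute piece
FRAMES = FPS * DURATION_S       # total frames to render

ENGINE_S_PER_FRAME = 32         # real-time-engine pipeline (from article)
OFFLINE_S_PER_FRAME = 3 * 3600  # hypothetical film-style offline render

engine_hours = FRAMES * ENGINE_S_PER_FRAME / 3600
offline_hours = FRAMES * OFFLINE_S_PER_FRAME / 3600

print(f"{FRAMES} frames to render")
print(f"engine pipeline:  ~{engine_hours:.0f} machine-hours")
print(f"offline pipeline: ~{offline_hours:.0f} machine-hours")
```

Under those assumptions the engine pipeline finishes in roughly 90 machine-hours, versus tens of thousands for a film-style render; and because the engine output is interactive, reformatting for different arena screens doesn’t restart the clock.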

The crossover of this technology into different areas of animation may accelerate as more people with backgrounds in film and television adapt strategies from the game world. For instance, House of Moves’ Guardians team included Alberto Menache, a vfx expert whose film credits include “Spider-Man” and “Superman Returns.”

For now, people in the film world typically rely on software like Pixar’s RenderMan and Mental Images’ mental ray, which have evolved over decades, but Rausch expects that may change — especially for animation projects that aren’t photoreal.

“Production is pushing this,” Rausch says. “Everybody needs it better, faster and cheaper. There’s an opportunity right now, even on the film side, and people are stepping up to try to deliver that. Every technology needs to watch its back.”