
The new science fiction film Replicas, starring Keanu Reeves, achieved some impressive visual effects on a relatively modest $30 million budget. The secret: a suite of game-development animation tools for modeling and motion capture. Replicas may have struggled at the box office, but it just might portend the future of filmmaking, one where smaller projects can achieve cutting-edge special effects at a fraction of the cost typically paid by major studios.

The industry's most popular tool for CGI rendering is Maya, a 3D computer graphics and modeling program that owes its popularity in part to its openness to third-party software: users can adapt the suite into their own customized versions. (Large movie studios in particular tend to write piles of custom code for their productions.) Maya offers a wide range of specialized tools to realistically emulate the dynamic properties of complicated things like steam, blowing leaves, tornadoes, hair, fur, smoke, clouds, explosions, and clothing and other fabrics: just about anything Hollywood can dream up.

But Maya is expensive and can quickly eat up a production budget just in the design and pre-visualization stages. A single four-minute pre-viz sequence can cost $350,000 or more. That's why Replicas Executive Producer James Dodson brought in a cheaper alternative for those early stages: iClone, a suite of animation software tools frequently used by game developers.

Dodson is a former vice president of production at 20th Century Fox, working on such CGI-heavy blockbuster films as X-Men and Planet of the Apes. He has since carved out a successful career figuring out "how to do what the big boys do, without big-boy money," he said. "[Filmmaker] James Cameron has had those tools for a decade, but Avatar cost $200 million." He's been an iClone fan for several years and feels the software has finally matured into something professional filmmakers with tight budgets can use to get cool CGI effects for a lot less money.

Man in motion

Image gallery (all photos: Entertainment Studios):

- Keanu Reeves testing the motion-capture system used to create the CGI sequences in Replicas.
- William Foster (Reeves) manipulates a digital representation of the human brain.
- The same scene in development.
- Android body awaiting an upload.
- Foster attempts to upload his own consciousness into the android body.
- Trying out the new android body.
- Game-development software helped power the motion capture for Replicas.
- iClone offers real-time modeling of the body, face, and hands.

Granted, iClone is not intended to create final renders; you still need Maya for that. The biggest opportunity for cost savings lies in the critical pre-visualization stage of development. "Everyone loves pre-viz, because it takes it out of the director's imagination and puts it where we can all see where the camera goes," said Dodson. That, in turn, helps the production team figure out what kind of equipment it will need (Steadicams, drones, cranes, lifts) to make the director's vision a reality.

With Replicas, there really wasn't any budget for pre-visualization, yet Dodson knew that stage would be critical for the film's robot sequences (especially the final big action sequence). iClone provided a handy substitute for expensive Maya renderings, letting him move the camera, light his subjects, choreograph the actors, add scenery, and so forth. And the software is now able to capture facial features and expressions along with body movement, in real time, in a single stream.

Reeves was also an executive producer on the film and took a strong interest in the motion-capture work: more often than not, he was the one in the suit performing the required motions. The film grapples with whether it's possible to upload somebody's consciousness into a synthetic human brain attached to an android body. Eventually, Reeves' character, synthetic biologist William Foster, uploads his own consciousness into the android. So it was vital for the filmmakers to capture not just his static physical features but also his distinctive gait and behaviors.

"It was this elegant balance between mechanized functionality and having a sense that Foster's consciousness has been imbued into this robot," said Dodson. "The mo-cap gave us his personality."

It also proved to be a challenge getting the shots Director Jeffrey Nachmanoff wanted for the minivan Foster drives in the film, which ends up plunging into a river. The pre-visualization modeling with iClone showed that the team would have to remove the back seat and back doors to fit a camera in the minivan. Another shot required cutting the minivan in half. In the end, Dodson had four identical minivans for the shoot: one to cut in half; one to dangle from a crane and drop into the river; one to go into a water tank for underwater sequences; and a fourth for actual driving around. "We could not say, 'No minivans were harmed in the making of this film,'" he joked.

In another scene, Foster is shown manipulating a digital display map of the human brain, moving data around via gestures—something we've also seen Tony Stark do in the Iron Man movies. That took over three months to design. "What does a human brain look like, floating in the lab? Is it transparent? Does it move? Can it rotate?" said Dodson. "We used iClone to start generating ideas from scratch of what the data should look like."

That's how Dodson and his colleagues discovered that their original choice of a blue palette wouldn't work, given the blue-hued lighting used on set. iClone helped them avoid a potentially costly design error. "We originally had this beautiful blue font, which disappeared when the room was lit blue," he said.

Dodson believes tools like this are "a great democratizer. District 9, for example, could have been made for a fraction of the cost today, using this," he said. "Ten years down the line, you're going to be able to do what Cameron did with Avatar, using motion capture to go to full CG characters and create a full CG world. These real-time engines are going to be the driver of that."