To streamline the large volume of greenscreen work required for ABC’s new sci-fi series V (which reimagines the popular 1983 miniseries of the same name), artists and technicians at Zoic Studios combined off-the-shelf technologies with proprietary coding techniques to create Zeus, the Zoic Environmental Unification System. The system, which V director of photography Stephen Jackson describes as “brilliant, really,” provides a clear picture of how the greenscreen footage, shot onstage in Vancouver, will integrate with Zoic’s virtual sets.

Two days of each episode’s eight-day schedule are spent on a greenscreen stage. Lightcraft Technology’s Previzion and Airtrack solutions capture 3-D camera-tracking data, and Zeus transfers the data from Lightcraft’s 3-D engine into Zoic’s database, where it can be pulled up as a 3-D file in Maya. The filmmakers can see the greenscreen elements composited in real time on set, no matter how the shot was captured. “The beauty of it is that everybody can be on the same page,” says Jackson. “You can stand on a greenscreen set, look at a 24-inch monitor and say, ‘Okay, the actors will walk down this hallway, turn left and go into that door,’ just like you’re on a real set. To do that quickly and efficiently is one of Zeus’ major selling points.”
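The hand-off Zeus performs — per-frame camera data out of Lightcraft's engine, into a database, and up as an animatable 3-D file — can be pictured with a small sketch. The field names and data layout below are invented for illustration (Lightcraft's actual format is not described in the article); the sketch simply reorganizes per-frame camera samples into per-channel keyframe curves of the kind a package like Maya keys onto a virtual camera.

```python
# Illustrative sketch: reorganizing per-frame camera tracking samples
# (position, rotation, focal length) into per-channel keyframe lists.
# Field names are hypothetical, not Lightcraft's or Zoic's schema.

CHANNELS = ("tx", "ty", "tz", "rx", "ry", "rz", "focal")

def samples_to_curves(samples):
    """samples: one dict per frame, e.g. {"frame": 1, "tx": 0.0, ...}.
    Returns {channel: [(frame, value), ...]} keyframe curves."""
    curves = {ch: [] for ch in CHANNELS}
    for s in sorted(samples, key=lambda s: s["frame"]):
        for ch in CHANNELS:
            curves[ch].append((s["frame"], s[ch]))
    return curves

take = [
    {"frame": 1, "tx": 0.0, "ty": 1.6, "tz": 5.0,
     "rx": 0.0, "ry": 0.0, "rz": 0.0, "focal": 35.0},
    {"frame": 2, "tx": 0.1, "ty": 1.6, "tz": 4.9,
     "rx": 0.0, "ry": 1.5, "rz": 0.0, "focal": 35.0},
]
curves = samples_to_curves(take)
print(curves["tx"])   # [(1, 0.0), (2, 0.1)]
```

Once the data is keyed per channel like this, the same take can drive a virtual camera in any 3-D application, which is what lets everyone on set see the composite from the live camera's point of view.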

The immediate access to the composite also enables Jackson to tailor the on-set lighting to the virtual set. “We’re giving cinematographers the ability to see the lighting they design for the virtual set in real time, so they can match the lighting on the greenscreen set better than they could before,” says Andrew Orloff, Zoic’s executive creative director/visual-effects supervisor. Making the combination of practical and virtual lighting even more seamless, Zeus incorporates light profiles from lighting manufacturers to accurately mimic the characteristics of particular fixtures in the virtual environments. “We’ve done shots in medical labs on the alien ship that have a big shaft of light, with everything else dark,” says Jackson. “With the composite right there, I know exactly where to put our lights. It’s just a fantastic aid.”
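The fixture-matching idea — using a manufacturer's published photometrics to predict what a real light will do in the virtual set — rests on basic photometry. The sketch below is a minimal illustration, assuming only a point-source intensity figure and the inverse-square law with a cosine-of-incidence term; real manufacturer profiles (IES files, for instance) carry full angular intensity distributions, and the candela value here is invented.

```python
# Minimal photometric sketch: illuminance from a point-source fixture
# via the inverse-square law and cosine of the incidence angle.
# The 10,000 cd figure is hypothetical, not any real fixture's rating.

import math

def illuminance(candela, distance_m, incidence_deg=0.0):
    """Illuminance in lux: E = I * cos(theta) / d^2."""
    return candela * math.cos(math.radians(incidence_deg)) / distance_m ** 2

# A hypothetical 10,000 cd fixture, head-on at 2 m, then at 4 m:
print(illuminance(10000, 2))  # 2500.0 lux
print(illuminance(10000, 4))  # 625.0 lux
```

Doubling the throw quarters the illuminance, which is the kind of behavior a virtual light built from a fixture profile has to reproduce before the on-set and CG lighting will match.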

Cinematographer Eric Adkins, who shot the bluescreen production Sky Captain and the World of Tomorrow (AC Oct. ’04), says Zeus’ benefits are easy to appreciate. “The biggest challenge [of shooting for composite] is building in interactivity to give the set a sense of presence,” he observes. “If the cinematographer doesn’t know what set is supposed to be behind the actors, it’s difficult to create a believable environment. Zeus and similar systems provide a lot of reality-based cues, so you can look out for anything that conflicts with the composition. It minimizes the compromises, so you can be more truthful to your intentions.”

On V, Jackson uses Arri D-21s, capturing images in 4:4:4 color space, and works with digital-imaging technician Tasos Mentzelopoulos. “We use two Cine-tal monitors, and Tasos runs a laptop with Iridas SpeedGrade OnSet to color-time every shot,” explains Jackson. “He sends a little thumb drive that has our look-up tables along with the footage, and the dailies we get back from Technicolor Vancouver are bang-on.”
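What a look-up table actually does in this pipeline can be shown with a toy example. The sketch below applies a simple per-channel 1-D LUT to 8-bit pixel values; the gamma curve and pixel data are invented for illustration, and SpeedGrade OnSet uses its own LUT formats rather than anything this simple.

```python
# Illustrative sketch: a per-channel 1-D LUT applied to 8-bit pixels,
# the basic mechanism behind carrying a "look" from set to the lab.
# The gamma value and sample pixel are invented for illustration.

def build_lut(gamma=0.8):
    """Precompute an output level for each of the 256 input levels."""
    return [round(255 * (i / 255) ** gamma) for i in range(256)]

def apply_lut(pixels, lut):
    """Map each (r, g, b) pixel through the table, one lookup per channel."""
    return [(lut[r], lut[g], lut[b]) for r, g, b in pixels]

lut = build_lut()
print(apply_lut([(0, 128, 255)], lut))  # [(0, 147, 255)]
```

Because the transform is just a table, it travels easily — which is why a thumb drive of LUTs alongside the footage is enough for the lab to return dailies that match what the DIT dialed in on set.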

As the footage passes from Jackson’s hands into the editorial pipeline, the Zeus system continues to streamline the workflow. “In a typical visual-effects pipeline, we wait to get the footage, then we track it, key it, render the set and then start massaging the composite,” says Orloff. “With Zeus, we get to that massaging point immediately, allowing us to rapidly generate material for editorial.

“It’s really difficult to lock the cut when you don’t have the visual-effects elements and you’re looking at the actors in limbo [in a greenscreen shot],” Orloff continues. “We can lock down that editorial process earlier. And because we’re working in an Avid-based environment, the auxiliary time code maintains the time stamp that was written onto the original master tapes. The lab usually doesn’t transfer all of the takes to dailies, just the director’s selects. The auxiliary time code is a constant reference back to the original time code, regardless of how the shots are laid down onto the selects reel.”
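The auxiliary-timecode idea Orloff describes can be pictured with a little timecode arithmetic. The sketch below converts SMPTE-style HH:MM:SS:FF strings to frame counts and back (assuming non-drop-frame counting at 24 fps) and shows the constant mapping that lets any frame on a selects reel point back to its position on the original master. The function names and example values are hypothetical, not Avid's API.

```python
# Sketch of non-drop-frame timecode arithmetic at 24 fps. Given a take's
# auxiliary (original-master) start timecode and where it was laid down
# on a selects reel, any reel timecode maps back to the master.
# Names and values are illustrative, not Avid's API.

FPS = 24

def tc_to_frames(tc):
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(n):
    f = n % FPS
    s = n // FPS
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{f:02d}"

def master_tc(reel_tc, reel_start, aux_start):
    """Map a selects-reel timecode back to the original master timecode."""
    offset = tc_to_frames(reel_tc) - tc_to_frames(reel_start)
    return frames_to_tc(tc_to_frames(aux_start) + offset)

# A take laid down at 01:00:10:00 on the reel originally started at
# 14:23:05:12 on the master tape:
print(master_tc("01:00:12:06", "01:00:10:00", "14:23:05:12"))
# 14:23:07:18
```

However the selects are reordered on the reel, the offset arithmetic stays valid, which is why the auxiliary track serves as a constant reference back to the master tapes.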

In addition to Zoic’s Zeus and Lightcraft’s Previzion and Airtrack solutions, other real-time 3-D tracking and composite packages, such as the Mo-Sys 3D Inserter and Brainstorm’s eStudio, are also streamlining greenscreen workflows. 3D Inserter has been used on the series Sanctuary (AC Nov. ’08), and eStudio has been put through its paces on the series DVD on TV, shot by Rick Pendleton. Pendleton, who is using v. 11 of eStudio, recalls that before the software became available, “I’d bring a cheap switcher with me, and I would key in a storyboard or photo we thought the background would end up looking like so I could light [the greenscreen set].”

Zeus and similar systems are a definite improvement, but Adkins warns that they shouldn’t be viewed as a panacea. “Productions tend to think shooting bluescreen or greenscreen leads to all-around savings, but a lot of prep work is involved in making it look realistic,” he says. “You can’t simply say it’s a more efficient way of shooting. If you want it to look good and feel real, you have to re-create every little nuance that might be accidental or inherent in a physical set, and if you don’t allow someone the time to do that, you’re doing the production a disservice.”