Perfection is difficult to achieve in the commercial world. Brutally short production and postproduction schedules hamper the fine-tuning of shots and visual effects, but the visual nature of the commercial — rapid-fire imagery, smash cuts, small-screen presentation — helps to conceal those flaws.

For the Djarum Mezzo cigarette spots “Race” and “Leap,” Sway Studios in Westwood, California, refused to adhere to the “good enough” commercial philosophy. The effects-laden spots, in which couples race through an idyllic architectural setting, were treated as big-screen, mini motion pictures; they were shot on 35mm, scanned at 2K, and visual-effects work was performed at full 2K resolution. The amount of rotoscoping, tracking and compositing was extensive because the world the actors were running through was an entirely computer-generated (CG), photo-realistic environment. “This is the most complex CG spot I’ve done,” says visual-effects supervisor Robert Nederhorst. “It’s also the biggest project Djarum has done, and its first CG spot.”

The agency’s original storyboards depicted a group of people running in an all-white environment, which would have been much simpler to realize. Director Joseph Kosinski’s take on the context was drastically different. “The goal was to create a setting that was both stylish and timeless,” he says. “Because the agency’s concept was so abstract, I wanted the environment to feel as authentic and tangible as possible.”

Kosinski created specific look treatments, including in-depth previsualizations using 3ds Max, to show the agency. “In our videoconference call with the agency,” recalls Nederhorst, “it was clear they were concerned about the process because they were new to it. It’s a good thing we did the previz as religiously as we did, because we had two days to shoot the entire campaign. With the motion-control rig, we knew it was going to be a challenge, so we really had to stick to our previz and shooting boards. Our script supervisor, Daughn Ward, was constantly communicating with the AD and AC to make sure we had everything we needed. Being in direct contact with those people was key to our success on set.” Kosinski adds, “When the agency arrived on the day of the shoot, they said, ‘It’s all computer generated?’ They couldn’t believe there was nothing on the stage.”

There was something on stage: greenscreen, and it was on the walls, the floor, and even the ceiling. The Sony soundstage measured 150' long, 40' deep and 25' high, large but not quite large enough, which meant the greenscreen was a bit too close to the action. The result was a significant amount of green spill on the actors, who were wearing white. Green values on parts of the spill-lit actors often matched those of the screen behind them, which caused headaches for the Primatte Keyer and a keyer written by Sway’s compositing supervisor, Marc Rienzo. “The whole theory behind my keyer is to be able to pick the background green values and then the foreground green spill,” says Rienzo. “It was written with heavy spill in mind — to be intelligent, if you will, about how to separate the foreground.” Nederhorst adds, “Once we realized the keys weren’t going to be perfect, we told Claudio Miranda, the director of photography, to just make the people look pretty and we would make the rest work.” This required a hefty amount of tracking and roto work on the actors; aiding this process were the one-light 2K scans from Pacific Title, which provided much better subject definition, particularly in hair.
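Rienzo’s description — sample the screen’s green, then suppress that same green where it contaminates the foreground — can be sketched in a few lines. This is a simplified toy, not his actual keyer; the function name, tolerance parameter and despill rule are all illustrative assumptions:

```python
import numpy as np

def spill_aware_key(rgb, bg_green, tolerance=0.25):
    """Toy spill-tolerant green key (illustrative only, not the Rienzo
    Keyer): distance from a sampled screen colour drives the matte, and
    foreground spill is suppressed separately by clamping green."""
    # Matte: 0.0 where the pixel matches the sampled screen colour,
    # rising to 1.0 (solid foreground) outside the tolerance.
    dist = np.linalg.norm(rgb - bg_green, axis=-1)
    alpha = np.clip(dist / tolerance, 0.0, 1.0)

    # Despill: limit green to the max of red and blue, so white
    # wardrobe lit by green bounce loses its tint but keeps its value.
    despilled = rgb.copy()
    limit = np.maximum(despilled[..., 0], despilled[..., 2])
    despilled[..., 1] = np.minimum(despilled[..., 1], limit)
    return alpha, despilled
```

The point of treating the matte and the despill as two separate picks is exactly the problem described above: a spill-lit white costume can share green values with the screen, so a single colour-distance threshold cannot cleanly separate them.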

“My initial idea,” says Kosinski, “was to shoot this at the Getty Center, but they don’t allow commercial shoots. The client then asked for the word ‘Mezzo’ to be embedded in the complex, so at that point we decided to do a unique design.” The intricately designed mountaintop complex was built by designers Kevin Cimini and Oliver Zeller in consultation with the director. Nederhorst says, “The idea was to do branding, but instead of placing their logo everywhere, we integrated their gold, red and white colors into the environment.”

The distant, cloud-enshrouded mountains were constructed using Terragen, a terrain- and environment-generating software written by U.K.-based Matt Fairclough. High-dynamic range (HDR) lighting also was generated in Terragen based on HDR images Nederhorst and Kosinski had taken in the Santa Monica Mountains. “We took the HDR lighting samples into 3ds Max and lit it with Chaos Group’s V-Ray [so that] all we had to do to change the lighting on the environment was essentially change out the HDR,” explains Nederhorst. “You’ll see really sharp shadows that start to fall off, just like you get in the real world.” (Actors’ shadows are also CGI because of the difficulty in pulling shadow keys.)
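The workflow Nederhorst describes — relight the whole environment just by swapping the HDR — works because image-based lighting integrates the environment map itself into the illumination. A minimal sketch of that idea (a diffuse-irradiance integral over a lat-long HDR map; this is the general technique, not V-Ray’s implementation):

```python
import numpy as np

def irradiance(env, normal):
    """Diffuse irradiance for a surface normal from a lat-long HDR
    environment map. Swapping `env` for a different HDR relights the
    surface -- the workflow described above. Illustrative sketch, not
    the V-Ray renderer's code."""
    h, w, _ = env.shape
    # Direction vector for every pixel of the lat-long map
    # (theta from the pole, phi around the horizon).
    theta = (np.arange(h) + 0.5) / h * np.pi
    phi = (np.arange(w) + 0.5) / w * 2 * np.pi
    st, ct = np.sin(theta), np.cos(theta)
    dirs = np.stack(np.broadcast_arrays(
        st[:, None] * np.cos(phi)[None, :],
        ct[:, None] * np.ones(w)[None, :],
        st[:, None] * np.sin(phi)[None, :]), axis=-1)
    # Cosine-weighted sum; sin(theta) is the solid-angle weight
    # of each lat-long pixel.
    cosine = np.clip(dirs @ normal, 0.0, None)
    weight = cosine * st[:, None]
    total = (env * weight[..., None]).sum(axis=(0, 1))
    return total * (2 * np.pi * np.pi / (h * w))
```

Because a captured sun shows up in the map as a small, extremely bright region, the hard shadows that “start to fall off, just like you get in the real world” come directly from the HDR data rather than from hand-placed lights.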

“We gave Claudio a series of rendering tests for him to use as key, fill and color-balance references so we could get a good match,” says the director. “From these images, he was able to reproduce the sun angle. He also used a cool fill light, so as we shifted the color balance around, the lighting on the actors always blended with our environment. It ended up matching perfectly, and Claudio deserves a tremendous amount of credit for that.”

After Kosinski edited together low-resolution proxies of the footage in Adobe Premiere, Nederhorst’s visual-effects team went to work. Tracking was done by hand because the encoded Kupermotion data from the mo-co rig used on set made the virtual camera in the CG environment inaccurate. “It doesn’t take into account the shake of the head,” notes Kosinski. Frame rates for some mo-co shots went as high as 150 fps for slow-motion work. Some of those were slowed down even further in post to 300 fps using the Kronos retimer plugin.
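Retiming 150 fps material to an effective 300 fps means synthesizing one new frame between every captured pair. Kronos does this with optical-flow interpolation; the sketch below shows only the timing arithmetic, using a naive linear blend as a stand-in for the flow-warped in-between frame:

```python
import numpy as np

def retime_double(frames):
    """Double the effective frame rate by inserting one synthetic frame
    between each captured pair (e.g. 150 fps -> 300 fps). The blend here
    is a naive average; a retimer like Kronos would instead warp along
    optical-flow vectors."""
    out = []
    for a, b in zip(frames[:-1], frames[1:]):
        out.append(a)
        out.append(0.5 * (a + b))  # synthetic in-between frame
    out.append(frames[-1])
    return out
```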

A sweeping camera move through the CG environment that follows a runner who hurdles a small, reflective pool of water is actually two motion-control moves stitched together. Explains Nederhorst, “The environment was built for the first part of the camera’s move, and for the second move, when the camera shift happened, the actual 3-D terrain was rotated.”

Compositing was performed with Digital Domain’s Nuke software. “Nuke is blindingly fast and allows for giant scripts and thousands of operators,” says Nederhorst. “Using EXR files, we embedded different channels of data — RGB, alpha, Z-depths and reflections, speculars, normals and so on. Nuke deals with those very elegantly. It also allowed us to build lens-aberration tools and other custom operators like the Rienzo Keyer pretty easily. Nuke is designed by compositors for compositors, and that makes it an ideal choice in a fast-paced, high-end environment.”
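The advantage of embedding render passes as extra EXR channels is that the additive passes sum back to the finished beauty render, so each one can be graded independently in the comp before recombination. A minimal sketch of that recombination plus the basic premultiplied “over” (the pass names are illustrative, not Sway’s actual channel layout):

```python
import numpy as np

def rebuild_beauty(aovs):
    """Recombine additive render passes (AOVs) into the beauty image,
    as a multichannel EXR workflow allows. Illustrative pass names."""
    return aovs["diffuse"] + aovs["specular"] + aovs["reflection"]

def comp_over(fg, alpha, bg):
    """Premultiplied 'over': the fundamental compositing operation."""
    return fg + (1.0 - alpha[..., None]) * bg
```

Because the alpha and Z-depth travel in the same file as the colour, a single EXR read feeds keying, depth effects and recombination without juggling separate image sequences.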

Nuke also was used to adjust depth of field in the CG environment. “About 95 percent of the time, we matched the live-plate depth of field,” says Nederhorst. Kosinski cites a close-up of a runner’s foot as a good example: “That’s one we matched exactly to the plate. Even the foot behind her is slightly out of focus. That shot didn’t work completely until we blurred out the foreground as well.”
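Matching the live plate’s depth of field in the CG comes down to driving a per-pixel blur from the rendered Z channel: the further a pixel sits from the focal plane, the larger its circle of confusion. A toy version of the idea (illustrative only, not Nuke’s defocus node; the `strength` parameter and the box-blur stand-in are assumptions):

```python
import numpy as np

def zdefocus(image, z, focus, strength=1.0):
    """Toy Z-depth defocus: circle of confusion grows with distance
    from the focal plane, blending the image toward a blurred copy.
    Illustrative sketch, not a production defocus."""
    coc = np.clip(np.abs(z - focus) * strength, 0.0, 1.0)
    # Cheap stand-in for a real lens blur: 3x3 box blur via rolls.
    blurred = sum(np.roll(np.roll(image, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return image * (1 - coc[..., None]) + blurred * coc[..., None]
```

The close-up Kosinski cites is the same logic applied selectively: the runner’s foot sits at the focal distance and stays sharp, while geometry in front of and behind it picks up blur from its depth offset.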

Color correction and conforming were completed in Assimilate’s Scratch, and then the completed spots were down-converted for the television market. Given that the commercial lasts only 60 seconds, many of the details Nederhorst and his visual-effects team put in each shot might be missed by the casual viewer, but their efforts certainly didn’t go unnoticed. The client referred to the Mezzo campaign as “the best thing we’ve done,” and it was the only commercial nominated for a Visual Effects Society (VES) Award for Outstanding Compositing.