
Eleven top NASCAR drivers are having a bad day, grumbling into their car radio mics. But once in the crew pit, each driver is offered a cold, refreshing bottle of Coca-Cola. Back on the track, the drivers are so exhilarated they begin singing “I’d Like to Buy the World a Coke,” as bewildered fans listen in over headphones.

The 60-second commercial, which also has two 30-second versions, premiered this last Sunday, Valentine’s Day, during the broadcast of the Daytona 500 on ESPN. It hearkens back to the 1971 commercial “Hilltop,” probably the most famous Coke commercial in history, which introduced the song. The new spot, entitled “Harmony,” features NASCAR drivers Greg Biffle, Clint Bowyer, Jeff Burton, Denny Hamlin, Kevin Harvick, Bobby Labonte, Joey Logano, Ryan Newman, David Ragan, Elliott Sadler and Tony Stewart.

See the “Harmony” spot here, at the end of a feature about the making of the commercial; the spot begins at 4:10.

The commercial does not appear to be effects-heavy, but appearances can be deceiving. It was assembled from a number of separate elements, including CG cars and digitally altered stock footage. The VFX were created by Culver City, California’s Zoic Studios, which produces effects for commercials, feature films and episodic television, such as ABC’s V, FOX’s Fringe and CBS’ CSI: Crime Scene Investigation.

“The agency went to the NASCAR archives and pulled stock footage,” says Erik Press, Zoic’s executive producer of commercials, “and they cut together what they envisioned as a race.

“Then they filled it in with close-ups of the actual drivers, which were shot on the racetrack in Charlotte, North Carolina. Those were inserted in the edit. [Commercial creative director] Les Ekker shot back plates for footage outside of the vehicles. Our task was to take stock footage, interiors of drivers, and plates of driving shots, and mix them all together and make them appear as one entire race.”

“Mostly the work consisted of taking their ‘hero’ celebrity drivers, and generating driving plates,” explains Neil Ingram, a Zoic producer.

“They wanted us to make these moments inside the car feel like ‘found’ footage, like you’re tapping into the live feed while they’re driving. Part of a NASCAR race is that you can rent headphones and listen to the realtime exchanges of the drivers and the crews. The spectators we cut away to are listening to the radios, and they’re bewildered by the fact that these drivers are all singing together.

“First we had to make the interior driving spots look realistic. Then we had to work on a degradation look, to make the shots match the practical realtime images that are actually from the cars; there are some of those shots in the spot.

“We had some CG augmentation on shots, and then ran it through compression. The cameras they use in the cars are ICONIX — they shoot back realtime images to a broadcast tower. They’re true HD cameras, but they get compressed with MPEG-2 compression. So we did some experimentation with different levels of MPEG and JPEG damage, to match the look. But these are celebrity drivers and these are product shots, so we had to find a balance between not getting too much degradation, but making them still feel ‘found.’”
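Press’s balancing act between readable product shots and a ‘found’ look can be illustrated with a toy sketch. This is purely hypothetical, not Zoic’s pipeline: it fakes blocky, quantized compression on an 8-bit luma frame, with the step size standing in for the level of MPEG/JPEG damage.

```python
# Hypothetical sketch: approximating broadcast-style compression damage by
# averaging and quantizing 8x8 tiles of an 8-bit luma frame. The block size
# and step values are illustrative stand-ins for "levels of damage."

def quantize(value, step):
    """Snap an 8-bit luma value to the nearest multiple of `step`."""
    return min(255, round(value / step) * step)

def degrade(frame, block=8, step=32):
    """Average each block x block tile, then quantize it; the coarser the
    step, the blockier and more 'found-footage' the result looks."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = [frame[yy][xx]
                    for yy in range(y, min(y + block, h))
                    for xx in range(x, min(x + block, w))]
            v = quantize(sum(tile) / len(tile), step)
            for yy in range(y, min(y + block, h)):
                for xx in range(x, min(x + block, w)):
                    out[yy][xx] = v
    return out

# A gentle step keeps the celebrity close-ups readable; a coarse one
# pushes the image toward the live in-car feed look.
frame = [[i % 256 for i in range(16)] for _ in range(16)]
subtle = degrade(frame, step=8)
heavy = degrade(frame, step=64)
```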

“It was a fun job,” says Zoic co-founder Chris Jones, who was creative director for the VFX. “It has all the good elements for a visual effects spot: full-CG cars; full-CG dynamics; full-CG tracks; a lot of clean-up and footage matching; a lot of greenscreen; live-action plates; stock footage integration – it runs the whole range of VFX. It came together well – it’s a really satisfying piece. I’m pleased with it.”

Press says the production was a very positive experience for everyone involved. “It is really sort of an iconic Coca-Cola spot, with ‘I’d Like to Buy the World a Coke.’ They haven’t brought that theme back for some time.

“It was a really smooth production, it went really well. The agency was very happy. It was smooth for them as well — we were always right behind them, providing for them. A really positive experience.”

The spot was directed by Mike Long for Epoch Films; and edited by Matthew Hilbert of Joint Editorial House, Portland.

Mad Men, AMC’s award-winning drama, finished its third season in November, and has been renewed for a fourth. Set in the 1960s at the fictional Sterling Cooper Draper Pryce advertising agency on Madison Avenue in New York City, Mad Men centers on creative director Don Draper (Jon Hamm, The Day the Earth Stood Still), and those in his life in and out of the office; and depicts the changing social mores of 1960s America.

Mad Men has garnered critical acclaim, particularly for its historical authenticity and visual style, and has won nine Emmys and three Golden Globes. It is the first basic cable series to win the Emmy Award for Outstanding Drama Series.

Zoic Studios provided visual effects for a number of shots in the third season, including a memorable dream sequence, plus a variety of so-called “invisible” effects: VFX that the audience is usually (and ideally) unaware are VFX at all.

Zoic visual effects supervisor Curt Miller says most of Zoic’s work on Mad Men enhanced or augmented the efforts of production designer Dan Bishop (Big Love). Executive producer Scott Hornbacher (The Sopranos) and creator Matt Weiner (The Sopranos, Andy Richter Controls the Universe) are committed to staying true to the 1960s period, right down to minor background details. The level of detail is “amazing,” Miller says, and Zoic is “honored and flattered that they trust us to be a part of their team.”

Visual effects producer Christopher M. Wright agrees that authenticity and detail are vital to Weiner and Hornbacher’s vision. “It is nice to work with a client that’s very particular about their level of detail and their level of quality,” Wright says. “It certainly pushes us to make sure things are right. I have never worked with anyone quite as committed to staying true to the art direction of the time as they are.”

Still taken from Mad Men provided through the courtesy of Lionsgate.

For the third season premiere, Zoic performed a set extension for a scene in which Don Draper (Hamm) and Salvatore Romano (Bryan Batt, Funny People) take a business trip on a Boeing 707 jetliner to Baltimore. The production built a portion of the airplane interior, which had to be duplicated and extended to recreate the complete interior of the passenger cabin.

Zoic artists visited the only vintage Boeing 707 within driving distance – the former Air Force One on display at the Ronald Reagan Presidential Library and Center for Public Affairs in Simi Valley, California. The Library does not normally allow photographs to be taken inside the plane, but the production obtained special permission to take reference photos one morning before the public was admitted.

A set piece making up the left half (facing the cockpit) of the plane interior, four rows deep, was built – this was shot from a variety of angles, with extras in period costume filling the seats. Then the set piece was flipped around and shot from the other direction, to become the right side of the plane. These elements were stacked one behind the other to create the complete jetliner interior. The main action between the two leads took place on the practical set, while the rest was assembled, composited and rendered digitally.

After the footage was shot, the production discovered that the carpeting on the set was inaccurate for the period, and Zoic fixed the problem digitally. The upholstery, wallpaper, and every other interior feature had to be recreated and rendered faithfully. Mad Men art director Chris Brown and producer Blake McCormick conducted research to guarantee authenticity. Zoic’s Renaud Talon did much of the work on the sequence.

Still taken from Mad Men provided through the courtesy of Lionsgate.

For another scene with effects produced by Zoic, a train ride through New York in the fall was created. The attention to detail was meticulous, with digital recreations of passing scenery true to the location, the period, and the season. Like all other work done for the show, the scene had to match Mad Men’s justifiably famous visual style. Zoic’s Suzette Barnett worked on the composites.

In a well-known scene, Zoic’s work was not at all invisible. When Don’s wife Betty Draper (January Jones, Pirate Radio) is knocked out with anesthetic during childbirth, she experiences a surreal hallucination.

Still taken from Mad Men provided through the courtesy of Lionsgate.

Jones was shot against a bluescreen, rather than a greenscreen, because it was easier to pull a key off her blonde hair against blue. She walked on a treadmill on the stage, with the intention that she would be composited against a moving background. The background plates were shot at high speed so they could be slowed down to match her walking pace, but matching the two proved difficult. It was decided to keep her movement slightly off-pace from the background, as this contributed nicely to the dreamlike quality of the scene.
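The overcranking trick here, shooting plates fast so they can be conformed slower in post, boils down to a frame mapping. A minimal sketch, assuming nearest-neighbour resampling (a real conform would blend frames or use optical flow):

```python
def retime(frames, speed):
    """Resample a plate at a new speed.

    speed < 1.0 slows the plate down (source frames repeat); speed > 1.0
    speeds it up (source frames are skipped). Nearest-neighbour only; a
    real conform would blend or optical-flow between frames.
    """
    out, t = [], 0.0
    while int(t) < len(frames):
        out.append(frames[int(t)])
        t += speed
    return out

# A plate shot at 48 fps and played at 24 fps is a 0.5x conform; nudging
# the speed value lets you chase an actor's walking pace shot by shot.
half_speed = retime(list(range(10)), 0.5)    # each frame held twice
double_speed = retime(list(range(10)), 2.0)  # every other frame dropped
```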

“The only tricky part,” Wright says, “was that when she stopped walking, the treadmill still drifted a little. So we had to sort of match up our background to that movement, because in the camera it still looked like she was moving even though the background didn’t move. It looked like she just floated towards us, which was a little over-the-top for what they were going for.”

After Jones stops, a caterpillar enters the frame from above, descending on a thread of silk. She catches the caterpillar and watches it wriggle on her hand. The caterpillar was an original creation, built entirely in CG by Zoic. Zoic’s Dayna Mauer and Rodrigo Dorsch contributed to the scene.

“It’s a great show to be working on,” Wright says. “It’s high-end stuff, it’s award-winning. Clearly, they are very particular and know exactly what they want. It can be challenging, because with television, there’s a lot of ‘it’s good enough,’ when we get through shots — but with Mad Men it needs to be right.”

To the opening riffs of Metallica’s “Master of Puppets,” two NASCAR drivers jostle for position at the front of the pack. One cuts off the other by the wall, and the rear car speeds up, smashing into the front car. As the front car drifts from the wall, the rear car makes its move, attempting an aggressive pass on the right. But it’s no good – he sideswipes the front car and spins out. He’s slammed by another car and flips high into the air, triggering a massive pile-up. And straight through the smoke and chaos of the pileup – a third driver makes his move and takes the lead. “It’s anybody’s race.”

The 30-second spot for ESPN (see it here), promoting the NASCAR Nationwide series, was created by advertising agency Wieden+Kennedy New York and Culver City, California’s Zoic Studios. The commercial is significant because, despite its unique and stylized black-and-white look, it appears to have been shot in live action. In fact, it’s entirely CG.

Zoic co-founder Loni Peristere, who directed the spot, talks about why the commercial was created digitally, and how Zoic was able to create the illusion of perfect realism.

“The question from Wieden+Kennedy was, ‘we have a project, two scripts, which take place on the track, and would require significant action and stunt work. We’re trying to decide whether we should approach this from a live-action standpoint, or should we approach this from an animation standpoint.’”

Wieden+Kennedy insisted the final product be photo-realistic; the agency did not want a commercial that looked like a video game.

But Wieden+Kennedy was insistent that the final product must appear perfectly photo-realistic. Peristere says the agency did not want a commercial that looked like a video game. “It was really important to them that it had the energy, grit and testosterone of the track. They were not interested in making a spot that didn’t have the reality of NASCAR.”

The agency was well aware how far CG realism has recently progressed. “Even in the last 12 months it has come a long way,” Peristere says. “With the advent of motion pictures like Avatar or The Curious Case of Benjamin Button, we are seeing the potential for photo-real characters, photo-real environments, and photo-real action. But could we actually achieve that for a commercial, and could we afford it? What would the timeline be?

“We got boards for both spots, and it became readily apparent why they were even asking this question – they had a 40-car pileup in the middle of the first spot, and a pretty significant crash in the second. Now when you looked at the second spot, you thought ‘well, from a production standpoint you could probably pull that off’; in fact we’d done something similar for Budweiser the year before. But the 40-car pileup featured just an enormous amount of damage to an enormous number of vehicles, which from a production standpoint would be very expensive.

“And the ability to control the lighting and the camera and the art direction would be limited in a live action production. You would be fighting against the sun, making you rush through the shots, allowing you limited control over your color palette. And you would have the expense of wrecking an enormous number of vehicles.”

Peristere discussed the project with other principals at Zoic – fellow co-founder Chris Jones, commercial creative director Leslie Ekker, commercial executive producer Erik Press, and CG supervisor Andy Wilkoff. “We thought it would be fun to rise to the challenge,” Peristere says. “We knew the team we had been building over the last several years had the potential to do incredible photo-realistic work. We’d seen large leaps in the realm of photo-real characters. We came back to Wieden+Kennedy and said ‘yes, yes we can.’”

Deciding to do the spot in CG led to the first question – should the drivers’ faces be represented in the spot? Human characters are the most difficult thing to create realistically in CG. “From a directorial standpoint,” Peristere says, “I felt it was absolutely essential to see the drivers, to understand who they were, and to know what their motivations were so we had a personal connection to the race. I had the ever-present voice of [Buffy the Vampire Slayer and Firefly series creator] Joss Whedon in my head, who says ‘it’s all about the story; it’s all about the people.’

“We enlisted the help of some incredibly talented artists, including Brad Hayes, Brian White, and Michael Cliett.” Hayes and White had worked at Digital Domain on Benjamin Button and more recently on Tron Legacy, and had been a part of the development of a character-based VFX pipeline.

The technique used for “Dominoes” involved projecting the actual NASCAR drivers’ faces onto CG characters, allowing Peristere complete control over movement and lighting while still getting full, photo-realistic facial performances.

“Andy [Wilkoff] and I went to the very last race at Daytona, and after race day we met with the eight stars of our two commercials. We ran them through some technical setups, which involved a three-camera shoot against a greenscreen. I directed them through a series of emotions and actions that related to the story we were telling. We then took those performances back to Zoic, made editorial selects based on those performances, and gave them to Brad and Andy and the smart people to make something cool with.”

Dmitri Gueer, founder and senior editor of Zoic Editorial, was involved in the “Dominoes” spot from the pre-viz stage through the final product. He describes the editorial process as “non-stop,” and uses the facial performances as an example of Editorial’s involvement at each step.

“The pre-viz had the drivers, but we didn’t see their faces,” Gueer explains. “So the drivers were just a placeholder in the cut. When we later got the driver plates, we started picking the selects and placing them in the cut. Since the pre-viz already existed, you needed to find takes that worked for the placeholders.

“When you have the drivers’ faces mapped in the shots, it becomes apparent when we need to give them a little bit more time, or take a little time from them, because something’s not working out; and once you have a set of almost-final shots, the edit takes on a different spin. You need to pick the sweetest spots in the shots; you need to reestablish the pacing; you need to make sure there’s continuity from shot to shot; and that the edit comes together not just as a story, but also that it gels with the music and is captivating to watch.”

“We had the added complexity of a 40-car pileup,” Peristere says, “which involved extensive damage to CG vehicles, but which had to happen organically. That was hand-developed and designed by Brian White, another Digital Domain veteran with an intimate knowledge of physics and kinetics, who was able to use both animation-by-hand and procedural techniques to bring these cars into collision. You’ll see that every vehicle reacts and behaves just as a real car would as it impacts. When we have our big moment where we t-bone the hero car, you actually see it break where it should break, and that’s because Brian White made it so.”

I was looking to evoke the German Expressionist period, so I wanted these incredibly long shadows, with crushed blacks.

The spot also required an enormous smoke simulation. “Whenever these cars spin they generate tons of smoke. We worked closely with Zoic Vancouver, and a number of technical directors in that office who specialize in smoke; they did the phenomenal nuclear explosion scene in the forthcoming movie The Crazies, and they adapted much of that pipeline for this spot, using Maya fluid dynamics along with some techniques in RealFlow 4, to generate authentic smoke elements that give the illusion and sense of a full-scale car accident on a NASCAR track.

“Kevin Struckman, Mike Rhone, and Trevor Adams all put in an incredible number of hours to make these smoke simulations spectacular, concluding with the hero car penetrating the giant smoke cloud, creating those beautiful little vortices that you see. That’s something that’s pretty tricky in a fluid simulation, and they were able to do a really nice job with that.”

In order for the spot to come together organically, there was an immense amount of compositing. “We brought in real smoke, spark, and pyro elements to underline the CG elements. Also, every single one of the 27 shots in this 30-second spot had hundreds of passes: lighting, reflections, highlights, lens flares, vignettes, grain, all of which had to be added as secondary layers.”
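Folding that many passes together is, at its core, repeated per-pixel math. The sketch below is illustrative only (the pass names and blend operations are assumptions, not the actual comp script): secondary layers are applied over a beauty render with standard screen and multiply blends.

```python
# Hypothetical sketch of pass layering: each secondary pass (reflections,
# flares, grain, vignettes...) is folded over the beauty render with a
# blend op. Pixel values are floats in 0..1; images are 2-D lists.

def screen(a, b):
    """Screen blend: brightens, never exceeds 1.0."""
    return 1.0 - (1.0 - a) * (1.0 - b)

def multiply(a, b):
    """Multiply blend: darkens, used for vignettes and shadows."""
    return a * b

def composite(beauty, passes):
    """Apply (op, layer) pairs in order over the beauty render."""
    out = beauty
    for op, layer in passes:
        out = [[op(p, q) for p, q in zip(row_out, row_layer)]
               for row_out, row_layer in zip(out, layer)]
    return out

beauty = [[0.5, 0.5]]
flares = [[0.5, 0.0]]    # a lens flare hitting only the left pixel
vignette = [[1.0, 0.5]]  # darken the right edge

result = composite(beauty, [(screen, flares), (multiply, vignette)])
```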

The spot was rendered in full color, but the end product was always intended to be in a highly-stylized black-and-white. “That was a choice we made with Wieden+Kennedy, to create a style, a more graphic look. For me it was heading towards the films Alfred Hitchcock made in the 40s and 50s, and looking back even further to F.W. Murnau and Sunrise, and Fritz Lang and Metropolis. I was looking to evoke the German Expressionist period, so I wanted these incredibly long shadows, with crushed blacks. You’ll see a low sun – I call that the Ridley Scott sun, because Ridley Scott shoots at the magic hour all the time, and we wanted to put that in every shot. You’ll see these incredibly long film-noir shadows with bright brights, and black blacks.

“Then we wanted to include the branding of Nationwide; so we applied the Nationwide presence as a design element. We had an illustrator, Eytan Zana, who did a phenomenal job setting the tone and palette.” Zana worked with Wieden+Kennedy, and with Derich Wittliff and Darrin Isono of Zoic’s design department, applying the Nationwide Pantone color to the stickers, the cars, and the track.

Peristere says, “I think overall, this black, white and blue we put together in the compositing really lends an original look to this spot that’s unlike anything we’ve seen before.”

Zoic VFX supervisor Steve Meyer handled the final finish, color grading and color treatment. “We wanted to have sort of a Raging Bull kind of look, high contrast black-and-white. So the compositors left things a little bit more on the flat side to give range; and then I took that, got the style Loni [Peristere] was looking for, and added some of those little nuances like the road rumble, the extra shake when something flies by camera, that kind of overall stuff.

“It’s a stylized look that you could attribute to real photography. I’ve been in the business for a bit, and it blows me away when I see it. Wow, that’s frickin’ all CG? It’s a very impressive spot. I was glad to be a part of it, because I think it’s going to have some legs.”

In the end, it was up to editor Gueer to assemble the finished shots into the final product. “It was a non-stop editorial process, from the beginning when Loni was assembling the story, to the time when we had all the final shots on the Flame. One of the things Steve [Meyer] did was add camera shakes to the shots, which made them look much better; but it changes the nature of what you’re seeing, even the slightest shake. You go, ‘well, wouldn’t it be better if we cut a few frames from this, or extended it by a few frames?’ When we had the final shots on the Flame, we literally did editorial on the Flame, making it better and better and tighter and tighter.”

“With this giant team of 40 some-odd people who worked on this spot, it’s certainly one of Zoic’s finest hours,” Peristere says, “and we’re incredibly proud to have put it together.”

People look at this spot and say “where did you guys shoot this?” Well, we didn’t shoot it!

Press is thankful to Wieden+Kennedy for trusting Zoic with the production of such an innovative and risk-taking spot. “They had faith in us and patience with us, and that was really great, because it really took that to produce this spot. It was a great experience on both sides. They gave us a lot of creative freedom, to really bring out the best in us. We pushed ourselves really hard on the level of realism and the level of detail.

“I mean this kind of work, this animation, the quality level, is something very new for broadcast,” he says. “The extent to which we have gone to produce this spot in a visual style, in CG animation, has really never been done before. It’s a full 100% photo-real CG spot.

“NASCAR is very concerned about representing their world accurately, which was a big challenge for all of us, both from an agency side and a production side. Down to the decals on the cars, and the physics of the accidents, what would really get damaged and what wouldn’t, where would skid marks be made on the track… So people look at this spot and say ‘where did you guys shoot this?’ Well, we didn’t shoot it!

“The music was Metallica – my understanding is they’ve never licensed their music for broadcast commercials before. That was exciting from the get go — definitely a driving force creatively, no pun intended, the kind of energy that brings to the spot.”

Press says the spot has exceeded everyone’s expectations. “We’ve seen that response all the way around, from the agency, from our colleagues in the advertising world, and from ourselves as well – it’s really some of our best work. We’ve really set the bar anew; there’s a new target for us now, which is fantastic.”

In the midst of a vast Midwestern corn field, a friendly yellow industrial robot is on the hunt. Searching between rows of tall, green stalks of healthy corn, the robot discovers its prey, a single weed — tiny and innocent, but if it spreads the entire crop is in danger. The robot strikes, ripping the offending plant from the ground with its steel fingers. The corn is safe once again.

There aren’t really industrial robots prowling the cornfields of America. This is a 30-second commercial spot for Halex GT, a weed-control herbicide produced for corn farmers by Switzerland’s Syngenta AG. The number of businesses that might use Halex is relatively small, compared to most commercial brands – but it’s a lucrative product, and Minneapolis-based creative agency Martin|Williams was tasked with reaching those consumers through a television spot and Internet advertising.

Culver City, California’s Zoic Studios created the spot, directed by co-founder Loni Peristere. But there’s more to the story. Zoic was able to use the original assets it created for the broadcast commercial to create web ads and interactive landing page components, providing the client with Internet content that was much higher in quality than that usually created for online, and at a considerable cost savings.

Zoic commercial creative director Leslie Ekker explains that from the outset the studio pitched the idea of a holistic approach: including the creation of interactive assets as part of the broadcast VFX pipeline. “It’s more and more the case lately when we’re doing commercials, we ask during the bidding, ‘are you interested in an online dimension to this work?’ And the word gets around the agency, and they realize, yes, we need to get these resources from the spot; we can build on this work, and expand on it without very much extra effort and expenditure.”

Creating the Commercial Spot

The Halex commercial (see it here) came to Zoic on a short schedule and with a tight budget. “This job was awarded on a Wednesday,” Ekker says, “and we shot the following Monday, in Florida — after the production company found a location; the agency determined which robot they wanted to use; we sourced and acquired a robotic end effector; designed and machined the actual fingers; and worked out a way to puppeteer it live on-screen for the shoot. All of this in a very few days.

“In fact, regarding the end effector, we acquired the machine Sunday morning, and over breakfast I designed the fingers. During the day I supervised the machining of the fingers at a custom machine shop, while simultaneously running out with the live-action producer and getting a compressor and the air hardware, tools and supplies necessary to create all the physical effects. By 8:30 that evening we had a set of fingers for the machine, fully motivated and ready to go in the morning.

“It’s seldom that we do things with practical effects, but because of my background – I was a model maker for 20 years — it was not very challenging. The schedule is what was challenging. And that end effector is now being used in trade shows by the client, attached to an actual robot, performing weed-pulling demonstrations, live at their promotional booth at agricultural shows.”

The robot was this character, an iconic image they wanted to carry through all of the Halex branding.

Despite the fantasy aspect of the commercial, the spot required a high degree of technical accuracy, as far as the depiction of the product. “We learned a lot about farming on this job,” Ekker says. “The reason we went to Florida was we needed to show a certain height of corn, because this chemical is used on plants of a certain age. Also the fields there are very neat, very clean.”

The commercial had to be very accurate in its depiction of the cornfield, the plants themselves and how they grew, because the farmers to whom the spot was targeted would notice any inaccuracies. “Apart from those limitations,” Ekker says, “the client was wide open to creative suggestions. In fact Loni [Peristere], the director, had pretty much free rein with the storytelling.”

The practical effects in the spot are the end effector and several attached hoses, and the actual weed that is grabbed by the end effector. Ekker acted as puppeteer for the practical effect, operating the end effector from the end of a pipe with counterweights attached to a pulley. “They changed the species of weed after we shot it,” Ekker admits, “but it passes well enough.” Everything else in the spot – the yellow robots, the cornfield, the weed as it grows — is CG.

One of the creative challenges involved digitally reproducing a time-lapse effect, showing the CG corn moving in the breeze as the weather changed and the sun moved through the sky. “We developed some very effective ways to show the translucency of leaves,” Ekker explains, “since we’re seeing them primarily back-lit; and to show the kind of animation that people expect to see from time-lapse plant growth — that kind of nervous, random weaving action.

“The background plate was supposed to be time-lapse, but it was at a very specific angle. Rather than dedicate a digital video camera to this one shot all day, I took our digital still camera, with an intervalometer, and set it up in a 5-gallon bucket buried in a corn field adjacent to where we were shooting. I lined up a shot with a very wide-angle lens pointed up at the sky at an angle.

“I framed it in such a way that we could take those high-resolution frames, and move another frame inside of it with some added distortion to give it the look of a camera pan-and-tilt, so that we could have a feeling of craning down and tilting up as this weed grows in the foreground. The move was created in that larger plate, adding a certain amount of keystoning for lens distortion, and it felt very much like a 3D camera move in time lapse, which would have to be motion-controlled in a normal situation. Luckily, because it was such a macro shot, we could do it with a single frame and a single camera position.

“That proved to be quite successful; we got several hours of time-lapse out of the way, with very low impact on the production. I would just go out and occasionally monitor the camera, change the battery, and make sure everything was okay.”
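The fake pan-and-tilt Ekker describes, moving a smaller frame inside the oversized still, can be sketched as an eased crop-window animation. The numbers below are hypothetical, and the keystone distortion he mentions is omitted for brevity:

```python
def fake_camera_move(plate_w, plate_h, crop_w, crop_h, n_frames):
    """Yield (x, y) crop origins that slide a smaller frame across an
    oversized plate, eased with smoothstep so the move feels operated
    rather than mechanical. Keystone/lens distortion would be layered
    on top in the real shot."""
    for i in range(n_frames):
        t = i / (n_frames - 1)
        e = t * t * (3.0 - 2.0 * t)              # smoothstep ease in/out
        x = round((plate_w - crop_w) * e)        # pan right
        y = round((plate_h - crop_h) * (1 - e))  # tilt up (craning feel)
        yield x, y

# e.g. a 1920x1080 window drifting through a 4000x3000 still plate
move = list(fake_camera_move(4000, 3000, 1920, 1080, 5))
```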

The practical end effector designed by Les Ekker.

Creating the Interactive Experience

While Ekker and his team were shooting the spot, and designing and rendering the CG, Jeff Suhy, Zoic’s creative director of digital strategy, and his group coordinated the web banner and landing page campaigns in support of the Halex marketing effort.

“Martin|Williams came to us to build on the development of the 30-second spot,” Suhy explains, “which involved creation of the online assets. We worked in partnership with Martin|Williams in creating some particularly interesting banners, and modeled the robot for those banners; and we created the landing page, an educational experience which conveyed the attributes of the Halex herbicide, how it’s beneficial and its advantages over the competitors.

“The robot was this character, an iconic image they wanted to carry through all of the Halex branding. We animated the robot doing various things — pulling weeds, knocking a tractor off the screen, and other things.

We’re not just envisioning effects… we’re talking about designing the architecture of a fully integrative experience.

“The pipeline here at Zoic is pretty good for this sort of thing, so there weren’t any real technical issues. Les [Ekker] and his team designed the actions, and we on the interactive side designed the experiential elements around that, and how they interacted with the navigation. It’s a pretty seamless experience and I think it worked out pretty well for the client.

“It was cost effective, because we already had the assets; we already had 90% of the heavy lifting done, to get those assets ready for the web.”

Ekker was impressed with the final products produced by Zoic’s interactive team. “We did these little mini-cuts of the spot, in frames that were 75×300 pixels, tall narrow slices of the image. We would just use the essential shots to tell the broad story, and do some close moves within those frames on the greater-sized hi-def shot frame; and we wound up with some very artistic, very effective little story moments that require very narrow bandwidth, so they’re easy to stream online. It proved to be a really clean, elegant way to reuse existing assets.

“We adapted those animations for the landing page, and created some very interesting little interactive demos, with mouse-overs, triggers and hold cycles at the end, so the robot wouldn’t just sit there idly. It would sort of look around and wait for what’s next. And we managed to get a lot of personality into the animation. It was a lot of fun. A very quick, very efficient project.”
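The tall banner crops Ekker describes amount to simple window arithmetic: take a 75×300 slice and move it across the full HD frame over time. A minimal sketch in Python (the pan path and function name are illustrative, not Zoic's actual tooling):

```python
def pan_crops(frame_w, frame_h, crop_w, crop_h, steps):
    """Generate (left, top, right, bottom) crop boxes that pan a narrow
    banner-sized window horizontally across a larger source frame."""
    assert crop_w <= frame_w and crop_h <= frame_h
    top = (frame_h - crop_h) // 2          # keep the slice vertically centered
    max_left = frame_w - crop_w
    boxes = []
    for i in range(steps):
        left = round(i * max_left / (steps - 1)) if steps > 1 else 0
        boxes.append((left, top, left + crop_w, top + crop_h))
    return boxes

# A 75x300 banner window panning across a 1920x1080 HD frame in 5 steps.
boxes = pan_crops(1920, 1080, 75, 300, 5)
```

Each box can then be applied per frame to the hi-def master to produce the "close moves" within the narrow banner.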

Suhy says this kind of holistic marketing effort provides more than mere convenience for the client. “The techniques used to develop this character and to animate this asset would normally have been prohibitively expensive for such a niche marketing campaign. If it were not for the efficiencies of Zoic’s pipeline, this would be reserved only for large budget, big campaigns that could afford to invest the money.

“The real message here is that, even for something as niche as Halex, we can do something that’s really high-end CG.”

Holistic Marketing and the Future

Erik Press, Zoic executive producer, commercials, believes this kind of holistic marketing is the next step in the evolution of advertising. “It’s not just about broadcast anymore. Fewer and fewer eyes are remaining on what we all have known as standard broadcast television, and now they’re moving to the Internet, and that’s what the future is. Part of the conversation at the front of any job is, what are the plans for integrated content? Clients have been really warming to that.”

As Zoic has expanded from its roots as a VFX house, with its own editorial, design and interactive departments, it has been able to offer services that are more encompassing and can meet a wider variety of client needs. “I think people are waking up to the understanding, as we put out who we are at Zoic, that we are problem solvers and educators because of the depth of our resources. There’s a little spark going off in people’s minds now, and Halex was a great example. There was an ‘aha!’ moment for them, where they said ‘oh, you guys can do that?’

“We want to look at projects strategically. There’s a financial advantage to approaching projects at the outset, knowing the different kinds of media platforms we’ll be creating assets for. It’s a new paradigm in commercial production. We’re not just envisioning effects for a 30-second spot, it’s much bigger than that. We’re talking about designing the architecture of a fully integrative experience. That’s new advertising at its core – the experience.

“I think for us as a company, our goal is to be at the leading edge of that kind of creativity and technology. Zoic is poised so well to have a great comprehensive, strategic view of what it’s going to take to get there.”

This is a moment of unparalleled change in the media world, part of a process of barely-controlled destruction and reconstruction that began over a decade ago. Business models and revenue streams are collapsing, and media creators are turning to the latest technologies to create new opportunities and new businesses. At this year’s Consumer Electronics Show in Las Vegas, technology firms touted a slate of new 3D TVs as a solution to video piracy, and a way to lure fickle consumers back away from free Internet content. But are such promises tenable?

It all started in the music industry, when Napster, the original digital music sharing service, was launched in June of 1999. With music freed from the baryonic prison of vinyl, polyester tape and polycarbonate plastic, consumers could copy, edit, sample, decode and redistribute it and other copyrighted content at will.

Rights holders had always controlled their intangible product by controlling the tangible media – records, cassette tapes and CDs, as well as radio frequencies, for music; television channels and chunky videotapes for video; multiple 40-pound reels of motion picture film for movies; floppy disks and CDs for software; plus dead-tree books and photographs. Suddenly, their control of intellectual property was just gone, vaporized in a mist of ones and zeroes. On one side, many music executives saw digital media as a tremendous new opportunity for both creative expression and for business. Zoic’s Jeff Suhy, a former record company executive, was quoted in the May 2000 Village Voice: “I love that the world is quite obviously changing before our eyes and no one really knows how it’s going to play out!”

Suddenly, control of intellectual property was gone, vaporized in a mist of ones and zeroes.

On the other hand, some rights holders saw any perceived change to their traditional revenue stream as a threat to be destroyed at all costs. They dug in their heels and fought the future – engendering numerous disasters, from Circuit City’s Digital Video Express, which sold consumers DVD movies that “expired” after two days, to the RIAA’s litigious pogrom against file-sharing college kids and soccer moms. And money spent to develop various copy-protection and DRM schemes was almost always wasted, as consumers found ways to defeat protection, or avoided protected products altogether.

But some in the business world saw opportunities, not enemies. When Steve Jobs first laid eyes on the Xerox Alto in the late 1970s, with its graphical user interface and mouse, he saw the future of computing. Decades later, Jobs understood that the original Napster, driven out of business by the record companies, was the template for media distribution in the new millennium. With Apple’s iTunes software and online store, Jobs went from computer mogul to media mogul, taking advantage of record companies’ desperation to gain control of digital music, and claiming for himself the power to single-handedly set prices for online entertainment. But iTunes by itself would not have been enough to compete with free MP3s – it was the convenience, portability, style, incredible ease of use, sound quality and price point of the iPod that gave Apple control first of the personal music player market, and then of legitimate online music and video distribution.

Now the media industry has reached another watershed moment of change, as file-sharing endangers the revenue models of film and television creators, as well as publishers and journalists. But media moguls have absorbed the lessons of the music industry’s tribulations in the last decade, and there is a new humility in the face of change — a willingness, even an eagerness, to adapt to the new digital world, rather than to deny it. In the last few years, movie and television creators have moved their product online, to free video sites like Hulu, which will soon experiment with for-pay models; and are offering high-definition, appointment-free content on demand to home televisions through cable companies and Netflix.

There is a new humility in the face of change — an eagerness to adapt to the new digital world.

In 2010, how else are media producers taking control of the future of their own industry? What are they doing to reimagine their businesses, and ensure that the media world of 2020 is profitable and stable?

Some of the answers were on display at this year’s Consumer Electronics Show in Las Vegas. Publishers are betting that consumers will gladly pay to read their content on a new breed of flat, portable, easy-to-read e-book products. Just as the iPod saved music, publishers hope that Amazon’s Kindle and Barnes & Noble’s Nook will save literature and journalism, at least until true e-paper is developed.

The greatest buzz at CES was elicited by a whole crop of new HDTVs with 3D capabilities. The motion picture industry and the movie theater chains are increasingly turning to 3D and IMAX as ways to lure audiences into theaters, and the current success of James Cameron’s Avatar demonstrates that even in a serious global recession, moviegoers are willing to pay extra for a high-tech movie experience they can’t get at home.

The new 3D TVs, including the Panasonic TC-PVT25 series that won the Best of CES award this year, promise to provide an in-home 3D experience for only a few hundred dollars more than ordinary HDTVs. In addition, satellite television provider DirecTV announced at CES that it has teamed with Panasonic to create three HD 3D channels, to launch this spring. Working with media partners including NBC Universal and Fox Sports, DirecTV will offer a pay-per-view channel, an on-demand channel, and a free sampler channel, all in 24-hour 3D and compatible with the current generation of sets.

Like the original HD offerings in the mid-1990s, which focused on sports events and video from space missions, the new 3D channels will offer existing 3D movies, 3D upgrades of traditional 2D movies, and sports. Unlike with HDTV, however, there is no indication the government will legislate widespread adoption of 3D TV. And there are issues.

Where 3D will likely establish its foothold in the living room is not with sports or movies, but with video games.

The greatest usability issue is the need for viewers to wear glasses. While there are experimental technologies that work without glasses, today if you want to experience high-quality 3D television images you need to wear pricey shutter glasses. Unlike the polarized glasses patrons wear at theaters, shutter glasses respond to signals from the TV, directing alternating frames to alternating eyes. The glasses are expensive – only Panasonic is promising to provide a pair with your TV purchase, and additional pairs will run around $50. At least one manufacturer is already offering lighter, more fashionable, more expensive replacement glasses.

And wearing special glasses while watching TV at home is not conducive to the average person’s lifestyle. As Microsoft exec Aaron Greenberg told GameSpy at CES, “when I play games or watch TV, I’ve got my phone, I’ve got all kinds of things going on… I get up, I get down, I’m looking outside at the weather… I’m not in a dark theater, wearing glasses, staring at a screen.” You cannot walk around comfortably wearing modern shutter glasses, and just happen to be wearing them when you want to watch TV. Until 3D TVs don’t require glasses, consumers are going to have trouble integrating 3D television watching into their lives.

The new 3D TVs also suffer from varying levels of picture clarity and a pronounced flicker, although these issues are expected to disappear as the technology improves. More importantly, 3D media demand changes in how movies and television are produced. Right now, only computer animated films are expressly produced with the needs of 3D in mind, producing stunningly realistic depth-of-field and fine gradations of perceived depth. Film and video produced according to the traditional rules of 2D creates flat, paper-thin figures moving in a 3D environment that can appear shallow or truncated. Sports coverage, intended to be a killer app for 3D TV, particularly suffers from these issues, and 3D broadcasts of sporting events may require drastic changes to the technology used on the field.

Filmmakers are still learning how to deal with changing depth of focus. In the real world, the viewer chooses unconsciously where to focus their eyes; but in a 3D production this decision is made for the viewer. A plane of focus that appears to constantly shift can give audiences headaches and eye strain. A largely different language of cinema is being developed, to produce content in which 3D is a core component rather than a faddish trinket.

And finally, CNN Tech reports that between four and 10 percent of consumers suffer from something called “stereo blindness,” a sometimes treatable condition that makes it impossible to experience 3D movies or television. This is hardly a deal-killer, but one wonders how the spread of stereo music technology would have been affected if 10% of listeners had not been able to appreciate the difference.

Honestly, where 3D will likely establish its foothold in the living room is not with sports or movies, but with video games. Video gamers are already accustomed to buying expensive high-tech peripherals. They are used to content designed for one person, one screen. And when designed properly, 3D does not just add visual excitement to a game, but actually affects and enhances the gameplay itself.

So will 3D television lure viewers away from legitimate free Internet video, and from illegally pirated video files? It is too soon to tell. But there is a key difference to this strategy, as compared to some of the previously unsuccessful responses to piracy and the Internet. As with Steve Jobs and the iPod, 3D TV producers are offering consumers something new and exciting that, once the issues are worked out, will enhance their news and entertainment experiences. Rather than treating customers like the enemy, they are approaching customers as customers. And iTunes proves that people are more than willing to pay for their media, as long as they can experience a clear benefit.

Well, it’s finally 2010. As you know, Pan Am currently offers commercial flights to all the major space stations; every family has pet dolphins in their specially-converted cetacean-friendly homes; computer graphics have finally hit 16-bits, displayed on futuristic CRT monitors; and the United States and the Soviet Union are on the brink of war.

Okay, so maybe the film 2010, Peter Hyams’ 1984 sequel to Stanley Kubrick’s 2001: A Space Odyssey, got a few details wrong. And it’s not really on the same level as its classic predecessor. But it’s still a fun, smart, great-looking sci-fi adventure that deserves a second look.

Once we’ve drawn our lines, once we’ve made it absolutely clear that 2001 continues to stand absolutely alone as one of the greatest movies ever made, once we have freed 2010 of the comparisons with Kubrick’s masterpiece, what we are left with is a good-looking, sharp-edged, entertaining, exciting space opera…

Just as the year 1984 spurred interest in the novel 1984, so 2010 has created renewed interest in the film – Google searches for “2010 movie” have spiked sharply in the last two months, and the film is up 413% in popularity this week on IMDb.

To satisfy those succumbing to the current 2010 mania, I spoke to Zoic Studios commercial creative director Leslie Ekker, who was a member of the miniatures crew for the film.

“The first thing we had to do on 2010 was to build the spaceship Discovery from 2001. Unfortunately, in England, where the ship was built and shot and stored, an accountant had decided years before not to pay for the storage of the ship anymore; drew a line through a number on a list; and all the models were destroyed. There was literally nothing surviving. But we had to reproduce the ship as exactly as possible so that people would recognize it. And the only way we could do it – none of the drawings existed, no information, no photographs—was to rent a laserdisc of the film; freeze-frame it; take photographs of those frames; enlarge them to the point where they were useful for me; and do overlays, tracing the edges of all the details onto a drawing. Then I did a perspective analysis, and created six orthographic views that could be used as construction drawings. I had to do that with the entire Discovery, front-to-back, in order to be able to reproduce it.

“The production was in touch with the original people. In fact, all the visual effects were being produced by Doug Trumbull, who was one of the principal people on the team for 2001. He knew all the people involved, and got in touch with the right folks — but nobody had anything left. Pretty sad, considering what a classic 2001 was.

“So first I had to do these construction drawings, and it was challenging, because the shots [in the original 2001] are actually fairly scarce. There aren’t a lot of things from different angles, and of course the image quality was pretty poor. So there was a lot of interpretation. Ultimately, we got it pretty close.

“We made two different scale models of the Discovery, and one large-scale model of the front end of the ship. One model was about 10 feet long, much smaller than the original ones they built in England. They built huge miniatures due to the shallower depth of field of lenses in those days! Ours was designed to rotate, as well. In the scenes where they come upon the Discovery still orbiting, it’s tumbling end-over-end because of precession, the physical force on a rotating body (its gravity carousel) that is 90 degrees to any other force applied to it.

“The Discovery is dusted down with sulfur, because it’s orbiting around [Jupiter’s moon] Io, which has sulfur volcanoes that erupt into space. So that got stuck to the body of Discovery, it’s all sulfur yellow — so naturally our models were painted yellow, unlike the original.

“The Boss Film model shop supervisor was Mark Stetson, an Oscar-winning feature film VFX supervisor now. In his model shop in Marina Del Rey, we built a lot of different miniatures for the movie. Some were of the Leonov, the Russian ship, and the Discovery; but also of the moons’ surfaces. We built a few models that were pretty interesting.

“One of the ideas they explored in 2010, that actually had a lot of controversy surrounding it, was the concept of life under the ice on Jupiter’s moon Europa. They have since found there is most likely liquid water under that ice, and it possibly could have enough warmth to support life; and it may actually harbor life, maybe in bacterial form. It’s hard to say. That was kind of interesting. One of my jobs on the movie was to help make that life.

“We built the surface of Europa, a small section of it, and filled it with some water, sections of ice, and strange looking plants. We used Madagascar palms for some of the plants, because they’re so strange looking already; they look quite alien. In the shallow water of the pond, built into the tabletop of the model, we had some invisible rigging that could move some very fine feathery plants in an intelligent way, as if they were motivated, under the surface of the water. That’s what you see in the film when you see something moving under the water — it’s actually a very fine dried plant getting pulled around by an invisible rig.”

The design of the Russian spaceship, the Leonov, had to differ from the “American” design of the Discovery. “The common wisdom was that Russian technology looks heavier, and feels clunkier, and has more exposed detail, kind of a brutal design style. [Legendary industrial designer] Syd Mead was employed to design the Leonov, and did some beautiful drawings.

“Peter Hyams, the director of the film, scrutinized the drawings very closely to make sure every single line from the drawing was on our model; to the point where, in a perspective construction drawing, if a sketched line ran off the corner of an object, he wanted a little wire glued onto the object to represent that line. It was kind of strange, but we did it.

“I spent about six weeks just building plumbing in the hub of the rotating section. If you look carefully at the Leonov, there’s this really intricate rat’s nest of pipes of all different sizes, weaving in and out and going off in different directions. And there was one on each side, so they had to match. I had to make matching sets of this very intricate piping, melting and bending pieces of plastic model piping by hand. It took weeks and weeks to do. Then I had to make a miniature version, half that size, for the smaller scale Leonov. It was a lot of fun, but it was also really challenging.

“One of the other things I did was to create the Cyrillic typeface you see on the side of the Leonov, and the other graphics that go on the ship. We had a translator create all the different words we needed, and then went to a type house and had wax transfers made — these were rub-downs we used to use in the graphic days before computers. I had sets and sets of them made in both the different scales, applied them to the ship, and then we painted them into the overall paint scheme of the ship. It’s the only time I’ve had to work in Russian!

“There’s a sequence in the film where the Leonov has to execute an aerobraking maneuver. That’s when a spacecraft just grazes the outer atmosphere of a planet, using aerodynamic friction to slow itself down, rather than burning fuel. It does this with a device called a ballute, which is a half-balloon half-parachute. We had to make ballutes that were deployed from the core of the aft-end of the Leonov, and they were big inflatable airbags — gas bags, really. I had to develop a way to create airtight bags that were of a very specific shape. The surface pattern on them looked like some kind of fiber-reinforced textile. We had to be able to stow them in a very small volume, from which they would inflate very quickly to a certain size on camera. And then we made a separate set of those same ballutes that were fully inflated to a rigid shape.

“We also needed to make another set of ballutes, coated with pyrotechnic powder, and light them on fire, send them down a wire and film them, to be composited with the rest of the spacecraft for the actual moments of high friction and heat. So it was quite a project, and I was assigned the task of designing and producing these things.

“I had to learn pretty fast how to make airtight structural bags out of very tough, heat-resistant materials. I used very thin Mylar, like space-blanket material; and thin double stick tape to make the seams. I made screen prints of the graphic pattern on the surface. And we ended up using a leaf blower to inflate them. Leaf blowers are great, because they pump huge volumes of air at low pressure. You can inflate something very large without a lot of force behind it, so when it reaches the end of its inflation capacity it doesn’t burst a seam. After about five weeks of effort, that actually worked.

“Then we set about sculpting the rigid versions, which were just foam sculptures that were hard-coated, and painted and stenciled with the same graphic pattern as the airbags. Then we made copies in fire-resistant epoxies, in order to pyro-coat them and do the actual burning sequences. All this work was done at Boss Films’ Glencoe model-making facility, where there’s nothing but condos now. In those days Glencoe was all shipyards and industrial facilities; that’s all gone now.”

Ekker remembers 2010 as a fun, if challenging, experience. He also related an anecdote on how his work on the film helped him in another way:

“When you’re in the union, you have a card in a file that tells what your specialties are. And in the union system, if a model shop is putting together a union crew, they have to just call the union and say ‘send me five model makers,’ and hope they get good people. A lot of people, who say they’re model makers, really are not model makers.

“The workaround was, you would go and request someone who had a skill that was very specific to that person. A lot of us had skills that were very unique-sounding, but they were legitimate, because we had to be able to do the skill. After 2010, my skill card said, “pneumatic inflatable structures,” and “foreign language typesetting for model making” — skills so esoteric, it could only be me. So if, say, someone wanted to hire me, they could call up the union hall, and say “I need a guy who can make an airbag,” and they’d send me up!”

Actors Christopher Shyer and Morena Baccarin on the greenscreen set of ABC’s V; the virtual set is overlaid.

Visual effects professionals refer to the chain of processes and technologies used to produce an effects shot as a “pipeline,” a term borrowed both from traditional manufacturing and from computer architecture.

In the past year, Zoic Studios has developed a unique pipeline product called ZEUS. The showiest of ZEUS’ capabilities is to allow filmmakers on a greenscreen set to view the real-time rendered virtual set during shooting; but ZEUS does far more than that.

Zoic Studios pipeline supervisor Mike Romey explains that the pipeline that would become ZEUS was originally developed for the ABC science fiction series V. “We realized working on the pilot that we needed to create a huge number of virtual sets. [Read this for a discussion of the program’s VFX and its virtual sets.] That led us to try to find different components we could assemble and bind together, that could give us a pipeline that would let us successfully manage the volume of virtual set work we were doing for V. And, while ZEUS is a pipeline that was built to support virtual sets for V, it also fulfills the needs of our studio at large, for every aspect of production.

“One of its components is the Lightcraft virtual set tracking system, which itself is a pipeline of different components. These include InterSense motion tracking, incorporating various specialized NVIDIA graphics cards for I/O, as well as custom inertial sensors for rotational data for the camera.

“Out of the box, we liked the Lightcraft product the most. We proceeded to build a pipeline around it that could support it.

“Our studio uses a program called Shotgun, a general-purpose database system geared for project shot management, and we were able to tailor it to support the virtual set tracking technology. By coming up with custom tools, we were able to take the on-set data, use Shotgun as a means to manage it, then lean on Shotgun to retrieve the data for custom tools throughout our pipeline. When an artist needed to set up or lay out a scene, we built tools to query Shotgun for the current plate, the current composite that was done on set, the current asset, and the current tracking data; and align them all to the timecode based on editorial selects. Shotgun was where the data was all stored, but we used Autodesk Maya as the conduit for the 3D data – we were then able to make custom tools that transport all the layout scenes from Maya to The Foundry’s Nuke compositing software.”
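The per-shot lookup Romey describes can be sketched as a single gathering step: given a shot code, pull the current plate, the on-set composite, the tracking data and the start timecode from one central store. The record fields and the in-memory dict below are hypothetical stand-ins; in production this would go through Shotgun's Python API rather than a dict:

```python
# Hypothetical in-memory stand-in for a Shotgun-style shot database.
SHOT_DB = {
    "V_101_0040": {
        "plate": "/plates/V_101_0040_bg.%04d.dpx",
        "onset_comp": "/comps/V_101_0040_comp.mov",
        "tracking": "/tracking/V_101_0040.dae",   # COLLADA camera track
        "start_tc": "01:02:03:00",
    },
}

def gather_shot_assets(code):
    """Return everything an artist needs to lay out a scene for one shot:
    the plate, the on-set composite, and the camera tracking data, all
    keyed to a common start timecode."""
    rec = SHOT_DB.get(code)
    if rec is None:
        raise KeyError(f"no such shot: {code}")
    return rec["plate"], rec["onset_comp"], rec["tracking"], rec["start_tc"]

plate, comp, track, tc = gather_shot_assets("V_101_0040")
```

With a query like this behind a layout tool, the artist never assembles paths by hand; the database is the single source of truth for what is "current."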

By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot.

Romey explains the rationale behind creating 3D scenes in Nuke. “When you look at these episodic shows, there’s a large volume of shots that are close-up, and a smaller percentage of establishing shots; so we could use Nuke’s compositing application to actually do our 3D rendering. In Maya we would be rendering a traditional raytrace pipeline; but for Nuke we could render a scanline pipeline, which didn’t have the same overhead. Also, this would give the compositing team immediate access to the tools they need to composite the shot faster, and it let them be responsible for a lot of the close-up shots. Then our 3D team would be responsible for the establishing shots, which we knew had quality constraints a scanline render couldn’t meet.

“By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot, because we didn’t have to provide the 3D support necessary. That’s how the ZEUS pipeline evolved, with that premise – how do we meet our client’s costs and exceed their visual expectations, without breaking the bank? Throughout the ZEUS pipeline, with everything that we did, we tried to find methodologies that would shave off time, increase quality, and return a better product to the client.
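The cost logic behind that split can be sketched as a toy routing function. The shot categories and relative cost units here are illustrative only, not Zoic's actual figures:

```python
def route_shot(shot_type):
    """Route close-ups to the 2D scanline pipeline (Nuke) and establishing
    shots to the 3D raytrace pipeline (Maya). Cost units are relative and
    purely illustrative."""
    if shot_type == "closeup":
        return ("nuke_scanline", 1.0)   # cheap: compositors render directly
    if shot_type == "establishing":
        return ("maya_raytrace", 4.0)   # expensive: full 3D render
    raise ValueError(shot_type)

# An episode that is mostly close-ups costs far less per shot on average
# than one that pushes everything through the raytrace pipeline.
shots = ["closeup"] * 8 + ["establishing"] * 2
avg_cost = sum(route_shot(s)[1] for s in shots) / len(shots)
```

Because episodic shows skew heavily toward close-ups, the average cost per shot falls well below the raytrace-everything baseline.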

“One of the avenues we R&Ded to cut costs was the I/O time. We found that we were doing many shots that required multiple plates. A new component we looked at was a product that had just been released, called Ki Pro from AJA.

“When I heard about this product, I immediately contacted AJA and explained our pipeline. We have a lot of on-set data – we have the tracking data being acquired, the greenscreen, a composite, and the potential for the key being acquired. The problem is when we went back to production, the I/O time associated with managing all the different plates became astronomical.

“Instead of running a Panasonic D5 deck to record the footage, we could use the Ki Pro, which is essentially a tapeless deck, on-set to record directly to Apple ProRes codecs. The units were cost effective – they were about $4,000 per unit – so we could set up multiple units on stage, and trigger them to record, sync and build plates that all were the exact same length, which directly corresponded to our tracking data.”
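The payoff of triggering the decks together is an invariant worth checking in code: every recorded plate must have the same frame count, and that count must match the tracking data. A small validation sketch (the data layout is hypothetical):

```python
def plates_in_sync(frame_counts, tracking_samples):
    """Return True when every recorded plate has the same frame count and
    that count matches the number of tracking-data samples, i.e. the
    decks really did start and stop together."""
    return (len(set(frame_counts.values())) == 1
            and next(iter(frame_counts.values())) == tracking_samples)

# Three decks triggered together on set: greenscreen, on-set comp, key.
counts = {"greenscreen": 240, "onset_comp": 240, "key": 240}
ok = plates_in_sync(counts, 240)
```

A check like this at ingest catches a mis-triggered deck before any artist wastes time re-syncing plates by hand.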

We found methodologies that would shave off time, increase quality, and return a better product to the client.

Previously, the timecode would be lost when Editorial made their selects, and would have to be reestablished. “That became a very problematic process, which would take human intervention to do — there was a lot of possibility for human error. By introducing multiple Ki Pros into the pipeline, we could record each plate, and take that back home, make sure the layout was working, and then wait for the editorial select.” The timecode from the set was preserved.

“The ZEUS pipeline is really about a relationship of image sequence to timecode. Any time that relationship is broken, or becomes more convoluted or complicated to reestablish, it introduces more human error. By relieving the process of human error, we’re able to control our costs. We can offer this pipeline to clients who need the Apple ProRes 422 codec, and at the end of the day we can take the line item of I/O time and costs, and dramatically reduce it.”
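The image-sequence-to-timecode relationship Romey describes is straightforward SMPTE arithmetic. A minimal sketch, assuming non-drop-frame timecode at 24 fps for illustration:

```python
def tc_to_frame(tc, fps=24):
    """Convert a non-drop-frame SMPTE timecode 'HH:MM:SS:FF' to an
    absolute frame number."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

def select_range(plate_start_tc, select_in_tc, select_out_tc, fps=24):
    """Map an editorial select (in/out timecodes) back to frame indices
    within the recorded plate, so the matching tracking-data frames can
    be pulled without re-syncing by hand."""
    start = tc_to_frame(plate_start_tc, fps)
    return (tc_to_frame(select_in_tc, fps) - start,
            tc_to_frame(select_out_tc, fps) - start)

# Plate recorded starting at 01:00:00:00; editorial selects a 2-second cut.
first, last = select_range("01:00:00:00", "01:00:10:00", "01:00:12:00")
```

As long as the on-set timecode survives into editorial, this mapping stays a pure calculation; it is only when the timecode is stripped that human intervention (and human error) re-enters.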

Another important component is Python, the general-purpose high-level programming language. “Our pipeline is growing faster than we can train people to use it. The reason we were able to build the ZEUS pipeline the way we have, and build it out within a month’s time, is because we opted to use tools like Python. It has given us the ability to quickly and iteratively develop tools that respond proactively to production.

“One case in point – when we first started working with the tracking data for V, we quickly realized it didn’t meet our needs. We were using open source formats such as COLLADA, which are XML scene files that stored the timecode. We needed custom tools to trim, refine and ingest the COLLADA data into our Shotgun database, into the Maya cameras, into the Nuke preferences and Nuke scenes. Python gave us the ability to do that. It’s the glue that binds our studio.
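Because COLLADA is XML, the trimming and ingest Romey describes can be done with Python's standard `xml.etree.ElementTree`. A trimmed-down sketch: the tiny document, channel name, and trim logic below are illustrative, not the actual V tracking data:

```python
import xml.etree.ElementTree as ET

# A minimal COLLADA-like fragment: camera X-translation keys, one per frame.
DOC = """
<COLLADA>
  <library_animations>
    <animation id="camera_tx">
      <float_array count="6">0.0 0.5 1.0 1.5 2.0 2.5</float_array>
    </animation>
  </library_animations>
</COLLADA>
"""

def trim_track(xml_text, anim_id, first, last):
    """Parse the float_array for one animation channel and keep only the
    frames inside the editorial select [first, last], inclusive."""
    root = ET.fromstring(xml_text)
    node = root.find(f".//animation[@id='{anim_id}']/float_array")
    samples = [float(v) for v in node.text.split()]
    return samples[first:last + 1]

keys = trim_track(DOC, "camera_tx", 2, 4)   # frames 2..4 of the select
```

Trimmed channels like this can then be written into Maya cameras or Nuke scene preferences, which is exactly the kind of glue work that makes Python hard to replace.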

“While most components in our pipeline are interchangeable, I would argue that Python is the one component that is irreplaceable. The ability to make changes iteratively, on the fly, during an episode could not have been deployed and developed using other tools. It would not have been as successful, and I think it would have taken a larger development team. We don’t have a year to do production, like Avatar – we have weeks. And we don’t have a team of developers, we have one or two.

While most components in our pipeline are interchangeable, Python is the one component that is irreplaceable.

“We’re kind of new to the pipeline game. We’ve only been doing a large amount of pipeline development for two years. What we’ve done is taken some rigid steps, to carve out our pipeline in such a way that when we build a tool, it can be shared across the studio.”

Romey expects great things from ZEUS in the future. “We’re currently working on an entire episodic season using ZEUS. We’re working out the kinks. From time to time there are little issues and hiccups, but that’s traditional for developing and growing a pipeline. What we’ve found is that our studio is tackling more advanced technical topics – we’re doing things like motion capture and HDR on-set tracking. We’re making sure that we have a consistent and precise road map of how everything applies in our pipeline.

“With ZEUS, we’ve come up with new ways that motion capture pipelines can work. In the future we’d like to be able to provide our clients with a way not only to be on set and see what the virtual set looks like, while the director is working — but what if the director could be on set with the virtual set, with the actor in the motion capture suit, and see the actual CG character, all in context, in real-time, on stage? Multiple characters! What if we had background characters that were all creatures, and foreground characters that were people, interacting? Quite honestly, given the technology of Lightcraft and our ability to do strong depth-of-field, we could do CG characters close-to-final on stage. I think that’s where we’d like the ZEUS pipeline to go in the future.

“Similar pipelines have been done for other productions. But in my experience, a lot of times they are one-off pipelines. ZEUS is not a pipeline just for one show; it’s a pipeline for our studio.

“It’s cost-effective, and we think we can get the price point to meet the needs of all our clients, including clients with smaller budgets, like webisodes. The idea of doing an Avatar-like production for a webisode is a stretch; but if we build our pipeline in such a way that we can support it, we can find new clients and provide them with a better product.

“Our main goal with ZEUS was to find ways to make that kind of pipeline economical, to make it grow and mature. We’ve treated every single component in the pipeline as a dependency that can be interchanged if it doesn’t meet our needs, and we’re willing to do so until we get the results that we need.”

The technology available for producing computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest quality product. But increasingly, productions are turning to realtime rendering engines, inspired by the video games industry, to aid in on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (read about ZEUS here).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy-lifting in CG production, and put it into a game engine. It allows for quick prototyping, and allows us to make rapid changes on-the-fly. We found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some levels of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be on a photo-realistic level, it’s a great tool and a cost-effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing real-time, right there. Now if you combine that with CG characters that can be realtime, in-engine rendered, you then can have live action actors on greenscreen and CG characters fully lit, interacting and rendered all in realtime.
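The realtime greenscreen composite Sternlicht describes can be illustrated with a drastically simplified, hypothetical keyer. Production keyers handle spill suppression, soft mattes, and motion blur; this sketch only shows the core idea of replacing green-dominant pixels with the CG back plate.

```python
import numpy as np

# Toy chroma key (illustrative only, not Zoic's actual compositing code):
# pixels where green dominates are replaced by the CG back plate.
def simple_key(fg, bg, green_thresh=0.5):
    g = fg[..., 1]
    g_dominant = (g > green_thresh) & (g > fg[..., 0]) & (g > fg[..., 2])
    matte = (~g_dominant).astype(float)[..., None]  # 1 = keep foreground
    return matte * fg + (1.0 - matte) * bg

fg = np.zeros((2, 2, 3))
fg[..., 1] = 1.0            # pure-green screen everywhere...
fg[0, 0] = [1.0, 0.0, 0.0]  # ...except one "actor" pixel (red)
bg = np.full((2, 2, 3), 0.2)  # flat CG back plate

out = simple_key(fg, bg)
print(out[0, 0])  # actor pixel survives the key
print(out[1, 1])  # green pixel is replaced by the plate
```

Doing this per frame at camera rate, driven by live camera and tracking data, is essentially what the realtime on-set composite amounts to, with far more robust keying math.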

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now, there’s been a threshold to producing full-CG episodic television. There has been a lot of interest in finding a solution to generate stylized and high-quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full-CG footage within a few weeks of production is very difficult to do right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines have Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We get your car model, transfer that to an engine, in an environment that’s lit and realtime rendered, within a day. We even hand you a camera that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000. I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of the very apt and adept gamers who could go in and break code, or would use say the Unreal 2 engine, to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, and Epic, and other game studios and publishers increasing the value of their product by creating tool sets to let you capture and produce quality game play — making cameras that are spline-based, where you can adjust lighting and cameras on-the-fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering; it’s also about realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames per second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime: we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps — all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.

“Some of these engines, like StudioGPU’s, are rendering out passes. We actually get a frame-buffered pass system out of the engine, so we can do secondary composites.
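The pass-based secondary compositing Sternlicht mentions can be shown in a few lines. This is a generic sketch of an AOV-style recomposite, not StudioGPU’s actual output format: the beauty image is rebuilt as a weighted sum of light-contribution passes, so any one pass can be re-graded in the composite without re-rendering.

```python
import numpy as np

# Hypothetical render passes from an engine's frame buffer (constant-color
# frames for clarity; real passes are full images).
h, w = 4, 4
diffuse  = np.full((h, w, 3), 0.30)
specular = np.full((h, w, 3), 0.10)
emission = np.full((h, w, 3), 0.05)

# Secondary composite: grade the specular pass up 50% without re-rendering.
beauty = diffuse + 1.5 * specular + emission

print(beauty[0, 0])  # each channel: 0.30 + 1.5*0.10 + 0.05 = 0.50
```

Because each pass is a separate buffer, a compositor can rebalance lighting, isolate reflections, or add grades per pass long after the engine has finished rendering.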

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and very much allow it to be much more artist-driven. I think in the next five years… It’s all going to change.”

In 2009 Jeff Suhy joined Zoic Studios, the visual effects house in Culver City, California. How the former A&R executive found himself working alongside the creators of spaceships for Battlestar Galactica and vicious monsters for Fringe is not only the story of one man’s career, but of the trajectory of the entire entertainment industry over the past three decades.

In the first part of this two-part interview, Suhy described the path of his career and how he came to Zoic as Creative Director – Digital Strategy. Here he discusses the current state of the record industry, and what the catastrophic changes there portend for the entertainment industry as a whole.

The entertainment industry is going to be very different in five years, but there will still be an entertainment industry. Do you think you got out of the music industry just in time? It seems like in five years there won’t be anything even remotely resembling a music industry.

I’ve been thinking that for about ten years. Nevertheless, it still seems to exist. I have a lot of friends who are trying to help shape the future of that business, and it’s certainly going to be different – the recording business, we’re talking about; it’s not the CD business anymore. It’s the recording artists, and distributing those artists, and subsidizing tour support to develop an artist. The quote-unquote “record companies” are going to do it – maybe agents.

Digital technology has certainly enabled a lot of bands to record and distribute themselves; some of the barriers to entry are gone, and it makes less of a case for the record business. They certainly can’t take 85% of the revenue from your sales anymore – but there’s not that much revenue [anyway]. Certainly the forces against them are strong, but there’s always going to be a need for artists to have help shaping and getting their message out there, and there’s gonna be someone to fill [that need].

The record industry can’t take 85% of the revenue from your sales anymore – but there’s not that much revenue anyway…

It won’t look like what it does probably now even, but there will always be a quote-unquote “record business,” just like there will always be a television business and there will always be a film business, even though those things are going to be changing pretty dramatically too.

And radio.

Mmm hm.

Didn’t a lot of what you were talking about with the corporatization of the music business have to do with radio – Clear Channel, Viacom?

Deregulation in the radio business allowed these companies to own tons of radio stations, and start to put on the pressure to homogenize music. If you’re a record company, and you want to get an artist out there, you have to work with Clear Channel if you’re going to have any success. You used to have to work with MTV. If you didn’t get a record on MTV back in the 80s and 90s, it was almost impossible to get a break and become huge. And now that stranglehold belongs to the Clear Channels and the big companies that own the space.

They’re becoming less powerful. That’s the good news, because people are finding music in other ways. They’re finding it through Pandora and referral technologies, iTunes, all these different ways to discover music. It’s fascinating to watch. Luckily I’m not in the middle of it anymore. I can watch it from the outside, and root for the forces of creativity over the forces of corporatization.

So what’s coming in the next five years as far as digital technologies related to digital marketing and advertising?

I have Netflix on my PS3, and I’m watching Lost right now on my PlayStation 3, streaming in high definition, glitch free. This was the big problem on the Internet all the years I was doing streaming media — there was this buffering, and pixelization, and poor quality. And at the end of the day people were like, “yeah, well, I’m never going to want to watch TV over the Internet, because it’s a crappy little small-screen experience; and I want a big screen, and I want great quality.”

And now, not only is there parity, but there’s instantaneous delivery, as opposed to waiting and buying a DVD, or waiting for your TiVo to record your show. You have the ability now to just get it.

And not only that, but you can interact with it. And that’s the future. Media over IP, on the big screen, and being able to interact with it. It’s pretty simple. It usually is – people always over-complicate things, but that’s the future.

And the mobile device — being able to have the same thing on your mobile device that you have on your big screen, so when you’re traveling you can just reach in and grab whatever show you want to watch on your iPhone or whatever it is that you have. That’s where it’s at.

That model of subscribing to content and not actually taking physical ownership of it is becoming more and more acceptable…

But how are they going to make money?

Good question! Maybe I’m being optimistic, but I feel like there’s a cycle that we’re about to go back through. Back in the early days of television, these shows would have brand integration right in the shows, where you would have the host of the show literally walk off to a set on the side and say, “have you ever thought of using Clorox…”

Exactly! You had these brands integrated into television in the early days, before they started creating commercial spots. And that was what paid for television.

These brands and products out there are always going to try to find a way to get exposure to their market. And when people are watching television over IP, if their demographic is all doing that, they have to find a way. Just like they are trying to find a way to get social media to work for them. It’s not an easy equation, but it’s being solved. Little by little things keep happening that get us closer to those advertising dollars and those brand dollars finding their way online. There are companies out there like Generate and other companies, that are working to create branded content that has a high level of quality.

I produced Bud TV for Anheuser-Busch, and that whole project was the first IP TV project where original content, which wasn’t an advertisement, was being developed for a brand. We created a whole bunch of shows. It was a great early experiment. It didn’t go so well, because of the age-verification, and the fact that with an adult beverage you had to be 21 and we had to use your driver’s license to verify you. Everyone was going to YouTube at that point. Traffic on the Internet is like water, it will flow around any kind of obstacle; and we put too big of an obstacle in front of it, so it never really took off. But it was the right idea, and that’s where it’s heading.

Brands are gonna associate themselves directly with TV shows, and production companies and development studios are going to be creating shows and getting ad dollar buy-ins in sponsorship form straight up front.

So that’s for television; and for movies, you’re going to have to pay for them, just like you do now. You just get them over the Internet. Like I’m doing with my Netflix subscription — I can watch shows on my PS3, but I’m paying a monthly subscription. TiVo, you have a subscription; Rhapsody, you have a subscription. That model of subscribing to content and not actually taking physical ownership of it is becoming more and more acceptable, whereas before that was really tough to swallow.

But it seems to me that all the differences between movies and television are based on how those media were originally delivered. Now that those delivery systems won’t exist, won’t the difference between TV and film cease to exist? Won’t you end up with a continuum of some things that are episodic, and some that aren’t, of different lengths?

I think the expectations and templates are breaking down. But people still want to have that Lost kind of episodic reality, or the Sopranos, where you’re following the story of these characters for years. The writers go away for several months and conjure the next season, and they come back with 20 more hours of this idea to share with their audience. That’s one methodology, and however that manifests itself, seven-minute episodes or hour episodes, that will be different content for different types of shows. Some will have multiple storylines happening concurrently, that you will be only able to experience online, where you’re able to click on characters or things within the show and get parallel storylines.

With film, it’s a different type of experience. It’s one complete story, that is digestible within an hour-and-a-half, two hours, and that’s just a different type of experience.

Will you ever go to a theater to see one, in five years, ten years?

I think you will probably with 3D, something like that. There will be different up-sells. Like there’s this new cinema in Pasadena in the newspaper today, which is $29 a ticket. You have this full lounge recliner and a blanket and a pillow, and there’s a little table between you and the person you’re with, and you ring the bell and they bring you martinis. It becomes more of a whole experience, going out. That to me sounds very compelling, and makes me want to go out to a movie. That’s something I want to try.

With Avatar, the 3D showings are sold out, with a higher ticket price that people are willing to pay for a better experience. Otherwise, you can just watch it on your plasma screen when it comes out on TV in a couple of months, pay-per-view, whatever. These release windows are all going to be changing, where you have the theatrical release; the international release; the DVD release; then pay-per-view, then HBO, and then eventually it goes to network. All that’s going to compress and change.

You get 24 hours to watch your show — it’s The Man putting his thumb down on me.

Both the music industry, and the entertainment industry in general, are having tremendous trouble adapting copyright to the new digital age.

With regards to the stakeholders in the traditional media business, people always say to me, why don’t they just do this or do that, set up their own distribution system. The problem is this — there are the publishers; there are the record companies; the artists; the artist management; people who have master licenses, different sorts of rights to the music, publishing rights and what-not; and they all have to agree on a new model. And everybody wants a bigger piece of the future, and to be less [expletive deleted] than they have been in the previous version.

And everyone that has a piece of that pie wants a bigger percentage, because the pie is getting smaller, and because they feel they’re not getting what they’re supposed to get out of the deal. Until they can all agree, that pie gets smaller and smaller and smaller, as everyone clings to the traditional physical product rights realities.

It almost takes, like the Roman Empire, a complete collapse for it to become something different. As long as those systems are in place that define what the record business is, it’s never going to substantively change.

I’ve talked to a lot of different brands who don’t want to even talk to the record companies. They don’t want to have anything to do with it, because it’s this labyrinth of rights and issues, and everybody wants a ton of money for every little thing. Or they want a bigger piece of this, or control over that, and it’s just a mess.

That’s how the entertainment business evolved over time, with these different people having different elements of control; and now they’re all being forced to simultaneously make massive decisions about how this is going to change. No one can agree, and they’re never going to like each other very much because they’ve always been in conflict with each other, competing. The record companies were always the 800-pound gorilla, and now they’re calling for help; and people say “gee, we’ve got the big bully on the block down a little bit,” and nobody really wants to help them.

You’ve got these big live promoters – that’s where the action is now, is on the live scene – they’re the new center of power, these Live Nations, these companies that are signing Madonnas and people like that. They put them on tour, they make the real money there, and the record becomes a loss leader to generate interest in the live performance. People will spend $45 for a t-shirt for Kings of Leon at their live event, but they won’t spend $5 for the album. They’ll go get it off a file-sharing service for free. But if they have a live disk from the show they were at, they’ll spend $45 for that.

People still want music, they still want content, they still want media. But the systems in place to support the production and distribution of those things are not flexible enough to accommodate what consumers want. Rights restrictions, DRM — people don’t want that. Eventually, that has to go away.

It will only go away when the whole thing blows up. I want an MP3 of my song in my car, on my iPhone. I want to have it on my computer. I want to listen to it wherever I am and not have to think about compatibility between devices. I want movies in an AVI file, so I can watch them on any device anywhere. I don’t want to have to deal with the rights and crap. Like on DirecTV you get 24 hours to watch your show if you order it On Demand – that’s never going to work. It’s The Man putting his thumb down on me.

In 2009 Jeff Suhy joined Zoic Studios, the visual effects house in Culver City, California. How the former A&R executive found himself working alongside the creators of spaceships for Battlestar Galactica and vicious monsters for Fringe is not only the story of one man’s career, but of the trajectory of the entire entertainment industry over the past three decades.

In the first part of this two-part interview, Suhy describes the path of his career and how he came to Zoic as Creative Director – Digital Strategy. In the second part, he discusses the current state of the record industry, and what the catastrophic changes there portend for the entertainment industry as a whole.

So, you started out at the 128th best university in the country [Louisiana State University and Agricultural and Mechanical College].

Is that what it is? [Peals of laughter.] That’s awesome! Out of how many, 150?

I was a track athlete in high school and I was recruited by a number of schools. My only real criterion was that I go to a warm place, and the warmest place that recruited me was LSU. So I went to LSU on a track scholarship.

Where did you grow up?

Chicago area, suburbs of Chicago.

So what was it like going to the South?

It was great. I was born in Tallahassee. So my family is from the South, and we somehow found ourselves in Chicago, because my Dad was transferred a lot via work. … My goal in life was to escape the Midwest; and I really wanted to come west, but I really didn’t have any reasonable scholarship offers out of the West. So I went south into the heart of the beast. And I stayed there for five years.

I ran the college radio station there – I was music director, I should say. I ran it from the industry perspective, as opposed to the actual operation of the station. And I worked at a record store. We bought a bunch of imports, and I started to learn about all these independent and import artists, and started programming that stuff on the radio. We started working with some of the labels to bring the bands through Baton Rouge.

I discovered you could have a record store radio station, and you could promote music and actually turn an artist that no one had ever heard of into something that people actually wanted to see. These bands would come touring through the US, and would have a date in Atlanta, then they’d go to New Orleans, then they’d go to Houston, and maybe they would have a stopover in Baton Rouge for the night. What they’d discover was that the shows in Baton Rouge were bigger than the shows in the major markets… because we were promoting the artists on campus. We ended up creating a successful scene there.

This was the mid-80s, right?

The mid-80s, yeah – ‘84 to ‘88 would be the time frame. Then I started talking to SST Records, and they wanted to bring me out to L.A. I’ll tell you the whole story, even though I know zero of this story should end up on [the blog post.]

So I moved out to L.A. thinking I was gonna work for SST Records, and when I got here they were bankrupt. I had nowhere to work and nowhere to live. I had a couple of hundred bucks in my pocket. And my Dad said “you’re an idiot.” My uncle gave me a place to stay on the floor of his apartment. I was resolved to survive L.A., even though I was having a really hard time.

I took a job at Larry Flynt Publications, as marketing coordinator, because I found it in the newspaper the day I got here and realized I didn’t have a job. [Suhy describes his job censoring pornographic material for ads, with NSFW details.] That was the most glamorous part of that job.

My feeling was, where is the creativity going? I wanted to follow the creativity.

As you might imagine, I was pretty diligent while taking the money from that job — I think $18,500 a year was my salary — taking that money and surviving until I could get myself into the music business, which is why I came out here.

The heavens opened, and I ascended to A&M Records in a miraculous scenario that changed my life. I stayed there ten years, and became vice president of A&R there, during that 10 year period.

And then A&M was acquired by Universal, and they fired everybody including me, even though I was so great. I had about a year-and-a-half on my contract to figure out what I wanted to do with my life, which was fortuitous, because I didn’t have to work. So I spent a lot of time on the Internet. I was really into technology and computers; I had an Apple II Plus when I was in high school in ‘82-‘83. So I was always trying to figure out technology, write programs, and hack things.

Then Napster came along when I was on my hiatus, and I went a week without sleep; I was obsessed. And at the end of that week I realized … I was going in to music & technology.

So I found a couple of guys…, and collectively we started a company that was ultimately called Nine Systems. … We worked with all the entertainment companies, and we built a software platform over a period of seven or eight years; and that was ultimately acquired by Akamai… which is a pretty major tech company, in December 2006. I stayed there for two years, and then escaped the MIT-PhD-math world and came back into the entertainment business, which is where I am now at Zoic. To combine my vast production and content experience with my now vast technology experience, and find ways to help media companies solve the riddle of the digital media era.

Can you talk about what you’re doing right now?

Right now we’re working with ad agencies on everything from banner ads, to other basic web implementations for brands. We’re working with some online brands in the redesign of their web sites and rebranding efforts. We are working with game companies to develop new ways to market their video games to consumers. It’s all little pieces of a big puzzle.

We’re developing original IP right now, which is a product called Media OS. We’re very optimistic that’s something a lot of our clients are going to find very useful to manage and build online media experiences.

But why Zoic?

Good question. As I was thinking about what I wanted to do next, I met [Zoic Studios founders] Loni [Peristere and] Chris [Jones] and [CFO] Tim [McBride], and realized there is a kindred spirit here. There is a support structure here to have that entrepreneurial, “invent-something-new” environment, combined with a stable, thriving creative organization that is very client-focused and very flexible. It isn’t all rigid and CFO-driven — it’s very creative-driven. It has … a start-up kind of vibe, but it’s well-established. Zoic is trying to leverage “visual evolution” into the new age of digital media, and I saw that was a great fit for me, I could help that happen.

Nobody wanted to hear anything about technology; hopefully if you just close your eyes and litigate against it, it will go away, you know?

I spent many years of my life at A&M being very artist- and very creative-driven; creating media, understanding pop culture, and understanding how people respond to media, how to market media; everything that was very media-oriented and entertainment-oriented. And I love that environment, everything being driven from a creative perspective. And I saw it dying in the late 90s, as corporate methodology was coming into a business that was once very naïve and gut-instinct-oriented. It was an artist-development environment: if you didn’t have a hit with an artist but everyone in the company believed in them, you would keep trying to foster their success, even though they wouldn’t necessarily have any immediate returns on their first record. I just love that environment.

The record business became sort of a “home-run-or-forget-it,” a hit business. And the economics changed; the value of the art changed; it became much more of a commodity, much more commercialized. It became much less appealing. My feeling was, where is the creativity going? I wanted to follow the creativity. I wanted to use my experience in developing artists…

I had a certain skill set, but I had never had a chance, because of the myopic nature of the record business, to be able to use my technology background and interest in technology, because [the industry] was very phobic. Nobody wanted to hear anything about it; hopefully if you just close your eyes and litigate against it, it will go away, you know? I was doing all kinds of interesting stuff in technology, and it was not a receptive environment to that type of thing.

I also got tired of going to clubs, and I got more interested in sitting in front of my computer. I knew there had to be a future with music online and content online, and I wanted to have a deeper understanding of that, to the root. So I dove from production A&R into software, and let my geek side come out. That was very rewarding, and I enjoy that business and enjoy software and Internet content and digital media, all that stuff. I love what’s happening right now, it’s a very exciting and dynamic time.

I see a lot of companies and people struggling with how to make sense of it, and companies trying to market their artists, or market their media, their brand – I know where these people come from because I was there. It’s tough to wrap your head around these new models. I enjoy combining the new sensibility and contemporary thinking in digital media with an analog state of mind, which used to be, and still is to some degree, the prevalent way of thinking in the media business.

The best way to do that was to start a company, and develop this software that nobody had and which became really valuable, and was purchased for $160 million by Akamai. I did time at Akamai, which was fascinating, because then I got really deep into the technology. But I also discovered I don’t really want to go there, that’s not really where it’s interesting for me, it’s too much; and I needed to find a place that had an understanding of both [creativity and technology], and that’s why I’m at Zoic. It’s a company that embraces technology but has a traditional understanding of and adoration of creativity. Understanding those things is the future, and I’m in the future now, that’s why I’m here.