In 2006, the Russian movie Night Watch made a strong impression on American producers. Director Timur Bekmambetov’s innovative use of visual effects created a truly distinctive movie experience, enough so to land him his first American directorial gig with Wanted (which opened June 27 from Universal). In this adaptation of the comic book created by Mark Millar, a young “nobody” (James McAvoy) discovers that he is the son of a legendary assassin. He enters a mysterious fraternity where he is trained to become a perfect killing machine, a human being able to bend the laws of physics and gravity to his own advantage.

The movie required more than 800 visual effects shots, a massive effort initially supervised and produced by Jon Farhat. However, during post-production, Farhat fell seriously ill and had to be replaced by Visual Effects Supervisor Stefen Fangmeier. “I had directed Eragon, and, at that time, I was exploring new directing opportunities,” Fangmeier says. “I didn’t want to get into a vfx assignment that would tie me up for too long. This project was perfect in the sense that they only needed somebody for four months to come in and finish up.”

Stepping into a colleague’s shoes is never easy, but on Wanted, Fangmeier ended up facing many other challenges. “When I came in, most of the shots had already been assigned to a variety of vendors. The majority of the visual effects were being created by Bazelevs Studios, Timur’s own company in Moscow. They did almost 500 shots encompassing a very large range of effects. We also had Hammerhead, Hydraulx, PacTitle, Hatch FX, CIS Hollywood and Framestore in London. So, I had to delve into shots that someone else had conceived, with visual effects already well underway and with key creative people based in Moscow. Since Bazelevs was in charge of two-thirds of the shots, I primarily focused on the work that was being done there.”

Adaptation
The Moscow-based studio had produced some spectacular shots for Night Watch and Day Watch, but Wanted was its first American production. This new experience didn’t go without difficulties, as Hollywood doesn’t do things quite the same way as a Russian director employing his own company on his projects.

“They really have a strong talent pool there, and they also have the software, but they didn’t have any experience interfacing with a major studio, dealing with constant editorial changes, meeting a schedule of deadlines, etc.,” Fangmeier observes. “For instance, when I got involved, they only had 12 final composites out of 500 shots, and we were already close to the original deadline (the movie was initially due for release in March 2008). One of their issues was getting the director to buy off on concepts and shots. They had, to date, done many different versions of quite a few shots. At some point, somebody needed to make decisions and get the shots done. So, part of my job was to establish priorities, to select the 50 or 70 shots that could be completed each week, and to push them forward. For the remainder of the post schedule, we needed to finalize 45-50 shots per week in order to meet the deadline! It definitely put a lot of pressure on everybody… So, this project was a creative challenge on one hand, but on the other also a significant production challenge.”

After his first three days in Moscow assessing the production, Fangmeier requested that American Visual Effects Producer Steve Kullback join him in order to wrangle the production management side of things. VFX Producer Juliette Yager had already been brought on board by production. “If there is one thing I appreciate after 15-and-a-half years at ILM, it is the importance of very rigorous production management!”

Moscow's Bazelevs Studios created the sequence in which assassins ride a train rooftop for a clear view of their target in an office building. CG is used extensively. Courtesy of Bazelevs Studios.

Time Manipulations
Bazelevs used an NT-based pipeline that included SOFTIMAGE and Maya for 3D, RenderMan and mental ray for rendering, and Nuke or Fusion for compositing. The company was responsible for a great variety of shots: CG rats, CG bullets, digital doubles, the assassin POV effect, fluid simulations, etc. Some of its key effects included the many speed changes that Bekmambetov had envisioned for his movie. The shots were filmed at very high speed, and then digitally altered to modify the frame rate, sometimes going from normal to very slow to ultra fast and back to normal again, all within a single shot. 2D artists worked from templates that the film editors had designed in Avid. Using time-flow algorithms, they changed the speed of the shots while trying to retain the original image quality, a task not as easy as it sounds, as the time-warp process generates a lot of artifacts.

The speed changes allowed the camera to follow a bullet up to the point where it hit its target. Bazelevs created the bullet in Maya and added reflections and shadows on the environment to better integrate it. “In one shot, a bullet flies around Angelina Jolie’s head in full close-up,” notes Fangmeier. “When the bullet passes by her, you can see its shadow on her face, then her hair slightly moving, and finally her eye blinking. We also worked hard on the depth of field to keep the bullet realistically in focus, while Angelina would go from blurred to sharp to blurred again. Since the shot was in very slow motion, we needed all those subtle details to sell it…”
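The retimes described above, ramping from normal speed into extreme slow motion and back within a single shot, boil down to integrating a speed curve to map each output frame to a fractional source frame. Here is a minimal sketch of that mapping step, an illustration of the idea rather than the actual tools Bazelevs used:

```python
def remap_frames(speed_curve, n_out):
    """Map output frame indices to (fractional) source frames.

    speed_curve(f) returns the playback speed at output frame f:
    1.0 = real time, 0.05 = 20x slow motion, 3.0 = 3x fast.
    The source position is the running integral of the speed.
    """
    src, mapping = 0.0, []
    for f in range(n_out):
        mapping.append(src)
        src += speed_curve(f)
    return mapping

# A hypothetical ramp: normal -> extreme slow motion -> fast -> normal
def ramp(f):
    if f < 10:
        return 1.0    # real time
    if f < 30:
        return 0.05   # 20x slow motion ("bullet time")
    if f < 40:
        return 3.0    # speed back up
    return 1.0

frames = remap_frames(ramp, 50)
```

The fractional source frames are where the artifacts Fangmeier mentions come from: the in-between images have to be synthesized by optical-flow interpolation from the high-speed plates, which tends to break down on crossing edges and fine detail.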

Moving VFX
One of those stylized shots forms the climax of a sequence in which the two lead assassins travel on a city train rooftop in order to get a clear view of their target in an office building. “The actors inside the building were shot on an interior office set. We then added a CG exterior, a CG window, CG glass debris, and the city background. For the exterior shots on the train, James McAvoy and Angelina Jolie were filmed on a partial train rooftop set in front of a greenscreen. We then extended the set in CG to get a complete train. In order to create the cityscape, five cameras were bolted on top of a real elevated train traveling through downtown Chicago. The plates were then tiled together to create a cyclorama, and later combined with the foreground elements. We also added CG cars in the background, and created an entire bridge that the train goes under. It was a fairly complex combination of 2D and 3D elements.”

Another major speed change occurs during the opening scene, where a sniper gets shot in the head, with the camera following the bullet exiting the victim’s forehead in gory slow motion. The actor’s face was extracted from the plate and re-projected onto a CG head that was deformed by the CG bullet animation. Fluid dynamics made the CG blood follow the bullet’s motion. “The movie is gory, but those shots are so stylized that the audience understands this is not reality. After all, the story is based on a comic book and we had to preserve that quality.”
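The cyclorama Fangmeier describes, five train-mounted plates tiled into one cylindrical city background, essentially assigns each panorama longitude to the camera whose heading covers it. A toy version of that assignment step, with hypothetical camera headings rather than the production setup:

```python
def tile_cyclorama(cam_headings, pano_width):
    """Assign each column of a cylindrical panorama to the plate
    camera whose heading (in degrees) is closest to that column's
    longitude. Returns one camera index per panorama column."""
    def ang_dist(a, b):
        # shortest angular distance on the 360-degree circle
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    cols = []
    for x in range(pano_width):
        lon = x * 360.0 / pano_width
        cols.append(min(range(len(cam_headings)),
                        key=lambda i: ang_dist(lon, cam_headings[i])))
    return cols

# five hypothetical cameras spaced evenly around the train roof
cols = tile_cyclorama([0, 72, 144, 216, 288], 720)
```

In practice the seams between plates would then be blended and the whole cyclorama re-projected through the shot camera, but the partitioning idea is the same.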

One of the most memorable images in the movie features a character jumping through a window, with thousands of pieces of glass sticking to his body. In the longer shots, the actor was shot against greenscreen and a tracked CG double was animated through a CG window, setting off a rigid body simulation created in Maya. A close-up of the same action was produced using an entirely CG head.

A character jumps through a window with thousands of pieces of glass sticking to his body. The actor was shot against greenscreen and a tracked CG double was animated through a CG window. Courtesy of Bazelevs Studios.

Individual VFX sequences
Meanwhile, in America, other vendors were hard at work on specific, isolated sequences. “Hydraulx was responsible for a complex effect that was meant to go completely unnoticed,” Fangmeier says. “Colin Strause and his team built a CG weaving machine for a sequence in which our hero needs to slow time down in order to find an object that is attached to one of these tens of thousands of threads. At Hammerhead, Jamie Dixon supervised the car chase and created the scene in which a CG Ford Mustang flips over a live-action limo. As for Hatch FX, Deak Ferrand and his team did two extensive matte paintings for the prologue.”

In London, Framestore was called in to create the climactic train crash sequence that takes place on a bridge. In-house Visual Effects Supervisor Craig Lyn oversaw the challenging assignment. “There were several obstacles that we had to overcome,” he notes. “The first was the tight production schedule that we worked under. We completed over 117 shots in a three-and-a-half-month period. This included our pre-production phase for the build and look development of our digital assets, which included both a CG train as well as a full digital environment of a gorge. While pre-production was going on, we had to lock animation, which took place over a three-week period. Due to the compressed schedule, lighting of the shots occurred concurrently with the digital environment build, a less than ideal situation.”

The biggest challenge was a shot that ran more than 40 seconds: a train carriage falls down into the gorge, impacts a rocky outcrop, and then scrapes down the side until it comes to rest. At the end of the shot, the CG train has to seamlessly transition into a live-action plate of the carriage. The shot ran the full length of Framestore’s production schedule. By the end of the show, almost half of the crew was dedicated solely to delivering this one shot.
Framestore’s software pipeline was predominantly Maya-based for animation, lighting setup and digital environments. Vfx work, which involved dust, smoke, debris and rigid body dynamics, was done in both Maya and Houdini. On the rendering front, the team utilized a hybrid solution of both RenderMan and mental ray. The compositing work was done entirely in Shake.

Framestore had to work on a tight schedule to create the climactic train crash sequence that takes place on a bridge. The studio completed more than 117 shots in a three-and-a-half-month period. Courtesy of Framestore.

Full CG Environment
The team built the train from assets supplied by production, and then detailed it out based on reference photography. The digital environments turned out to be a much tougher challenge. “We had to create a fully CG gorge that was seen from any number of angles,” Lyn observes. “The gorge was built using low-resolution meshes in combination with higher-resolution ones for the more detailed areas. Texture maps and matte paintings were then projected onto the surfaces from multiple camera angles, and we were able to reuse many of the common angles for multiple shots. The break-off pieces for both the train and the gorge were a combination of several techniques. Hero debris was animated traditionally, while the smaller pieces were done using rigid body simulations from both Maya and Houdini. The deforming shapes of the bridge being ripped apart, and the train being squashed, were sculpted by our modeling crew, and then used as blend shapes.”

Framestore’s rendering pipeline was HDRI-based, with reflections and heavy ray-tracing done in mental ray. That data was then passed back into RenderMan for the final renders.
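The blend shapes Lyn mentions, sculpted “squashed” poses of the train mixed into the base model, are just weighted per-point deltas added to the base mesh. A minimal sketch of the idea, illustrative rather than Framestore’s actual rig:

```python
def blend_shape(base, targets, weights):
    """Linear blend-shape deform: offset each base point by the
    weighted deltas of the sculpted target shapes."""
    out = []
    for i, p in enumerate(base):
        x, y, z = p
        for t, w in zip(targets, weights):
            tx, ty, tz = t[i]
            x += w * (tx - p[0])
            y += w * (ty - p[1])
            z += w * (tz - p[2])
        out.append((x, y, z))
    return out

# two-point toy mesh and a hypothetical sculpted "crushed" pose
base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
crushed = [(0.0, -0.5, 0.0), (1.0, -0.5, 0.0)]
mid = blend_shape(base, [crushed], [0.5])
```

Animating the weight over time is what lets a rigid model appear to crumple progressively without running a destruction simulation for every frame.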

The team also built low-resolution digital doubles for Jolie and McAvoy. “The trickiest bit was Angelina’s hair, which was supposed to be long and flowing,” Lyn explains. “We didn’t want to go to the trouble of a CG hair build and groom, since she was only in a couple of shots. Instead, we did a simple bluescreen shoot in an alley behind one of our buildings of a wig on a broom handle, and then tracked it in 2D!”

This outrageous sequence concludes a movie filled with unique moments and imagery. Indeed, Fangmeier was repeatedly impressed and surprised by some of the concepts that Bekmambetov came up with. “Timur has a great imagination for this type of thing. There are some really good moments in the film where you feel: ‘Wow! This is a neat idea. I’ve never seen it before!’”

The tale of the Monkey King is as much a part of Chinese culture as Mickey Mouse is to American life, and the chance both to tell this story and have martial arts superstars Jackie Chan and Jet Li appear together in the same film for the first time was just too good an opportunity for director Rob Minkoff to pass up.

“The chance to interpret the character in this film, get Jet Li to play it and then kind of present this character to the West, it’s almost like the story of the movie,” says Minkoff, who directed both Stuart Little movies and co-directed The Lion King.

The Forbidden Kingdom (opening April 18 from Lionsgate) begins with American teen Jason Tripitikas (Michael Angarano), a martial-arts movie geek who is beaten up by local bullies and wakes up in mythical China. Tasked with returning a mystic weapon to the Monkey King, who’s been imprisoned by the Jade War Lord (Collin Chou) for more than 500 years, Jason is aided in his quest by kung fu master Lu Yan (Chan), the Silent Monk (Li) and the beautiful Golden Sparrow (Liu Yifei).

But bringing Forbidden Kingdom to life required a lot of work on a very short timetable, especially when it came to using visual effects to mix the film’s Hong Kong-style martial arts action with the storybook fantasy of the original myth.

Minkoff says he wanted the visual effects to evoke the feel of classic Hong Kong films while balancing the story’s storybook fantasy with realism. “The audience is a little more sophisticated, so some of the fog-machine effects with the dry ice obviously weren’t going to cut it with us,” he adds. “We obviously wanted something that was slightly more contemporary.”

Minkoff says the effects work ended up staying largely in Asia, thanks to Exec Producer Raffaella De Laurentiis, who was impressed by the high quality and low cost of some work done by a Korean house. “She thought that would be an interesting option for us,” continues Minkoff. “It’s a Chinese story, Asian-themed, and would require a sensitivity that might be a natural fit with Korea.”

Work ended up being spread around a number of vfx studios, with a trio of South Korean houses leading the charge: Macrograph, DTI and Footage.

But first, the film had to go through a short, eight-week prep and then into a tight, 101-day shooting schedule in China. Ron Simonson came onto the project about a month into shooting as the senior visual effects supervisor, and says the biggest challenge was getting up to speed on what was being shot on a set full of green safety pads and wire rigs, and making sure it would work for the visual effects artists later on.

“It was ‘how do we rig the wires and the pads so it works best for us’ and make sure we get all the pieces shot to replace all those things,” says Simonson. “Basically, it’s just kind of like, ‘OK, this is what you want to do. Can we maybe move this a little bit that way and move the camera a little bit that way, ’cause that’ll work better?’”

Simonson worked closely with stunt coordinator Woo-Ping Yuen and Minkoff on shots, using an on-set previs team to quickly test ideas for changes in or additions to scenes in the edit.

Shooting in China had the benefit of being authentic and costing much less than other locations, though there were some differences and issues with technical requirements. Minkoff says they had some concerns about the ability to hang high-quality greenscreens, which were solved by DP Peter Pau’s suggestion of covering all four walls of the stage with plywood and green paint.

Of course, having a good greenscreen stage also upped the ante. “The number of sequences that we ended up setting and shooting on the greenscreen stage just made the numbers shoot up,” confirms Minkoff.

Minkoff’s background in animation also helped meet the tight deadlines. “Rob being able to articulate what kind of effect we were talking about and then having it worked up there and dropped into the edit, into the Avid, and see how it worked, really expedited the process,” Simonson says.

Simonson was on set with a crew of about 10-12 visual effects artists, including previs and postvis artists who were essential in planning and executing shots. “Some of the bigger shots were created quite late in the game,” Simonson says. “We’d be looking at the edit and Rob would say, ‘We need some way to get from here to there,’ and we would draw it up and it would save a lot of time getting it to the animators.”

The film also called for a lot of diverse visual effects that made it difficult to, as Simonson suggests, achieve some economy of scale. “There’s big environment stuff, there’s water, we have fire, we have CG weapons, we have the chi energy effect,” he reveals. “All that separate stuff took a lot of R&D.”

Most of the work was split up among the three lead houses in South Korea, with Asia Legend in Hong Kong doing a lot of wire and object removal, Simonson says. Other contributing houses were Xing Xing in Beijing, Frantic Films of Vancouver, and Illusion Arts Digital, Stingray VFX, Svengali and Digiscope of Los Angeles.

Coordinating all this was difficult, Simonson says, as the locations of the houses crossed the International Date Line as well as language and cultural boundaries.

Working on a Hollywood film also had benefits for the Korean houses, who emphasized in bidding on the project their ability to work hard and produce quality work even if it meant the crew went without sleep for months on end, says Minkoff. While he says they definitely didn’t want anyone to work that hard, “it seemed like there was a sympathetic kind of attitude about their ambition, which was to break out of the Korean market into the larger Hollywood market,” he says.

While much of the vfx work was along the lines of wire and object removal, Simonson cites as a favorite the movie’s opening scene in which the camera swoops through the sky until the figure of the Monkey King appears to be standing atop a cloud. The Monkey King, played by Li, then proceeds to fight his opponents as they stand on the very tops of mountains protruding through the clouds.

“It took a lot of look development to get a level of realism, but also stay in the kind of storybook land vision that that scene is,” he says. “I opted to shoot real clouds for the fly in and all the mid-ground mountains, background mountains, mist and everything else was CG.”

A battle sequence set in a field of cherry blossom trees featured no real trees — just sticks in the ground with every blossom glued on, Simonson offers.

The film doesn’t entirely take place in ancient China, and replicating modern Boston — where Jason begins his journey — required a combination of real plates and still photography stitched together in Nuke to create the cityscape.

While Chan and Li are formidable weapons in their own right, the script also featured a powerful staff and the witch-like Ni Chang, played by Li Bing Bing, who uses a whip and even her own hair as a prehensile weapon in battle.

“That was again a lot of coordinating with the fight guys on set and coming up with ways to mimic the hair and the whip so (the actors) could react to it,” Simonson says. “We used ropes and lines attached to Bing Bing so that when Jackie’s grabbing it they could later replace the rope with the hair.”

The hair in particular was a difficult effect to work out. “Everyone was worried over whether the hair was going to work,” says Simonson. “We were trying to figure out alternative things to shoot in case the hair didn’t work while we were doing the R&D. But, fortunately, we got the test done early enough that the director was comfortable with how the hair was going to work and they went from five or six shots of the hair to 35 shots with the hair once they were comfortable with it.”

Simonson says that the various houses worked on about 900 shots overall, though with some sequences getting cut from the film the final on-screen tally is around 750. “About 25% of that is wire removal, rig removal. There’s a lot of background cleanup. In all these beautiful Chinese exteriors, there’s power lines and stuff in every single one of them, so all that stuff had to be removed.”

While shooting in China was different in many respects, Minkoff also says there were fewer hoops to jump through than when making a movie in North America or Europe. It also lent authenticity to the story of the Monkey King.

“That was the attraction of doing it,” he says. “If you have to go shoot China in Palmdale, what fun would that have been?”

Group of game-industry veterans think plotlines should be the priority

At the end of a long day working in Hell’s Kitchen, N.Y.P.D. detective Max Payne returns to find his home being ransacked by armed junkies. High on a new designer drug called Valkyr, they open fire on the cop, who stumbles over the dead bodies of his wife and newborn daughter. Killing the murderers doesn’t quench Payne’s thirst for revenge, and he sets out to find the sources of Valkyr and make them pay.

It sounds like the setup for a movie—and it is, now. “Max Payne” will be released in 2009, courtesy of 20th Century Fox, with Mark Wahlberg in the starring role. But the story didn’t start as a screenplay; it debuted seven years ago as the plot of a videogame and spawned two interactive sequels before making it to movie theaters.

Since its earliest days, the videogame industry has been enamored of Hollywood, and with turning big-screen stories into interactive worlds—with a range of success. Atari’s E.T. game is said to have ushered in the videogame industry crash of 1983, but blockbuster franchises have come out of Harry Potter, Shrek, and Lord of the Rings. More recently, Hollywood has been mining videogames (and their huge male fan base) for box office gold. The results have been just as mixed.

“Few games have translated well to film,” says Michael Pachter, videogame analyst for Wedbush Morgan Securities, in New York. “’Doom’ was a flop, as were the second ‘Mortal Kombat’ and ‘Super Mario Bros.’ movies. ‘Resident Evil’ has done well, as have the Lara Croft films, so I’d say it’s hit and miss.”

Now, some of the people behind “Max Payne” are trying to change that. In June 2007, Hollywood producer Scott Faye, owner of Depth Entertainment; Scott Miller, head of game developer 3D Realms; and Jim Perkins, former CEO of game developer-publisher Arush Entertainment, formed Radar Group. Rather than creating a game, then licensing it as a film, or vice versa, Radar will cultivate story lines—“storyverses” in company parlance—that transcend any one medium, whether linear or interactive. From there, they can spin out movies, videogames, comic books, and anything else that might emerge.

“I think that because we’re starting at the outset, both cultures will have an incredibly solid foundation for an ongoing evergreen franchise,” says Faye.

In addition to Max Payne, Perkins and Miller have helped develop highly successful game franchises including Duke Nukem, Prey, Doom, Blood, and Shadow Warrior. Together, their games have sold more than 35 million units globally. The pair have also founded, expanded, and sold three successful publishing companies — Arush Entertainment, to a foreign-distribution company in 2004; Gathering of Developers, to Take-Two Interactive in 2000; and FormGen, to GT Interactive in 1996 — generating a combined $1.5 billion.

They’ve invested some of those proceeds into Radar, which has three games in development: “Earth No More,” an environmental-disaster action story; “Prey 2,” an alien-invasion game with a Native American protagonist; and “Incarnate,” a horror story in which evil must be hunted down and imprisoned (whose concept came from Hollywood screenwriter Frank Hannah, who wrote “The Cooler”).

Usually, a movie based on a game gets green-lit only after the game has been released and built an audience. But Depth Entertainment is already shopping Radar’s stories around to studios—even though the games are still a few years away from hitting shelves. Merchandising and expanding an intellectual property from the get-go has been a long-standing Hollywood strategy, but the concept is still new in the game business, where all the focus generally remains on creating the game.

The typical game developer turns to a publisher to cover the costs of producing a game and subsequently surrenders ownership of that property. Once the game recoups the publisher’s loan, the developer begins to earn royalties. Radar is instead taking original ideas, partnering each with a game developer—it will work only with independent shops like Human Head Studios and Recoil Games—and then cutting distribution deals with publishers. The startup is working with retained adviser Gallipo Group, a new videogame venture-capital company, and expects to have $90 million in funding by this May.

By 2011, Radar plans on releasing three or four games per year, with eight to 12 projects in development at any one time. The franchises are expected to launch on PC, Xbox 360, and PlayStation 3 platforms and gradually expand to Wii, Nintendo DS, and PSP.

One thing the company principals won’t ever do is license a Hollywood property. They’ll leave that task to companies like Brash Entertainment, which is sinking all of its funding into movie properties like Saw, Speed Racer, and Space Chimps. Miller believes that’s a doomed enterprise. But without a Hollywood association to fall back on, Radar’s games will have to be stellar to win over fans.

High-end VFX for Method Studios in Los Angeles

Working in England, where he was Head of 3D Commercials for Framestore CFC, Andy Boyd developed a passion for creating high-end visual effects. His portfolio includes two well-received Rexona commercials that feature digital animals running wild in the urban jungle. One of these creatures was even featured on the cover of 3D World #89, along with an article highlighting Andy’s furring technique.

In the summer of 2007, Andy set off for Los Angeles and a new job working at Method Studios. In six short months, he has tackled several high-profile projects, ranging from a Hummer commercial to a Super Bowl ad for Bridgestone. In each project, Andy uses Houdini to help him meet tight deadlines while creating effects that can be easily revised in response to client and director feedback.

Particle Splashes

The first advertisement Andy worked on at Method Studios was a car commercial. A digital Hummer was being driven through a pool of water, and Andy needed to supply the splashes. Given the tight timeline, he decided to use Houdini’s particles instead of a full-blown fluid simulation, because the water didn’t need to settle. He used Houdini’s particle fluid surfacer to create the splash geometry, which he then rendered in Mantra using Houdini 9’s physically-based rendering.
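A particle-only splash of this kind, with no fluid solve because the water never has to settle, can be sketched as ballistic particles emitted where the wheels meet the water. A toy version in plain Python, with illustrative numbers rather than the actual Houdini setup:

```python
import random

def emit_splash(contact, speed, n=200, dt=1 / 24, steps=24, g=-9.8):
    """Emit n ballistic splash particles at a wheel/water contact
    point and step them forward. No incompressibility solve: each
    particle just follows gravity, which is enough for a splash
    that is never seen settling back into calm water."""
    random.seed(1)  # deterministic for repeatable renders
    pts = []
    for _ in range(n):
        vx = random.uniform(-1, 1) * speed * 0.3  # sideways spray
        vy = random.uniform(0.5, 1.0) * speed     # upward kick
        x, y = contact
        for _ in range(steps):
            x += vx * dt
            vy += g * dt
            y = max(0.0, y + vy * dt)  # particles die at the surface
        pts.append((x, y))
    return pts

splash = emit_splash((0.0, 0.0), speed=8.0)
```

In the real shot the resulting point cloud would then be skinned into renderable geometry by the particle fluid surfacer, as described above.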

Little Minx Project

Andy’s next project was a short film called As She Stares Longingly at What She Has Lost by director Phillip Van. Set to melancholy music, the film is part of the ‘Exquisite Corpse’ project launched by Little Minx in partnership with RSA Films. Andy worked with a team of talented artists to create an entire forest, a waffle cloud, a waterman, and vines.

For this project, actors were shot against bluescreen and all the environments were created digitally. The trees needed to be highly detailed in order to give an ominous feeling to the scene. Realizing that he would have to manage all this detail as efficiently as possible, Andy took advantage of Mantra’s Delayed Load feature. Trees were set up as scattered points on a grid, with parameters that would populate each tree with details such as branches and vines at render time.

As Mantra rendered the scene, the geometry needed for each section was loaded in. Then as Mantra moved on to another section, the geometry was removed and new pieces were loaded. This approach allowed Andy to put as much detail into the scene as he needed without any memory limitations. At one point in production, he created vines that would creep up the trees but this shifted the focus away from the characters and did not get used in the final film.
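The delayed-load pattern works because the renderer only needs a tree’s full geometry while that section of the image is being rendered; until then, a lightweight stand-in (the scattered point plus its expansion parameters) is enough. A schematic of the idea in plain Python, with hypothetical names rather than Mantra’s actual API:

```python
class DelayedLoad:
    """Stand-in for a delayed-load archive: heavy geometry is only
    generated when the renderer first touches its bounding box,
    then freed again before the next section is rendered."""

    def __init__(self, seed, build_fn):
        self.seed = seed
        self.build_fn = build_fn
        self.geo = None  # nothing resident until load() is called

    def load(self):
        if self.geo is None:
            # expand branches, vines, etc. on demand
            self.geo = self.build_fn(self.seed)
        return self.geo

    def unload(self):
        self.geo = None  # release memory for the next section

def build_tree(seed):
    # hypothetical expansion: detail level derived from the seed
    return {"branches": 100 + seed % 50}

# a whole forest costs almost nothing until trees are rendered
forest = [DelayedLoad(s, build_tree) for s in range(1000)]
loaded = forest[3].load()
forest[3].unload()
```

Only the trees intersecting the current render section ever occupy memory, which is what let the forest carry arbitrary detail without hitting memory limits.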

By choosing Mantra, Andy was able to add motion blur, depth of field and volumetrics without significantly impacting his rendering time. Many of his colleagues at Method Studios were used to adding depth of field later using compositing techniques and were impressed that he could combine camera effects in one render pass. All of the shadows were created using the new deep shadow technology so that raytracing would not be needed.

Going to the Super Bowl

The ads that play during the Super Bowl have become as much of a spectacle as the game itself. Super Bowl ads are scrutinized in the press and companies pay a lot of money to showcase their products on the big day. For Andy, a Bridgestone ad called Scream would be his introduction to the world of Super Bowl ads.
In this spot, Method Studios had to create a digital squirrel that almost becomes roadkill as he retrieves a fallen acorn. The squirrel, a number of other forest animals, and a female passenger all scream out in fear while the driver, confident in his Bridgestone tires, easily swerves around the frightened animal. Working under a tight six-week schedule, Andy would need to help create a number of digital animals, including the squirrel, which would be cut against a live-action squirrel. To make things even more challenging, the squirrel’s scream would be a close-up shot in HD that would leave nothing to the imagination.

Andy’s experience creating furry animals at Framestore CFC came into play, with one key difference. In England, he was rendering with RenderMan and had access to programming talent to build all the fur procedurals needed to achieve a realistic look. In Houdini, Andy needed to create his own system using the Mantra fur procedural. Luckily, the grooming features of the fur could be created using Houdini’s CVEX language instead of coding in C. This was a time saver because the CVEX didn’t need to be compiled every time a new feature was added.

Andy needed to add lots of detail to the squirrel because of the HD broadcast. He imported the animated squirrel into Houdini and fixed smoothing problems using Houdini’s procedural modeling tools. He then assigned and groomed guide hairs that would be used by the fur procedural to create the final fur. These curves were then run through a Wire dynamics simulation for added realism. The procedural was then used to generate about 1.5 million hairs – all at render time. Andy also used a CVEX shader to set up clumping and painted a number of different attributes on the squirrel’s skin to control the final look of the fur.
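The guide-hair scheme described above, a few groomed curves driving roughly 1.5 million render-time hairs, interpolates each final hair from nearby guides and then pulls it toward one of them by a clump weight. A two-guide toy version of that interpolation, illustrative rather than the CVEX shader itself:

```python
def grow_hair(root_x, guides, clump=0.5, segs=5):
    """Interpolate one render hair between the two guide curves
    nearest the follicle, then blend it toward the nearest guide
    by the clump weight (0 = even spread, 1 = fully clumped).
    Guides are lists of (x, y) points, one point per segment."""
    g0, g1 = sorted(guides, key=lambda g: abs(g[0][0] - root_x))[:2]
    d0 = abs(g0[0][0] - root_x)
    d1 = abs(g1[0][0] - root_x)
    # inverse-distance weight toward the nearest guide
    w = d1 / (d0 + d1) if d0 + d1 else 1.0
    hair = []
    for i in range(segs):
        x = w * g0[i][0] + (1 - w) * g1[i][0]
        y = w * g0[i][1] + (1 - w) * g1[i][1]
        # clumping pulls the interpolated point onto the nearest guide
        hair.append((x + clump * (g0[i][0] - x), y))
    return hair

# two vertical guide hairs at x=0 and x=2, five points each
gA = [(0.0, i * 1.0) for i in range(5)]
gB = [(2.0, i * 1.0) for i in range(5)]
hair = grow_hair(0.5, [gA, gB], clump=0.0)
```

Because the millions of final hairs are generated from the guides at render time, only the guide curves need to be simulated and stored, which is what makes the approach tractable on a commercial schedule.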

Andy’s confidence in the fur tools he created in Houdini helped him achieve such a high level of realism on such a tight schedule. Being able to have complete control without relying on programming talent showed that even smaller shops can create film-quality work while respecting client budgets.

Going to the Super Bowl

The ads that play during the Super Bowl have become as much of a spectacle as the game itself. Super Bowl ads are scrutinized in the press and companies pay a lot of money to showcase their products on the big day. For Andy, a Bridgestone ad called Scream would be his introduction to the world of Super Bowl ads.
In this spot, Method Studios had to create a digital squirrel that almost becomes road kill as he retrieves a fallen acorn. The squirrel, a number of other forest animals and a female passenger all scream out in fear while the driver, confident in his Bridgestone tires, easily swerves around the frightened animal. Working under a tight six week schedule, Andy would need to help create a number of digital animals including the squirrel that would be cut against a live-action squirrel. To make things even more challenging the squirrel’s scream would be a close-up shot in HD that would leave nothing to the imagination.

Andy’s experience creating furry animals at Framestore CFC came into play, with one key difference. In England, he had rendered with RenderMan and had access to programming talent to build all the fur procedurals needed to achieve a realistic look. In Houdini, Andy needed to create his own system using the Mantra fur procedural. Luckily, the grooming features of the fur could be written in Houdini’s CVEX language instead of in C, a real time saver because the CVEX code didn’t need to be recompiled every time a new feature was added.

Andy needed to add lots of detail to the squirrel because of the HD broadcast. He imported the animated squirrel into Houdini and fixed smoothing problems using Houdini’s procedural modeling tools. He then assigned and groomed guide hairs that would be used by the fur procedural to create the final fur. These curves were then run through a Wire dynamics simulation for added realism. The procedural was then used to generate about 1.5 million hairs – all at render time. Andy also used a CVEX shader to set up clumping and painted a number of different attributes on the squirrel’s skin to control the final look of the fur.
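The core idea behind render-time fur generation of this kind — instancing millions of final hairs from a handful of groomed guide curves, with a painted clump attribute pulling neighboring hairs together — can be sketched in plain Python. This is a conceptual illustration only, not Houdini’s CVEX API; every name and constant here is invented for the example:

```python
import random

def lerp(a, b, t):
    """Linear interpolation between two 3D points."""
    return tuple(a[i] + t * (b[i] - a[i]) for i in range(3))

def generate_hair(root, guide, clump_center, clump_weight, segments=8):
    """Grow one render-time hair by copying a guide curve onto a root
    point, then pulling it toward a clump-center curve. clump_weight
    plays the role of the painted skin attribute described above."""
    hair = []
    for s in range(segments):
        t = s / (segments - 1)
        # Offset the guide shape so it starts at this hair's root.
        p = tuple(root[i] + (guide[s][i] - guide[0][i]) for i in range(3))
        # Clumping gets stronger toward the hair tip.
        hair.append(lerp(p, clump_center[s], clump_weight * t))
    return hair

# One groomed guide curve (a gentle arc) and one clump-center curve.
guide = [(0.0, s * 0.1, 0.02 * s * s) for s in range(8)]
clump = [(0.5, s * 0.1, 0.0) for s in range(8)]

# Scatter roots on the "skin" and instance hairs at render time.
random.seed(1)
roots = [(random.random(), 0.0, random.random()) for _ in range(1000)]
hairs = [generate_hair(r, guide, clump, clump_weight=0.6) for r in roots]

print(len(hairs), "hairs,", len(hairs[0]), "points each")
```

Only the guide curves and painted attributes are stored; the million-plus final hairs exist solely during rendering, which is what keeps the approach tractable.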

Andy’s confidence in the fur tools he had created in Houdini helped him achieve such a high level of realism on such a tight schedule. Having complete control without relying on programming talent showed that even smaller shops can create film-quality work while respecting client budgets.

Flying Free

The tools and techniques used to create fur for the Super Bowl squirrel were quickly put to use on Andy’s next project. In a commercial developed for Washington Mutual Bank, digital hair was needed for a bald man who imagines driving a convertible along the coast as his hair grows back before our eyes. The tools were easily repurposed from the fur project, except that the guide hairs required more styling control and the wire dynamics were much more dramatic.
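Wire dynamics of the kind reused here can be approximated with a simple mass-spring strand. The following is a rough Python sketch under heavy simplifying assumptions (explicit Euler integration, a single strand, made-up constants), not Houdini’s actual Wire solver:

```python
# Minimal mass-spring model of one hair strand under gravity and wind.
# Explicit Euler with a pinned root -- a toy stand-in for a wire solver.
N = 10          # points along the strand
REST = 0.1      # rest length between neighboring points
DT = 0.01       # time step
STIFF = 400.0   # spring stiffness
DAMP = 0.98     # velocity damping per step

pos = [[0.0, -i * REST, 0.0] for i in range(N)]   # strand hangs down
vel = [[0.0, 0.0, 0.0] for _ in range(N)]

def step(wind):
    for i in range(1, N):                 # point 0 is pinned to the scalp
        force = [wind, -9.8, 0.0]         # wind along x, gravity on y
        for j in (i - 1, i + 1):          # springs to both neighbors
            if 0 <= j < N:
                d = [pos[j][k] - pos[i][k] for k in range(3)]
                L = max(sum(c * c for c in d) ** 0.5, 1e-9)
                for k in range(3):
                    force[k] += STIFF * (L - REST) * d[k] / L
        for k in range(3):
            vel[i][k] = (vel[i][k] + DT * force[k]) * DAMP
            pos[i][k] += DT * vel[i][k]

for _ in range(200):
    step(wind=2.0)

print("tip x after wind:", round(pos[-1][0], 3))
```

After a couple of simulated seconds, the free tip has been blown in the wind direction while the pinned root stays put — the same qualitative behavior the convertible shots needed, just at far lower fidelity than a production solver.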

“The flexibility of Houdini’s approach makes it easy to start from an existing solution instead of building every project from the ground up,” says Andy. “When working with tight deadlines, this gives us more time to focus on the creative needs of the project. For example, Jack Zaloga, a junior TD, was able to pick up the fur system from Scream and, right off the bat, was rendering hair blowing around without any prior fur/hair experience.”

These projects demonstrate how far commercial VFX has come. Commercials can be a real test-bed for tools and techniques that must achieve feature-film quality in the new HD world. Tight deadlines rule the day, and artist ingenuity is a critical part of the process. One can only imagine what Andy and Method Studios will pull off over the next six months.

“Call of Duty 4: Modern Warfare,” praised for its unique online multiplayer leveling system, was named overall game of the year and console game of the year. It was also honored as the top action and online game. It features an intense single-player campaign revolving around global terror as well as a diverse set of multiplayer modes.

“BioShock,” which had a record-setting 12 nominations, won awards for art direction, story development, music and sound.

“The Orange Box,” a compilation of five distinct games, was named computer game of the year. Its mind-bending physics puzzler, “Portal,” was honored for game design, character performance and game play engineering.

The awards were handed out by the Academy of Interactive Arts & Sciences at the D.I.C.E. (Design Innovate Communicate Entertain) Summit. Winners were selected by panels of engineers, designers and others in the industry.

Since 1996, the Interactive Achievement Awards have recognized outstanding games, individuals and development teams that have contributed to the advancement of the multi-billion dollar worldwide entertainment software industry. More than 160 titles were played and evaluated by members of the Academy’s Peer Panels. The panels are composed of the game industry’s most experienced and talented men and women, and each panel is responsible for evaluating one award category. Interactive Achievement Award recipients are determined by a vote of qualified Academy members. Award voting is confidential, conducted online and supervised and certified by VoteNet Solutions, Inc. The integrity of the system, coupled with a broad-based voting population of AIAS members, makes the Interactive Achievement Awards the most credible, respected and recognized awards for interactive entertainment software.

About the D.I.C.E. Summit:
The D.I.C.E. Summit is a high-level interactive entertainment industry conference that brings together the top video game designers and developers from around the world and business leaders from all the major publishers to discuss the state of the industry, its trends and the future. The three-day event will be held in Las Vegas, at the upscale Red Rock Resort, February 6-8, 2008. Online registration for the D.I.C.E. Summit 2008 is open now. Please visit http://www.dicesummit.org for more information and to register to attend the interactive entertainment industry event of the year.

About the Academy of Interactive Arts & Sciences
The Academy of Interactive Arts & Sciences (AIAS) was founded in 1996 as a not-for-profit organization dedicated to the advancement and recognition of the interactive arts. The Academy’s mission is to promote and advance common interests in the worldwide interactive entertainment community; recognize outstanding achievements in the interactive arts and sciences; and conduct an annual awards show (the Interactive Achievement Awards) to enhance awareness of the interactive art form. The Academy also strives to provide a voice for individuals in the interactive entertainment community. In 2002, the Academy created the D.I.C.E. (Design, Innovate, Communicate, Entertain) Summit, a yearly conference dedicated to exploring approaches to the creative process and artistic expression as they uniquely apply to the development of interactive entertainment. The Academy has over 12,000 members, with a board composed of senior executives from the major videogame companies, including BioWare/Pandemic, Electronic Arts, Epic Games, Insomniac Games, Microsoft, Nintendo of America, Sony, THQ and Ubisoft. More information on the AIAS, the Interactive Achievement Awards and the nominees can be found at http://www.interactive.org.