Tag Archives: cgi

My final blog post will be on the 1999 animated film The Iron Giant. It’s one of my favourite movies ever, and I remember watching it over ten times as a kid. Produced by Warner Bros. Animation, it was directed by Brad Bird, who later went on to write and direct other animated features such as The Incredibles and Ratatouille.

The movie is based on the book The Iron Man by Ted Hughes, who unfortunately did not get to see the final version, as he passed away while it was still in production. The movie is set in America in 1957 and deals with themes like Cold War paranoia, weaponry and innocence. The core of the movie, as Bird put it when pitching the idea to Warner Bros., was that the giant was a gun with a soul.

The Iron Giant uses both traditional and computer animation, and it was animated like an assembly line. Bird did not follow the then-current mode of feature production when assigning animators. The long-standing practice at Disney had been to assign a specific character to one animator, so that a supervising animator was responsible for drawing only one character. Bird instead played to each animator’s strengths and assigned them entire scenes based on emotion or action, regardless of which characters appeared.

As for the giant, they used computer animation to create him, as CGI would give him mass and solidity and also the impression that he came from a different place. Bird says that the “separation between the 2D-animation and the CGI is something that helped establish the fish-out-of-water facet of the story.”

As Bird did not want the giant to look so perfect that it lost the hand-drawn feel, which creating it purely in CGI would do, the team spent months creating a computer program that wobbles the giant’s lines, making them look as if they had been drawn traditionally in 2D. Existing special software was also extended and modified for tasks such as aiding the shading of the giant, varying the lightening and darkening of some frames, and altering grain patterns so that the giant would sit realistically in the 2D animated world.
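To give a feel for the idea, here is a toy sketch of a line-wobble pass. This is purely my own illustration of the concept, not ILM’s or Warner Bros.’ actual program; the function name, amplitude and seeding scheme are all assumptions. The trick is that every frame gets a different but reproducible jitter, which mimics the “line boil” of hand-drawn animation:

```python
import math
import random

def wobble_line(points, frame, amplitude=1.5, seed=0):
    """Return a jittered copy of a polyline's points for one frame.

    Re-seeding from (seed, frame) makes each frame's jitter different
    but reproducible, like the line boil of hand-drawn animation.
    """
    rng = random.Random(seed * 1_000_003 + frame)
    wobbled = []
    for x, y in points:
        # Push each point a small random distance in a random direction.
        angle = rng.uniform(0, 2 * math.pi)
        r = rng.uniform(0, amplitude)
        wobbled.append((x + r * math.cos(angle), y + r * math.sin(angle)))
    return wobbled

# A straight line segment wobbles a little differently on every frame.
line = [(i * 10.0, 50.0) for i in range(6)]
frame1 = wobble_line(line, frame=1)
frame2 = wobble_line(line, frame=2)
```

The same idea could be layered with varying amplitudes per stroke so that heavier outlines boil more than fine detail lines.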

Canadian director James Cameron, well known for his use of cutting-edge visuals and effects technology, directed The Terminator (1984), his first groundbreaking sci-fi blockbuster in the visual effects arena. He pushed the boundaries of special effects at a time when Hollywood was experimenting with new visual effects techniques in films that fused the genres of science fiction and horror.

Seven years later, Cameron returned to direct Terminator 2: Judgment Day, which came back even bigger than before in terms of CG. It was the first film to feature a computer-generated main character, and the VFX were top notch for the period. Not only was there a CGI Terminator, it also morphed and regenerated body parts, and could turn into a mercury-like liquid metal that seeped through tiny cracks. The movie paved the way for the VFX-laden movies that followed.

Most of the effects were provided by ILM, and the creation of the visual effects took 35 people altogether, including animators, computer scientists, technicians and artists. Production took ten months, for a total of 25 man-years, and despite all that time, the CGI amounted to only about five minutes on screen. The work paid off, though: the visual effects team won the 1992 Academy Award for Best Visual Effects.

For the scene featuring Sarah Connor’s nuclear nightmare, the team from 4-Ward Productions constructed a cityscape of Los Angeles using large-scale miniature buildings and realistic roads and vehicles. After studying actual footage of nuclear tests, they simulated the blast by using air mortars to knock over the cityscape, including the intricately built buildings. 4-Ward also created a large layered painting of the city, augmented with a radiating blast dome and disintegrating buildings created with an Apple Macintosh program called Electric Image. They also contributed a number of shots showing molten steel spilling out of a trough onto the floor, and used real mercury directed with blow dryers to create the eerie shots of the shattered T-1000 pieces melting into droplets and running back together.

Davy Jones appears as the antagonist in the second installment of the Pirates of the Caribbean series. He is completely CGI, and everything about him is so believable it’s crazy! Of course, the team responsible for this had to be none other than Industrial Light & Magic.

The production shot real actors on set and digitally replaced them. To do this, each actor was scanned and modelled, and wore a motion capture suit that enabled them to be replaced in post-production. ILM could not rely on traditional MoCap or hand animation because of multiple issues: traditional capture has to be done in special studios with multiple cameras, and the cameras and tracking markers are expensive, specialized equipment used only in a calibrated environment. The data also needs extensive clean-up, as the stream contains both noise and errors. The whole process is complex to set up, expensive and highly specialized, so it wasn’t used. Instead, ILM created an innovative new system called Imocap that allowed on-set and on-location motion capture, eliciting the most believable look and performance possible out of actor Bill Nighy.

He wore a pair of grey ‘pajamas’ with reference dots placed around the suit and his face, and his performance was captured entirely on set as he interacted with the other actors. This improved the other actors’ performances, since they had someone ‘real’ to interact with, and it also gave the animators a highly detailed reference.

Being ILM, they made a breakthrough with Imocap: they only had to film with a single on-set film camera instead of the multiple cameras required by traditional MoCap. A single camera removes many of the restrictions the motion capture process imposes, so with Imocap, motion capture could be done right on set. The approach is to model the actor’s range of motion, and then use an elaborate system to fit the range of possible motions the actor could make to the data from the single camera source.

Besides Imocap, the other challenge ILM faced with the character of Davy Jones was his 46 flopping tentacles. ILM wanted the tentacles’ curling and movement to reflect Davy Jones’s mood, not just bob around lifelessly, but they didn’t want an animator to have to manually manipulate each and every one frame by frame. To solve this, their programmers added a sort of inter-tentacle motor to move them around automatically. Mathematical expressions and/or keyframed motion fed to motors in the joints between the cylinders making up the 46 tentacles caused them to bend, curl, writhe and perform in life-like ways, while “stiction” kept the tentacles from sliding.

Because the computer knows what the actor’s limbs could do from any one frame to the next, it can rule out a huge number of mathematical possibilities and narrow in on a solution. Once the solution is constrained by this virtual range of possible motion, a single camera can produce a very powerful motion capture data stream. While the motion capture system worked extremely well, the lip sync was not done this way and was instead hand-animated.
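The core idea of constrained fitting can be shown with a deliberately tiny example. This is my own toy sketch, not ILM’s actual solver: one joint, a fake one-dimensional “camera”, and a brute-force search. The point is that limiting the search to the joint’s physically possible range is what lets a single 2D observation pin down the pose:

```python
import math

def fit_elbow_angle(observed_x, upper_len=30.0, fore_len=25.0,
                    angle_range=(0.0, math.radians(150)), steps=1000):
    """Toy single-camera fit: find the elbow angle, within the joint's
    physical range, whose projected hand position best matches the
    observed screen x-coordinate. Constraining the search to plausible
    angles is what makes one camera enough."""
    best_angle, best_err = angle_range[0], float("inf")
    for i in range(steps + 1):
        a = angle_range[0] + (angle_range[1] - angle_range[0]) * i / steps
        # Shoulder at origin, arm in the camera plane; the "projection"
        # here is just the x-coordinate of the hand.
        hand_x = upper_len + fore_len * math.cos(a)
        err = abs(hand_x - observed_x)
        if err < best_err:
            best_angle, best_err = a, err
    return best_angle

# If the hand appears directly below the elbow (x = 30), the only
# angle in range that explains it is roughly 90 degrees.
angle = fit_elbow_angle(30.0)
```

A real system would fit dozens of joints at once against full 2D tracks, but the principle of searching only the anatomically possible poses is the same.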

For the tentacles, an articulated rigid body dynamics engine was used to achieve the desired look. Each tentacle was built as a chain of rigid bodies connected by articulated point joints. This simulation was performed independently of all other simulations, and the results were placed back on an animation rig that would eventually drive a separate flesh simulation.
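A heavily simplified version of that motor-driven joint chain can be sketched in a few lines. This is my own illustration under stated assumptions, not ILM’s engine: each joint angle is a damped spring pulled toward a time-varying target, standing in for the “mathematical expressions fed to motors in the joints”, and a phase offset per joint makes a curl travel down the chain:

```python
import math

def simulate_tentacle(num_joints=8, frames=120, dt=1 / 24,
                      stiffness=4.0, damping=0.8):
    """Toy motor-driven joint chain: each joint angle is a damped spring
    chasing a sine-wave target. The per-joint phase offset makes the
    curl propagate down the tentacle instead of moving in lockstep."""
    angles = [0.0] * num_joints
    vels = [0.0] * num_joints
    history = []
    for f in range(frames):
        t = f * dt
        for j in range(num_joints):
            # The "motor" expression: a travelling sine curl.
            target = 0.5 * math.sin(1.0 * t - 0.6 * j)
            # Spring toward the target, damped so motion settles.
            accel = stiffness * (target - angles[j]) - damping * vels[j]
            vels[j] += accel * dt          # semi-implicit Euler step
            angles[j] += vels[j] * dt
        history.append(list(angles))
    return history

history = simulate_tentacle()
```

Running one independent simulation like this per tentacle, then baking the result onto a rig, mirrors the pipeline described above, with a flesh simulation layered on afterwards.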