A Breakdown of the Visual Effects Used in the Star Wars Franchise

In 1977, George Lucas and his team changed the way we saw and created movies forever. It was an epic challenge, because Lucas had envisioned an entire universe full of creatures, spaceships and technologies that didn’t exist yet. In fact, even the visual effects technology to portray this universe was in its infancy. So how did they do it?

Trilogy I: Episodes IV through VI, 1977 – 1983

In these first films, much of the visual effects work involved the physical creation of models, puppets and sets. Computer-generated imagery (CGI) barely existed in 1977, and in inventing alternatives, Lucas and his team transformed the visual effects industry and set the standard for years to come.
For the 1977 release of Episode IV: A New Hope, Industrial Light & Magic (ILM) was formed. During production, ILM, headed by special effects designer John Dykstra, constructed robots and spaceship models from scratch. Their goal was to achieve the new, higher level of realism Lucas envisioned.
When Lucas conceived of the Star Wars stories, there was no existing way to make a blockbuster filled with aliens and explosions in space seem real. In the late 1970s, filmgoers were lining up for grittier, more realistic films like Taxi Driver; no one expected a completely fantastical story set in space to look realistic. To achieve the sense of realism that would attract viewers, Lucas and his team combined animation, amazingly detailed miniatures and computer-controlled motion photography to generate their effects.
For example, in the first trilogy the exploding Death Star was created from cardboard and bits of titanium, and Yoda was an animatronic puppet. The scenes of Luke in Uncle Owen's sunken home were shot on location in Tunisia, a North African country on the Mediterranean Sea. And Luke's landspeeder appeared to glide across the sand thanks to mirrors mounted over its wheels, reflecting nothing but the surrounding sand.
Each object and location physically created for this trilogy was designed not only to look impressively futuristic, but to blend old- and new-looking elements. The goal was to make the places and things built for the film seem real by showing them not as shiny and new, but as well-used. This idea was new at the time of the first trilogy, but it has appeared in almost every sci-fi and fantasy film since, for one very important reason: it gives viewers a sense of ongoing lives, people and places, lending the fictional society the credibility of a universe that could actually exist and be lived in.

The Prequels: Episodes I through III, 1999 – 2005

Having already created the basic special effects tool kit for the first trilogy, the ILM team approached the second trilogy in a new way and shifted from using physical elements like animatronics, models and practical effects to digital effects.
In contrast to the physical world of the first trilogy, the tool kit for the prequels was almost entirely computer-based. In Episode I: The Phantom Menace, the landspeeder was back, but it and nearly all of the other effects were digitally created and added by a visual effects (VFX) compositing team in post-production. In other words, in the original trilogy ILM invented and physically built the places and things it wanted to show; in the second trilogy, it created those same fully rendered spaceships, cities and characters with CGI.
Understandably, since filmmakers and art directors were no longer limited by the physical world around them, they were free to create entire worlds digitally. This freed them both from the laws of physics and from the expensive, time-consuming process of building things by hand. However, the results weren't always positive, and this divorce from the physical world may be a key reason the prequels weren't as loved as the original trilogy.
For example, Jar Jar Binks was among the first all-CGI characters to hit the screen. The actor who played him, Ahmed Best, was one of the first to test the limits of the motion capture technology Lucas was using. For many viewers, these technologies were not yet advanced enough to seem utterly real, which may be one reason audiences did not connect with Jar Jar as much as they had with earlier alien characters. As CGI characters have progressed, these problems have largely disappeared, and we have the Star Wars franchise to thank for the stunning realism of characters in films like Avatar and The Lord of the Rings trilogy.
[3D animation created by Platt College student Ryan Ritterbusch.]

Episode VII: The Force Awakens: New Hope for Visual Effects

The new approach to the Star Wars franchise makes the most of both the traditional world of physically based effects and modern, high-end computer-based effects. Experience has shown that a careful, harmonious blend of live-action and digitally created sequences is ideal for that feeling of realism and connection with characters.
Therefore, it was no coincidence that images released to the public while The Force Awakens was in production showed actual life-sized models of the Millennium Falcon and X-wing fighters. The attention to detail in these old-school physical pieces creates a far more convincing and awe-inspiring setting for digital effects, which are themselves far more impressive than in the past.
[Spoiler alert]
In the opening scenes, in which Rey salvages the downed Imperial Star Destroyer, we see a flawlessly green-screened Rey interacting with a totally digital environment. She slides down the sand dune in live action, while the wreckage on the horizon is added digitally. As Finn navigates the base, he runs between physically built TIE fighters and hallways, but the environment is then extensively enhanced digitally for the final shots. Even the TIE fighter's take-off is a green screen effect built around a detailed model.
The burning wreck of the TIE fighter is live action with composited elements, and when the ship collapses into the sand, Finn is actually staring at a green screen. And while there was in fact a huge Millennium Falcon model, in many scenes the ship is composited into the final shot digitally. Many scenes even require multiple digital layers, as when the Millennium Falcon flees several TIE fighters across the desert. Most battle scenes in the film are enhanced both digitally and with pyrotechnics.
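The green-screen compositing described above comes down to "keying": detecting the screen color in each frame and filling those pixels from the digital plate. As a rough illustration only (a toy NumPy sketch, not ILM's pipeline; the `chroma_key` function and its threshold are invented for this example):

```python
import numpy as np

def chroma_key(foreground: np.ndarray, background: np.ndarray,
               margin: int = 40) -> np.ndarray:
    """Composite foreground over background, keying out strong greens.

    A pixel counts as "screen" if its green channel exceeds both the
    red and blue channels by more than `margin` (an invented threshold).
    """
    fg = foreground.astype(np.int16)              # avoid uint8 overflow
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    mask = (g > r + margin) & (g > b + margin)    # True where the screen shows
    out = foreground.copy()
    out[mask] = background[mask]                  # fill keyed pixels from plate
    return out

# Tiny 2x2 "frame": two green-screen pixels, an "actor" pixel, a dark prop.
fg = np.array([[[0, 255, 0], [200, 50, 50]],
               [[0, 250, 10], [10, 10, 10]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 128, dtype=np.uint8)      # flat gray digital plate
result = chroma_key(fg, bg)
```

Real keyers work in other color spaces and soften the matte's edges, but the core idea of a color-driven mask is the same.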
These scenes combine a digital shot of the models with a digital desert built from an extensive survey of a real-world desert location. Fully digital shots composited over a live-action plate are also essential to The Force Awakens's desert chase scenes. This method of surveying physical environments is used throughout the film, including for the snowy base, the woods, and the lake country of the UK where Maz Kanata's castle is located.
Digitally created characters are present in the new film as well; the same process of layering digital effects was used to create Maz Kanata, with actress Lupita Nyong'o performing in a motion capture suit while filming. Multiple points were mapped between her face and the CGI character, which allowed Maz's movements to look more natural. When asked about Maz, director JJ Abrams commented in an interview that, "Maz needed to look and feel and be just like one of those creatures. And given her mobility, and given the role that she played, it became clear that that was one creature where we should use the tool of CG. But the performance was all Lupita. She was there on set, and we did capture sessions afterwards as well…" This process was also used to digitally craft Supreme Leader Snoke.
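The point-mapping idea can be pictured with a small sketch: track how far each landmark on the actor's face moves from a neutral pose, then apply those offsets to corresponding points on the character's rig. This is a deliberately simplified, hypothetical illustration (the `retarget` function, the coordinates and the scale factor are all invented here; production facial retargeting is far more sophisticated):

```python
import numpy as np

def retarget(actor_neutral: np.ndarray, actor_frame: np.ndarray,
             character_neutral: np.ndarray, scale: float = 1.0) -> np.ndarray:
    """Move each character point by the actor's landmark offset, scaled.

    Each row is one tracked point (x, y); row i of the actor arrays is
    assumed to correspond to row i of the character rig.
    """
    deltas = actor_frame - actor_neutral          # how far each landmark moved
    return character_neutral + scale * deltas     # apply that motion to the rig

# Two illustrative landmarks: the corners of the mouth.
actor_neutral = np.array([[0.0, 0.0], [1.0, 0.0]])   # actor's face at rest
actor_frame   = np.array([[0.0, 0.2], [1.0, 0.2]])   # corners raised: a smile
char_neutral  = np.array([[0.0, 0.0], [2.0, 0.0]])   # wider mouth on the creature
char_frame = retarget(actor_neutral, actor_frame, char_neutral, scale=1.5)
```

Scaling the offsets lets an exaggerated creature face follow a human performance while keeping its own proportions, which is roughly why a point correspondence makes the digital character's motion read as natural.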

Visual Effects in the Star Wars Franchise Will Continue to Change the Industry