The Bee Movie

The Transformers universe was a multi-media bonanza long before the first live-action feature hit theaters in 2007. Hasbro launched a toy line featuring the Cybertronians in 1984, with animated adventures and comic books soon following. The marketable conceit behind these sentient robots was their ability to shift massive metal physiques between roughly humanoid and vehicular shapes. Director Michael Bay and Paramount took on the challenge of transforming “Autobots” and “Decepticons” – good and evil denizens from a faraway world who have taken their battle to Earth – into a franchise that has delivered mega box-office returns for five consecutive entries.

But with Bumblebee [named for a beloved good-guy Transformer], Paramount has veered away from the bigger-is-better mandate. The film is a prequel of sorts, revealing Bumblebee in a smaller form – as a VW Bug instead of a Chevy Camaro – as he existed in 1987, when Bee is scavenged from a junkyard by a young girl named Charlie [Hailee Steinfeld]. Together they must fend off investigations from pesky humans, as well as another bloc of evil Decepticons.

Laika animation studio founder Travis Knight [ICG Magazine, December 2018, Exposure], who recently made his animated directing debut with Kubo and the Two Strings, was recruited to helm his first live-action feature. Guild cinematographer Enrique Chediak was approached by Bumblebee’s producers. The Ecuadorian-born cinematographer, who studied film at NYU before winning at Sundance for shooting Hurricane Streets, also lensed Turistas (which utilized the services of underwater shooter Pete Zuccarini, who reteamed with him on Deepwater Horizon and Bumblebee) and 28 Weeks Later, and later shared a BAFTA nomination with Anthony Dod Mantle, DFF, ASC, BSC, for 127 Hours.

“I had worked with [Transformers producer] Lorenzo di Bonaventura, and when he asked me to consider this, I wasn’t sure about doing a full-blown CGI robot movie,” Chediak admits. “But then my agent said Travis was going to direct. I had seen Kubo with my daughter, so that got me interested enough to read the script, which I liked for all its humanity; even with all the big action sequences, the main thrust was the relationship between the girl and the robot.”

Knight wanted Chediak involved early on his first live-action film, and the DP had a generous fourteen weeks of prep. “I wound up looking at locations with Travis and production designer Sean Haworth [Deadpool, Ender’s Game], who was devising looks for robots that worked for the film while also pleasing Hasbro,” Chediak continues. “Initially my idea was using Alexa 65, but I couldn’t find appropriate lenses [to fully cover the sensor]. Then I tested Panavision lenses with Dan Sasaki and came across these Super Speeds from the 1970’s. They were much softer than modern glass, with aberrations and imperfections that gave a beautiful period feel.” (Digital acquisition was via the ALEXA Studio XT.)

Reprising his VFX supervisor role from Transformers: The Last Knight was Jason Smith, with Industrial Light & Magic (ILM). “ILM has so many veterans who have worked on this franchise,” Smith recounts. “But this was less about creating spectacle and destruction, as in past films, and more about enhancing the relationship between Bumblebee and Charlie. If we’re successful, the audience will stop thinking about the visual effects and just relate to the characters.”

In a departure from previous Transformers films, Knight and Chediak wanted a naturalistic approach to lighting. “That meant we would be seeing front-lit Transformers,” Smith describes, “instead of going to town on them with CG versions of that ‘movie light’ feel, where we have a ton of rim lights to bring up visual interest like you would on a car commercial.”

Towards this end, a practical version of Bumblebee was built for use on stage and location. “It had a head, chest and upper arms that we could position, with all the chrome, detailing and weathering,” Smith adds. “There’s an efficiency and beauty you get from having something real in frame, and that is especially true when it comes to the caustic lighting created by his chrome. His eyes could light up, which gave Hailee a really good reference she could emote with. We find that if the eyeline isn’t exactly right, it breaks the connection in a major way.”

While Chediak says moving the mockup around set was difficult (owing to its weight), he found it useful for composition and illumination. “[The mock-up] let me establish how I wanted Bee lit, so ILM had a solid reference to match,” Chediak says. “Sometimes we only used part of the torso, but that still gave them enough to be able to replicate in CG. The robot’s yellow coloration would sometimes let me capture a bit of color reflecting on the faces of the actors, so those little touches helped, as did the practical blue eyes on the robot, which we could use with the torso or separately. We still had the old ‘ball on a stick,’ so after doing a reference pass with the torso, a performance pass with that in place would follow to ensure proper eyelines.”

For other scenes in which Bumblebee moved around on set, a unique approach was devised. “We brought in a stilt performer who worked on five- and six-foot yellow-painted extensions,” Smith reveals. “This let us shoot extended dialog scenes of girl and robot in conversation while maintaining that crucial eyeline connection.”

A stilt performer on yellow-painted extensions helped VFX maintain an eyeline connection with Charlie and Bee.

Except for one project that required a month-long digital intermediate, Chediak’s preference has been to avoid using elaborate LUT’s and rely on rec.709. “I like to let the lenses do their magic, then use a bit of color control by altering the temperature in the camera as needed to get our desired effects,” he shares. “There were the occasional fixes in the DI, but working this way made it very straightforward in post.”

Digital Imaging Technician Daniel Hernandez says the image was run through LiveGrade, save for scenes with cars. “Sometimes we did light color correction with CDL’s, but mostly it was rec.709,” Hernandez reports. “We downloaded on set, backing up to RAID drives. The studio required special safety/security measures, so RAID’s would go to them, but we separated the drives from the chassis; even if the drives ever disappeared, the data couldn’t be accessed.”
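The CDL corrections Hernandez describes follow the ASC CDL standard: a per-channel slope, offset, and power, plus a global saturation term computed with Rec. 709 luma weights. As a rough illustration of that math (this is the generic ASC CDL formula, not anything specific to Bumblebee’s LiveGrade setup):

```python
# Minimal sketch of an ASC CDL applied to a single RGB triple.
# Values are nominally 0..1; negatives are clamped before the power step,
# per the standard definition.

def apply_cdl(rgb, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0),
              power=(1.0, 1.0, 1.0), sat=1.0):
    """Apply ASC CDL slope/offset/power per channel, then global saturation."""
    # out = clamp(in * slope + offset, 0) ** power, channel by channel
    out = [max(c * s + o, 0.0) ** p
           for c, s, o, p in zip(rgb, slope, offset, power)]
    # Saturation pivots around luma, using Rec. 709 weights
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return tuple(luma + sat * (c - luma) for c in out)
```

With default parameters the transform is an identity; raising slope brightens linearly, and `sat=0.0` collapses the image to its luma, which is why a handful of CDL numbers is enough to carry a light on-set grade from LiveGrade into dailies and the DI.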

EC3 handled dailies, but with a non-standard approach. “Instead of looking at Pix,” Hernandez adds, “we had postproduction create stills that could be viewed on my color-corrected monitors, ensuring [Chediak] could see whether it matched what we did on set. It’s an easier and faster way to work when you haven’t got any facilities near set, while guaranteeing you get to see the real color.”

Shooting began with three weeks on stage, followed by location work in California. Chediak says he decided to interpret the 1980s rather than slavishly duplicate them with period-correct tools. “I felt using lots of digital lights was the right call because they give you so much more control,” he relates. “On dimmers, you don’t have the tungsten problem with units warming in color when dimmed. And the enormous time spent gelling older lights is time I’d prefer to spend shooting the movie. For fill and general lighting, I relied on LED, but when we were bringing directional light through windows, I went with HMI’s and Maxi-Brutes with very narrow beams.”

The garage in which Charlie and Bumblebee hide out and bond together was mostly handled on stage, though entries and exits utilized a house built by Production on an empty lot near Santa Cruz. “On the stage interiors, I used a lot of fluorescents with a blue hue,” Chediak continues. “I mixed those with tungsten light, thinking that bluish felt kind of spacey while the warmer light seemed to ground things – this was a place that was home to a robot from another world as well as to the girl. I’m very happy with how those scenes look and play.”

Smith took great pains to duplicate Chediak’s lighting scheme and palette for ILM’s CG work. “We used the same digital lighting tools as usual, but with the conscious goal to faithfully match the established lighting reference. Occasionally we saw ways to enhance what was there but always in collaboration with Enrique. If I thought that darkening the top of Bee’s head against a dark ceiling might help in post, I’d discuss that plan with Enrique on set – our collaboration was one that carried through all the way from set through post.”

As with filming many a flesh-and-blood film star, shooting Bumblebee worked best with a particular lens. “We had a dedicated 20-millimeter lens for the robot,” says A-camera/Steadicam operator Bela Trutz. “It was even used on close-ups because that was wide enough to see him at all times. Enrique is not one of those cinematographers who are afraid of Dutching the camera,” he shares. “And it wasn’t usually done just to address height difference between [the principals], but more often to create more dynamic shots.”

For a number of elaborate moving shots that required both practical and digital effects, a Technodolly was deployed. “It is almost like a next-generation approach to motion control,” marvels Trutz, “and was great when a lot was taking place as a robot moved through the scene. Instead of putting in numbers and exact positions, I could just take the camera through its paces, moving up and down on the dolly, for whatever move needed to follow the actors or get to a particular part of the set. If that shot was approved, then the very same move could be played back on subsequent passes, repeating precisely as we captured other elements needing to be shot separately – sometimes seven or eight for a single shot.”

Trutz says the workflow required enormous coordination: “The visual-effects guys would be cueing the special-effects guys for separate physical events that had to happen at specific moments each time we repeated the move. It was quite the dance to witness, especially when you might have a car present during the first half of the original camera pass, but then be shooting a giant robot during the second part!”

Chediak says he relied on the Scorpio crane with Oculus head to work at great speed and with enormous freedom of movement. “The Oculus was the only tool that could work on a regular dolly without track while still letting us go wherever we wanted,” the DP states. “In the forest, we could bring the Scorpio on a four-wheel-drive cart that acts as a dolly; again, you don’t need to lay track. And with the robot’s face thirteen feet from the ground, it also let us move up or down from there to the girl’s face quickly and smoothly.”

Production moved north to Mare Island in Vallejo for a sequence involving both multiple explosions and multiple cameras. “This had been a World War Two submarine pen, and there was action taking place down in a pit and on a level above,” Chediak describes. “In order to make our day and week, it was necessary to put the second unit down below with a Libra and crane while we remained above, with each unit running two cameras.”

Hernandez says the team relied on long-range wireless systems to tie in with second unit’s cameras while on the island, “plus there was a special team for the aerial unit, so we could receive signals from the helicopter being flown overhead.”

The other key development for the Mare Island shoot was a huge custom light source built from LED’s in a construction frame. “We didn’t want that classic backlight night look, and instead wanted a source that gave a more organic look that extended across this huge expanse,” Chediak continues. “With this rig hung from a crane, it completely illuminated this space, and then we augmented by using specific lights to emphasize key architectural parts of the frame. We could change light levels with dimmers while still being able to expose for both the top level and those second unit guys working down below. It took a week just to transport the crane to location and days of pre-lighting to get it ready, but that light saved a tremendous amount of time on this very expensive location.”

For speed and movement, Chediak used a Scorpio crane with Oculus head / Photo by Jaimie Trueblood, SMPSP

One key sequence has Bumblebee, damaged during battle, plunging into the dry dock, with Charlie going in after him. Underwater Director of Photography Pete Zuccarini, who shot the scene at Universal’s Falls Lake, worked out the choreography in advance with a stunt performer.

“It helped determine issues like how wardrobe would affect movement and how much could be done with any given shot,” he explains. “It was impressive that [Hailee Steinfeld] was able to do almost as much as the professional swimmer while delivering what looked to be a very tender moment when interacting with the robot.” Zuccarini captured Charlie as she finds Bumblebee via a sweeping reveal. “I began the shot trailing behind, then overtook her while panning around to face her while descending beneath, winding up looking into her eyes, which would have taken an elaborate rig up on land but was possible for me owing to the buoyancy of water.”

Zuccarini also worked with Second Unit Director of Photography Peter Collister, ASC, to capture wider views of the action, shooting wide-open with the ALEXA MINI. “We created an industrial feel in the lighting from above that put some highlights into the water, but wanted to keep things low-key, since this was a descent into inky blackness that was meant to feel spooky,” he adds. “To get that mood, we used four HMI’s to light different areas of particulate matter in the water, giving Charlie a visible phenomenon to swim against rather than trying to light things up. This creates a greater feeling of depth underwater if you snoot the light off effectively, so the character seems to pass through a level of murk then back into darkness.” The dramatic underwater “oner” owed a lot to Best Boy Rigging Grip Jason Blaise Cunningham, who built a rig that let the camera descend on a track alongside Charlie at speed while keeping the lens right on her face all the way down.

Bumblebee’s bot-on-bot action often utilized previsualization and postvisualization from Proof and ILM, but that was augmented during shooting. “I’d talk with [special-effects supervisor] Scott Fisher to get an idea about how the blasts were going to develop,” Smith relates, “figuring out whether they would be fireballs or smoke hits and how big they needed to be, along with the kinds of debris that would be flying out. Once those conversations were done, we’d line up a couple of other cameras to give us options. Sometimes the alternate angles wound up superseding the previs. Also, for much of the Bumblebee pyro, we went much wider than in past Transformers, where we were usually right in there, close up with the robots, so you don’t see the entire effect.”

Some visual effects were handled by an in-house team of compositors working in camera linear space, but most of the work was done at ILM and its partners, using ILM’s custom pipeline. While ILM finished their shots at 2K, Smith doesn’t rule out another kind of release. “[Deluxe Company 3 Senior Colorist] Stephen Nakamura is doing the color finish, and he has lots of experience with HDR mastering, so it’ll be interesting to see how that works out,” Smith shares. Nakamura is conducting DI efforts using BMD DaVinci Resolve.

Chediak’s takeaway is that Bumblebee will succeed in capturing for audiences the soul of the titular machine. “I was really happy all through the experience of shooting this picture for Travis, and I feel this took the series in a direction that will turn out to be popular and emotionally engaging for audiences. The bond between robot and girl calls back in a good way to some fantasy films of the 1980’s, but we were able to use modern techniques to deliver the machine characters with greater sophistication than could even have been imagined back then.”