Framestore’s VFX team ventures into space with new techniques for lighting, virtual cinema and photo-real CG.

GRAVITY - Every Way is Up

A primary challenge of making ‘Gravity’ was the director Alfonso Cuarón’s intention to tell an action story about human characters that takes place in a weightless environment. He also wanted to avoid any looks or scenes in the movie that resembled science fiction or fantasy, intending from the start to create an illusion that the cameras had literally been taken into space.

After Alfonso first approached Framestore, the visual effects studio in London, about the story and his vision for ‘Gravity’ in 2010, VFX supervisor Tim Webber felt sure that the best way to proceed - one that would give the director enough freedom and the environments enough coherence from beginning to end - was to build and light a completely digital environment. But Tim's first task was to convince him of this idea, because Alfonso had initially wanted to use as many practical sets and effects as possible.

Integrated Production

It was a challenging plan with many details to tackle, especially since the weightless environment and need for extreme realism would limit the VFX team’s options for real world reference. But Tim believed that computer graphics was the only way Alfonso could take his cameras into space - as virtual cameras in a CG space environment. It was also the only way the characters could realistically exist in this environment - in this case as perfectly accurate, CG animated space suits with live action heads composited into the helmets.

Most important, from pre-production through production and into post, Alfonso and the DP Emmanuel Chivo Lubezki would need to develop their ideas and work together continuously with Tim Webber and the team from Framestore. Every decision one of them made affected the others’ work even more than on a typical effects-driven project.

The integrated nature of the production closely linked previs to the art department, modelling and animation to rendering and compositing. The physical and emotional aspects of the story called for extreme realism and subtlety in assets and texturing, and enough control over the lighting and effects to match the director’s and DP’s decisions. In other words, viewers should be touched by the beauty of the view of Earth from the International Space Station and the precariousness of character Ryan’s survival, not concerned about how these things could be achieved on screen.

During preproduction, a huge amount of effort went into previs, which took about nine months to complete. Most aspects of the project were mapped out digitally, including shooting angles, asset design, the lighting, as well as the action. Tim said that because so many shots and sequences would depend on CG animations, everyone was keen that all shots were worked out in advance in detail to know how sets, assets and characters would move and work together. It was essential that the CG-animated portions look completely photo-real to the point of feeling like real life. Keyframing was to be the main technique for both character and camera animation.

Virtual Directing

Alfonso also had a 3D camera to use on screen, inside a virtual set, to compose his shots and plot the action. Due to his preference for very long shots with slow subtle camera moves, his shooting style was an important influence on the decisions Framestore’s team made. It demanded a high level of detail, accuracy and realism in the CG modelling, texturing and consistency of lighting across all shots of any sequence.

Tim also said, “Alfonso made good use of the camera’s capability to float around, rotate and spin in a virtual environment. Characters could roll upside down and the camera could go above, below or around them. In particular, when you have those extended shots, it meant we could keep the camera work very fluid with plenty of opportunities for uncommon camera moves.” Due to the shot length, Tim estimated that the whole 90-minute film only has about 350 shots.

‘Gravity’ was, in fact, the DP Emmanuel Lubezki’s first chance to work with virtual photography and he found the extensive use of CG let them take the long, continuous shots to the extreme. For example, they could move from an objective wide view to an extreme close-up of the lead character Ryan’s face in a single shot, which he felt gave the audience a better understanding of the character’s experience.

CG sequence supervisor Stuart Penn said, “This virtual camera work was done before the shoot at Shepperton Studios as previs on Framestore’s motion capture stage. Alfonso worked out the moves, sometimes quite complex, that he wanted to tell the story and it was up to us to work out how to shoot it, choosing techniques and rigs that could handle those moves and give him as much flexibility as possible. At the same time, it provided the animators with framing reference and was played back on set as a constant reference for the shoot.”

Extreme Light Control

Because Lubezki and Tim both understood during previs how complicated the lighting would be and how much it would affect the photo-real quality of the shots, they worked together to determine digitally exactly how the lights would affect the faces of the characters. They realised they would have to be able to match this effect on set in order to composite the live action and animation perfectly. This way of thinking runs in reverse to the usual CG workflow, where the artists typically try in post to match their work to the live action.

Needing lights that could move fast and change colours in an instant, Lubezki thought of adapting real-world techniques used in LED light effects and projections. He and Tim began testing different lights, for which the main issues were inconsistencies such as flicker and colour shifts. Tim applied his knowledge of CG lighting to develop a physical system - a light box lined with highly controllable lights, large enough to hold a performing actor - that could both function on set and produce results that Framestore’s team could work with in post.

Manex Efrem and the special effects artists constructed the Light Box based on Tim’s and Lubezki’s design specifications, building it 20ft high by 10ft across, on a platform. Their team of operators also had to be able to move, open or close the walls of the box to change its dimensions depending on the scene and the action of the story. On the inside, 196 2ft x 2ft panels lined the walls, fitted with a total of 4,096 LED bulbs that would cast different types and colours of light on the actor as required. The operators could also alter the lights at different speeds.

Tim said the panels and LEDs worked like the pixels on a computer monitor and allowed them to make lighting adjustments in a way that would otherwise be physically impossible. He said, “It enabled us to add interesting, realistic complexity to the lighting, with subtle variations to both colour and texture.”
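Tim’s monitor analogy can be illustrated in miniature: driving a coarse LED wall from an image amounts to averaging the image down to one value per panel. The sketch below is purely illustrative - a minimal mean-pooling example in NumPy with a synthetic gradient standing in for a rendered frame - and not Framestore’s actual pipeline.

```python
import numpy as np

def downsample_to_led_grid(frame, panels_y, panels_x):
    """Average-pool a high-res frame (H, W, 3) down to one RGB value
    per LED panel, the way a coarse LED wall approximates an image."""
    h, w, _ = frame.shape
    ph, pw = h // panels_y, w // panels_x
    # Crop so the frame divides evenly into panel-sized blocks
    frame = frame[:ph * panels_y, :pw * panels_x]
    blocks = frame.reshape(panels_y, ph, panels_x, pw, 3)
    return blocks.mean(axis=(1, 3))

# A synthetic horizontal gradient stands in for a rendered frame
frame = np.linspace(0.0, 1.0, 1024).reshape(1, 1024, 1)
frame = np.repeat(frame, 512, axis=0)
frame = np.repeat(frame, 3, axis=2)

grid = downsample_to_led_grid(frame, 8, 16)
print(grid.shape)  # (8, 16, 3): one colour per hypothetical panel
```

Each output cell is what one panel would display; the real Light Box worked at far finer granularity, with thousands of LEDs per wall.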

Art, Science & Drama of Light

Furthermore, images depicting scenes from the story could be projected onto the walls, such as planet Earth or the starfield, or the International Space Station [ISS] itself. While this was mainly done to help Tim and Lubezki reflect the correct lighting onto the characters, it also let the actors see the view that their characters were meant to see as they performed their roles.

The primary light sources for most sequences are either the sun or moon functioning as hard lights, with the Earth acting as a softer bounce card. CG supervisor Theo Groeneboom said, “Getting the balance right was a tricky combination of science and art to make sure we could operate the lights predictably in a way that the crew and, most important, the DP could understand. First, we measured the response curve of the LEDs via the camera to create a function that let us transfer values from arbitrary computer values to light stops and colour temperatures.

“We also developed ways of compensating for the colour-difference when viewing the LEDs from different angles. Once light positions, animations and intensities were verified and matched to our pre-light, the DP took over, adjusting the lights on the fly shot by shot, interactively adding or moving lights, changing colours and temperature - all still attached to the pre-programmed moves and running in real-time."
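The calibration Groeneboom describes - turning arbitrary computer values into photographic stops - can be sketched in miniature. The example below is hypothetical, assuming a made-up gamma-like response curve in place of real camera measurements: it inverts the measured curve by interpolation to find the drive value for a target exposure.

```python
import numpy as np

# Hypothetical measured response: drive value (0-1) vs. luminance,
# as might be read off the camera when photographing a test panel.
drive = np.linspace(0.0, 1.0, 17)
measured_lum = drive ** 2.2          # stand-in for a real measurement

def drive_for_stops(stops, base_lum=0.18):
    """Return the LED drive value that produces a luminance
    `stops` photographic stops above/below `base_lum`."""
    target = base_lum * (2.0 ** stops)
    target = np.clip(target, measured_lum[0], measured_lum[-1])
    # Invert the measured curve by interpolation
    return float(np.interp(target, measured_lum, drive))

base = drive_for_stops(0.0)   # drive value for the base exposure
up = drive_for_stops(1.0)     # one stop brighter needs a higher drive
print(base, up)
```

With a mapping like this in hand, a DP can ask for “half a stop up” rather than an opaque 0-255 value, which is the kind of common language the quote describes.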

The Light Box was a great innovation for the production, and reinforced the collaboration of the VFX team, the director and the on-set crew including the DP, camera crew and special effects team. However, the tight space inside made the logistics of the actual camera work fairly challenging. The camera had to be small and manoeuvrable enough to capture the shots they wanted but still record accurate and consistent images.

Robot DP

The production used automobile manufacturing robots hired from Bot & Dolly, an industrial automation specialist in California. A custom-built motion-controlled camera head was attached to a robotic arm via a 3-axis remote head, and the crew could use this to position and control the camera inside the light box by computer. By programming their camera moves into the computer, the camera could be manipulated quite precisely for pan, tilt and roll, working at variable speeds.

Alfonso’s virtual camera data was not passed directly to the robots. Instead, Framestore's tech-vis team took the cameras from Alfonso’s previs animation, then restaged them and made them work within the parameters of the robot. Bot & Dolly provided some tools and the specifications of the robot, which the team used to visualise the limits of the robots’ behaviour within Maya.

CG supervisor Chris Lawrence and CG sequence supervisor Stuart Penn explained further, “During the six weeks of pre-shoot and the shoot at Shepperton, we formed a small tech-vis team with Bot & Dolly to program the robotic motion controlled camera and other special effects rigs, sharing tools and passing scenes across a network. We would create technical previs animation in Maya, building flexibility into the system allowing us to make adjustments to the moves on set during the shoot, and give it to them as a package that they would upload to the robot. It was amazingly quick and let us get changes on to set almost instantly.”

Before each shot, the actors, Alfonso and Tim Webber would talk through the previs so they knew what was required. “We would talk about how far they could depart from it - for instance, if their dialogue was running long - and what specific beats we really needed them to hit,” he said. “They were terrific at taking this on board, always hitting a physical mark when they had to, in spite of being strapped into a rig with the lights whirling around them. It really shows in the results.

Rigged for Action

“Then we would rehearse several times at half-speed so they could build up the muscle memory of the key physical positions we needed them to achieve in the shot. When they had it down, we could simply focus on the performance Alfonso wanted. To help make sure our eye-lines were always correct, we would put a red dot on the walls of the Light Box to give them something to follow. Sometimes when Ryan was going to be very close to the ISS or inside it we would render the whole interior and play that back for lighting, also providing Sandra with objects to look at and follow as they moved around her.”

Even though the lights and cameras were moving and altering in a carefully planned and measured way, this method of production still gave the actors scope to act and perform as they needed to for the script and story. The special effects team built various body rigs onto a turntable in the floor section of the Light Box. The rigs were then used to turn and hoist the actors as they worked through their scenes with the director.

Unlike most films, in which rigs need to support actors as their characters fly, fall or leap, the main criterion for rigging in ‘Gravity’ was that it support the appearance of weightlessness that was a major factor in the story. Regular wire rigs and harnesses would not have been successful because the actor’s body usually appears to be hanging from them, instead of floating. Nevertheless, the production did use some very unconventional wire rigging, created by special effects supervisor Neil Corbould. This was a 12-wire system, operated by an overhead pulley system attached to a very thin, light harness the actress could wear under her outfit as she floats about the passages of the ISS.

Space Suit Dynamics

The description above outlines what the actors were doing on set. But in fact a large portion of the character action is portrayed through computer animations. Whenever we see them in their space suits, we are seeing either CG space suits with live action heads composited into the helmets, or completely CG characters. The production decided on this method because of the bulky restrictive design of the NASA space suits.

“The suits were entirely 3D CG in all shots except for one interior sequence inside the Soyuz capsule. For the NASA EVA suits, we used photo references of the real-world suits used on the space shuttle missions,” said Ben Lambert, digital modelling supervisor. “We aimed to follow every detail precisely, from the stitching down to the fabric. We simulated the cloth surface to give subtle details and add realism that would tie it to the plate photography, literally modelling the weave of the cloth in our shading to ensure that the light reflected correctly, again, adding realism and connecting us back to the photography.

“A physical costume was created of Matt Kowalski's EVA suit, used to show us the range of motion reference, that is, how mobile an actor could be when wearing it. Physical helmets that would be worn on set for tracking were designed and created as well. To make sure our CG helmets matched these absolutely precisely, we used photogrammetry to capture them as 3D scans.”

When animating the suits, the animators had to take the drama and story into consideration as well as realism. “We certainly tried to respect the limitations of a real space suit. We studied a lot of NASA reference and used our own performance reference wearing the art department’s space suits but, particularly in the action moments, the motion needed to be very dynamic and violent and we had to push the animation beyond reality,” animation supervisor Max Solomon said.

Tracking the Helmets

With the modelling and animation in place, compositing together the live action performances and faces with the CG bodies - creating what the audience sees - was a critical step in the production. When live action plates arrived with the performances recorded, the workflow began with object tracks of the helmets to give the artists starting points for rotoscoping, and they would work manually from there.

Compositing supervisor Mark Bakowski said, “These rotos would be passed back to animation to help dial in the CG performance to match the plate. We'd discuss the pros and cons of how far to push any plate in terms of animation requirements, against how much stress the plate could stand up to. Both sides would have to give some. On completion of animation, the compositors would use a projection set-up to place the head within the helmet. Any rough edges or misalignments would be hidden by choosing selective split points, warping plates or patching and stabilizing as required.”

The tracking department would accurately capture the helmet's movement in 3D space using motion capture witness cameras to record footage from several different angles, producing motion data that contained the nuances of performance that may have been missing or misinterpreted if viewed only from the main camera. After that, a layout process combined the recorded motion with the previs as a basis for animation. Tracking was accomplished in 3D Equaliser and compositing was all carried out in Nuke.
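The witness-camera approach rests on a standard technique: a point seen in two or more calibrated views can be triangulated back to a 3D position. This minimal sketch uses textbook linear (DLT) triangulation with toy cameras, not any of the production’s actual tracking data:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: 2D image points."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the null vector of A
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point through a 3x4 camera matrix."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one translated along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, 0.25, 4.0])

recovered = triangulate(P1, P2, project(P1, point), project(P2, point))
print(recovered)  # ≈ [0.5, 0.25, 4.0]
```

Production systems like 3D Equaliser solve the same geometry with many cameras, lens distortion models and noise handling, but the core idea is the one above.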

Connected Composite

“We had complete control to re-position the actors relative to each other and move them around in space, constrained of course by the camera view we had shot of the face. The animation was done frame by frame and was very detailed. It had to feel like it connected completely to the actor’s facial performance, not something separate,” said CG supervisor Chris Lawrence.

All the shading of the characters and the environments they moved in was physically plausible to make sure that the way light bounced from one object to another was accurate and felt realistic. In compositing it all together they added the visors to the helmets, which had carefully timed condensation on them to coincide with story points about Ryan’s diminishing air supply and reveal her tension.

Mark Bakowski said, “We spent a lot of time building templates for all components of the composites - the visors, the lens FX, depth of field, everything. We updated these regularly and continuously talked about them so that in the end, they were largely logical and easy to dial in. Also, Tim Webber knew how our templates worked, so we could speak a common feedback language that, I think, was what made these composites possible despite the complexity of the workflows and the shot length.

“When the crunch was on for any specific sequence, we threw a lot of people at it but always under the control of a main compositor who knew it best and ran it. For example, one team ran the Earth in the background and another dimensionalised the faces, but it was all pulled together at the end by one person.”

Lighting Up with Arnold

During the production the team developed their own in-house physically plausible shading system, using the Arnold renderer for the first time. They made this decision mainly because the old system could not cope with the amount of data needed to sustain the level of detail necessary in their assets, and could not consistently match the lighting interaction visible in reality. Arnold is a purely brute-force ray tracer, allowing the use of huge amounts of data to calculate multiple surface light interactions consistently.

“Arnold had never been used in production at Framestore before,” said lighting supervisor Paul Beilby, “but we realised that in order to produce Gravity as required we would have to completely rebuild our pipeline and rendering method. The TDs and shader writers would have to learn a completely new rendering approach during production - a huge step into the unknown and a hell of a ride, but it paid off in the end.”

Adopting Arnold resulted in detail and light interactions that were more accurate than any they had used before. Lighting was more intuitive, and their approach to solving lighting challenges had a direct comparison to solutions used on set rather than the previous more abstract technical solutions. For example, if Lubezki added a bounce card on set, Framestore added a bounce card in their scene, producing a direct match. This allowed a common dialogue between the technical directors, VFX supervisors, the director and the DP. Furthermore, once look development was complete, the artists were confident that the asset would look photoreal in all lighting situations from all directions without per shot alterations.

However, because rendering with this level of detail and lighting accuracy is computationally so demanding, they had to put considerably more effort into render optimization and quality controlling the lighting to avoid wasted shot iterations. In fact, most shots on Gravity would only be rendered once at full production settings.

Floating Ryan

The lighting from the CG projections in the Light Box and the live action proxy ISS sets only took the lighting process so far. Because the environments shot around Sandra were simple versions of what would be visible in the final film, simply replacing them would have meant that she wouldn't sit believably in the new, fully populated environments. The compositors solved the problem in post with a two-step process that was very useful for the complex sequence following Ryan as she enters the damaged ISS, removes her suit and travels down along the corridor.

First, they created full body tracks of Ryan as she moves through the Space Station and used these to render a complete CG Ryan into the interior environments, which now had many more props with more detailed textures than the Light Box. The rendered CG Ryan was viewed alongside the original plates as lighting reference and the lighting within the new environment was adjusted until the CG Ryan was lit exactly as the live action Ryan was. The CG character could then be replaced with the real one, which now felt much more closely linked to the environment.

“Because the shots were so long and Ryan moves through many different lighting scenarios, this was a lengthy process. The close body track of Sandra's movements also allowed us to cast shadows from her CG double onto the environment as she passed through the CG sets and also reflect it on shiny surfaces,” said compositing supervisor Anthony Smith. “The technique worked particularly well for the moments where she directly interacts with a part of the environment, such as holding a handle or pushing off a wall.

Plate Joins

“In the second step, the compositors used mattes generated from 3D position renders - which define the 3D location of any point on the visible surfaces of a rendered image - to selectively colour correct parts of the interiors and enhance the feel of the environments once lighting was complete. We also selectively and very subtly graded the plate elements to further help them feel part of the environment. For example, when Ryan moves her arm in front of her face in the airlock, we cast a shadow onto her face because she had not made this movement on set.”
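Grading through a position matte of this kind is a common compositing technique and can be sketched simply. The example below is a hypothetical stand-in for the Nuke setup described: it builds a soft spherical matte from a world-position pass and applies a colour gain only inside it.

```python
import numpy as np

def position_matte(pos_aov, centre, radius):
    """Soft spherical matte from a world-position AOV (H, W, 3):
    1.0 at `centre`, falling off linearly to 0.0 at `radius`."""
    dist = np.linalg.norm(pos_aov - centre, axis=-1)
    return np.clip(1.0 - dist / radius, 0.0, 1.0)

def grade_through_matte(rgb, matte, gain):
    """Apply a per-channel colour gain only where the matte is set."""
    return rgb * (1.0 + (gain - 1.0) * matte[..., None])

# Toy 4x4 position pass: a flat plane spanning x/y at z = 0
xs, ys = np.meshgrid(np.arange(4.0), np.arange(4.0))
pos = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)

matte = position_matte(pos, centre=np.array([0.0, 0.0, 0.0]), radius=3.0)
rgb = np.ones((4, 4, 3)) * 0.5
# Warm up the region near the centre point only
warmed = grade_through_matte(rgb, matte, gain=np.array([1.2, 1.0, 0.8]))
```

Because the matte is driven by 3D world positions rather than 2D screen space, the correction stays locked to the set as the camera drifts through the long shots.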

All of the live action footage inside the CG ISS had many hidden plate joins, from a simple action such as Ryan removing her helmet for example, to a complex move such as pushing off the ISS wall with her foot and travelling down the tunnel. The joins could be achieved by warping sections of plate, morphing between plates, or even removing and completely reanimating sections of her body in the composite.

Anthony said, “In the airlock we heavily accentuated the breath elements on the visor - actually my own breaths blown onto a cold pane of glass, and captured on a DSLR camera in the Framestore capture studio. In the composite, these elements were timed to match Sandra's breathing, mapped onto the geometry of the visor and used to heavily diffuse the light that passes through them. During this part of the shot, we graded Ryan's face to look quite pallid, animating the grade away once the helmet is removed.

Warp and Flex

“Once the suit is removed, extensive work was done on top of the existing animation of Ryan's float to make sure that her centre of gravity appears to shift as her pose changes, while maintaining the feeling that between each interaction with the airlock, her float is completely linear. Some of her body parts were replaced with CG, such as her arms as she pulls off the suit, and her torso was warped to simulate the struggle to remove it.”

Throughout this shot, the overall hue slowly changes, beginning with a cold blue as she enters with no oxygen left in her suit, and ending with a calming warm hue as she curls up into the foetal position and warm light bounces off her skin and around the airlock. Although the grade was added in the composite, using the plate to heavily influence the lighting of the airlock helps the viewer believe that they were shot together.

This shot also includes droplets of water that hit the lens and distort the view beyond, where the camera pulls focus to allow for a join between two plates. The window Ryan looks out of had layers of smudges, fingerprints and subtle scratches added, as well as the duplication of the live action reflection to simulate the various layers of glass. Ryan's body was also warped to give it more flex and loosen up the waist, which was fixed into the 12-wire rig for some set ups.

Fiery Transition

During the fire sequence, many of the labour intensive techniques described above were used - warps, morphs, patching and reanimating Ryan's limbs and her location through the environment. Elements from a fire element shoot were used for the CG walls and to augment explosions; layouts of fire were completed during an initial compositing stage and passed back to lighting to allow the fire textures to light the environment.

“Unlike the airlock and first ISS interior shot, the fire sequence ends by transitioning into a fully live action plate as Ryan enters the module connecting to the Soyuz capsule, but the sense of connection with the CG environment was held for as long as possible by adding a few 2D flames licking through the gap in the door as Ryan closes it,” Anthony said. “The shot was finished with smoke, glows, embers and heat haze from the flames, built using animated distortion planes in 3D space in the composite, rendered and used to distort the final image before the lens effects and grain were applied.

“All of the shots had the plate elements of Ryan converted for stereo - much more involved than the faces in helmets and requiring a lot of roto and clean up work. A convolution filter was applied to every frame to allow areas to bloom and glow realistically, and chromatic lens aberration was added. Having previously removed all of the digital noise from the source footage, we also added film grain to the final composites to give a cohesive look to a sequence that was originally shot on film.”

Light Stage Captures

Collecting the data from the actors that would be needed for animation and compositing involved Light Stage captures of the faces in various static poses to help build complete digital doubles used for action sequences in which Ryan was a fair distance from camera. “We used Mova to record the performance, and we combined it with five digital film cameras to record textures frame by frame. In the end the motion control setup we had worked so well that we had to use this less than expected – but it was useful to get us out of a tight spot or two!” said CG supervisor Chris Lawrence.

However, the 12-wire rig caused substantially more clean up work than a traditional two-wire waist harness would have needed. The wires on the rig were relatively thin but, due to their number, they obscured several areas on the actress at any one time, all requiring clean up. When Sandra was rotating, the artists had to retain the fine ribbed texture of her vest in areas where it was creasing and stretching with her movements and, of course, across her face and hair, where subtle changes caused by the clean up would be particularly noticeable. The holes cut into her clothing allowing the wires to attach to the plate she was lying on, and the ankle straps supporting her legs, brought still more clean up work, along with the quantities of tracking markers positioned over her skin and clothes.

Into Orbit

As a storyteller, Alfonso Cuarón wanted the Space Station to follow a course over the surface of the Earth that would logically show how it would travel and what the characters would see, given their speed, including sunrise and sunset. Fortunately, NASA’s reference material included photos, films and even time lapse sequences of Earth that the astronauts themselves had captured on space voyages. However, because the director also wanted to create an interesting, beautiful view, some continuity challenges arose.

Earth supervisor Kyle McCulloch said, “To plan the orbit, we started with the Hubble, knowing that we then needed to get to the orbit of the ISS. This required a cheat in altitude, as they orbit at different heights, but the decision was made from a storytelling perspective. There were little cheats and big cheats. Little ones included simply changing the angle of the orbit throughout the story to move over a more interesting landmass and make sure we had something beautiful in the background.”

The sunset-sunrise schedule also wasn’t arbitrary - it was planned out with the rhythm of the story to hit the right intervals between setting and rising. The team needed to show the changes in the amount of light that would be reflected from the surfaces at different times and from different spots.
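The intervals involved follow from basic orbital mechanics: at the ISS's altitude, one orbit - and with it one full cycle of sunset and sunrise - takes roughly an hour and a half. A quick check using Kepler's third law:

```python
import math

MU_EARTH = 3.986004418e14   # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371_000.0       # mean Earth radius, m

def orbital_period_minutes(altitude_m):
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_m
    return 2.0 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60.0

# The ISS orbits at roughly 400 km altitude
iss_period = orbital_period_minutes(400_000.0)
# Roughly 92 minutes - about 16 orbits, and 16 sunrises, every day
print(round(iss_period, 1))
```

Each orbit splits into a daylight pass and a night pass of around 45 minutes each, which is the rhythm the story's sunsets and sunrises had to respect.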

“The path takes us around the earth pretty fast,” Stuart Penn noted. “We knew the key spots we wanted to see and where we had to start, so it all started to fall into place. Alfonso wanted to start over Mexico and that brought us over the United States. Then we see the UK as we come over Europe and then it starts to go to night, and it all begins to work. We shifted the angles slightly to get us moving over India, then down over Malaysia and Singapore and then we come up the Chinese coast then over the arctic. In the portions of the film that feel like a break or gap, during the dream sequence for example, as she’s not quite sure how long she’s been asleep, we could change position slightly.”

Space Travel

The interior and exterior design, textures, details and animation of the ISS, Hubble and other spacecraft were all based on the abundant information that NASA has made available to the public. The virtual set design and build was another area on which production worked very closely alongside the VFX team, in this case with the production designer Andy Nicholson and his team. Nearly all of the sets, including important parts of the ISS interior, are computer generated. Andy said that in spite of his experience in working with visual effects supervisors on set extensions and CG backgrounds, the fact that entire sets, including all props and backgrounds, had to be designed, built and rendered to a high level of photo-realism completely changed the nature of the project.

All teams relied on extensive research into NASA photography and technical data. Again, they wanted the result to look as if they had taken the cameras out to the International Space Station, space shuttle and other spacecraft, and were aware that some viewers would have followed the space voyages over time, known what to expect and been able to recognise the details they were trying to recreate.

During previs, they began setting out the environments, getting the director’s approval as they worked. The artists would rely on the facts as far as possible and then adapt the set if necessary to suit the action and story. Nevertheless, because their work formed the basis of the digital build at Framestore, the designers found designing assets that would never be built physically quite difficult.

The sets were populated with hundreds of props, all of which had to be researched, designed and then modelled. These items were compiled and organised into a library of 300 or so props that could be used for set dressing inside the two space stations. Artists could interactively place props and then store them to a call list that would be used at render time. Some props such as cables, tubes and wires had posable rigs, others had multiple look variants that were switched between.

The texturing was even more important than usual to give a realistic finish to all assets and was enhanced with heavily layered detail. An interesting characteristic was wear and tear, and other signs that the Space Station had been lived in by different crews for about 12 years. Textures showing this were applied to the designs, and Framestore’s texture artists developed them further to coordinate with the proximity and moves of the camera.

From All Sides

“One of the major challenges on ‘Gravity’ was the length and range of the shots. The normal procedure of assigning level of detail based on distance from camera could not be applied, as any object that begins as a background object could then become a middle distance asset and then a hero asset over the course of the one shot. Consequently we had to treat almost every object as a hero asset and develop a level of detail that would retain photorealism at any distance, from wide to extreme close-up,” said Paul Beilby, lighting supervisor.

“The weightless mobility of the camera also meant that we could not assume certain areas would not be seen, as you would on an earthbound production, so the assets also had to maintain photorealism and detail from all sides. During shot execution we would swap some off-screen assets with pre-generated image based lighting of the bounce light from these assets, but the shot length limited the number of times we could do this.

“We were only satisfied with an asset when it exactly matched a photo reference, so the optimum level of detail would be equal to the best reference available from NASA. My favourite moment from the look development process was when we finally had to explain which image was the NASA reference and which was computer generated.”
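The convention Beilby says could not be used, picking a model's level of detail from its distance to camera, is simple to state in code. The sketch below is a generic illustration with invented thresholds, not anything from the film's pipeline; it shows why a shot that sweeps from hundreds of metres to centimetres would demand every level for the same object, which is why nearly everything was built to hero quality:

```python
def lod_for_distance(distance_m):
    """Conventional distance-based level-of-detail pick (thresholds invented).
    On 'Gravity', one unbroken shot could traverse all three bands for a
    single object, so this shortcut could not be applied."""
    if distance_m <= 2.0:
        return "hero"        # extreme close-up: full-detail asset
    if distance_m <= 20.0:
        return "mid"         # middle distance: reduced geometry/textures
    return "background"      # far away: cheapest proxy

# One long shot approaching the same object sweeps through every band:
for d in (150.0, 10.0, 0.5):
    print(d, lod_for_distance(d))
# 150.0 background
# 10.0 mid
# 0.5 hero
```

On a conventional production the renderer loads only the band each object sits in; here the camera's freedom meant the cheapest bands could almost never be trusted.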

The space stations undergo heavy damage when they are hit by space debris, but other than the initial damage on some of the stations, the modellers didn’t know exactly how or where the assets would have to be destroyed as they were hit. The final orientation of the ISS in relation to the debris field remained flexible in previs, and the models could be used at any orientation and distance. Therefore, when modelling, they ensured the assets were built from enough separate pieces, in a logical manner, before handing them to the effects department.

Zero-G Blow Up

The unfamiliar physics of weightlessness and the space environment required altering the effects simulation tools for fire, explosions, liquids and so on, to create FX that were both realistic and readable for the audience. For some effects, such as the zero-gravity water, the artists found good online references to match their work to, and their solver worked well.

But when they had to consider and define how zero-G fire or an explosion in space without oxygen might look, the only reference they found was of a match burning in microgravity, which created a blue sphere. “We used that as our main reference for the blobs of fire floating in the ISS, as they are of a similar scale. For extra realism, we added more detail to the simulation and turbulence to make it look more interesting and closer to real fire on Earth,” FX supervisor Alexis Wajsbrot explained.

“But for the main, big fire simulation inside the ISS, we had a three day shoot with the special effects artists to get some more specific reference. We emitted fire directly onto a metallic ceiling so that buoyancy had a minimal impact on the fire, rolling the camera at different orientations. This created a blobby fire crawling along the ceiling. To reproduce it in our fluid solver Naiad, we had to turn off buoyancy, gravity and all kinds of drag, creating a simulation with very low detail that was almost ‘boring’. We had to balance what the solver believes zero-gravity fire should do against what we believe is interesting to show an audience.”

Stable Sims

The fire explosion outside the ISS required many iterations to capture the look: how spherical should it be, how fast should it dissipate, should it create a shockwave? There was a lot of back and forth with Tim and Alfonso to find, again, the right balance between realism and a compelling image.

“The look of the destruction was easier to nail, even if we didn’t have that many references,” Alexis said. “We all had a fair idea - as soon as a piece of debris started spinning or moving it should not slow down, accelerate or stop unless there is an impact with another piece. After setting gravity and all drag and damping effects to zero, our solver actually worked very efficiently and created exactly the kind of effects we were looking for.

“It even made the TDs’ lives easier because the simulations were very stable. No forces interacted with the ISS unless it was being hit. One issue that emerged, though, was that all the debris tended to fly off screen very fast, so the challenge was to always keep something interesting on screen, localising and art directing our impacts quite a lot.”

www.framestore.com