Shark!

Audiences have a love/hate relationship with sharks. Whether in real life or on the screen, they instill fear in our hearts. Yet, it seems that we cannot get enough of these denizens of the deep.

The most recent feature film to bait us is Shark Night 3D. The film focuses on a group of college students who decide to have some fun in the sun at a lake house on a Louisiana island. Soon they discover that the lake has been stocked with hundreds of massive sharks.

Shark Night is not considered a VFX-heavy film—it contains approximately 100 effects shots. However, there is a great deal of water in all the shots, making the work even more complicated. And on top of that, it was all done in stereo.

Six boutique facilities around the globe worked on the effects.

Because of the complex water simulations, a pipeline was designed so that artists from the different vendors could collaborate on the same shots. The longest one is a huge slow-motion splash spanning more than 300 frames; this sim took more than a week of continuous machine crunching on a 96GB computer.

Here, Gregor Lakner, VFX supervisor on the movie, describes the work in a Q&A with CGW chief editor Karen Moltenbrey.

Can you give us a taste of what we will see?
Reliance Media Works, along with a few sub-vendors from around the globe, created the complex water and shark shots for the horror flick. The film, directed by David Ellis and released through Relativity Media, was shot in Shreveport, Louisiana. Shooting out on the water in often-difficult filming conditions, the VFX post team had its work cut out for it, and in stereo!

How was the movie shot?
The movie was shot in stereo using Pace rigs. We also used a Sony F35 as the main camera and a Sony F950 for underwater.

What role did visual effects play in the film?
I led a team of digital artists tasked with designing and animating six different shark species, which, in the story, infest freshwater lakes during winter floods and threaten a group of partying kids on an isolated island. We constantly had to balance the need for utterly realistic-looking sharks against the knowledge that they had to behave in ways, and do things, that real sharks are seldom seen doing.

What kind of challenge did that present?
Organization. The computer-generated mako, bull shark, great white, hammerhead, tiger shark, and cookie-cutters were modeled, textured, and rigged (including a complete muscle system) in such a way that artists from various studios could work on shots simultaneously and exchange shot-related data of any sort whenever needed. A customized version of Tactic (from Southpaw Technology) was the backbone asset management system used to maintain good data organization on the show and within some of the vendors. In terms of a standard software workflow, the show for the most part relied on (Autodesk’s) Maya, (Pixar’s) RenderMan, and (The Foundry’s) Nuke.

Another involved the parallel water pipeline. The above-water shots were a real challenge for the team. Because of all the floating debris, caustics, and often extreme close-ups, the underwater shots needed detailed attention in every step of shot execution, but even they were relatively easy compared to the 20 or so shots in which sharks breach the water surface.

The complexity of these shots required that a separate workflow be created, which spanned across multiple facilities in the US and Europe. Camera match-moving and animation originated in Maya. Approved animation files were imported into Exotic Matter’s Naiad fluid dynamics system, at the time the only commercially available software that proved to be sophisticated and robust enough to handle water simulations of the scale required for the project.

The simulation for the most complex slow-motion shot, in which an 18-foot-long mako shark knocks one of the lead characters off a moving WaveRunner in a spectacular jump, took over a week of continuous machine crunching. The plate of the actor was shot with a Pace rig and two Phantom cameras at 400 fps; the simulation was performed at 200 fps.

Generally, for simulation purposes alone, the production had two 92GB machines available—one at Reliance Media Works in San Francisco, where the overall production hub was, and one at Crater Studios in Serbia. Water was also simulated at Lightstream Animation Studios in Petaluma, California, and at Shadow FX in Serbia. Naiad generated large amounts of data, which, for speed and memory optimization purposes, had to be rendered with Arnold. When Shark Night’s postproduction started, Arnold’s integration with Maya was still in its early stages, so the team decided to use Softimage XSI—which already had a good integration of the Arnold renderer—as the main lighting platform for these shots. The ins and outs of all the different software platforms were connected and supported by a number of in-house plug-ins and scripts, a crucial step in making the pipeline work.

What kinds of effects were done?
There’s close-up creature animation, underwater FX work, and above-water simulations, the last being the most complex of the shots.

What were the major obstacles in terms of the effects?
First, we had to create an efficient water pipeline that could be used among various vendors. Also, the animation pipeline (models, rigs, textures, assets system) had to be created so that artists from different studios were able to work on shots in parallel. To achieve this, we hand-picked a set of talented artists, technicians, and boutique vendors that were able to create the tools (including the pipelines and the custom-built machines) and use them in an efficient way to complete very complex shots on a tight indie budget.

No question, the water simulation was by far the most complex task we had to tackle. Because of stereo, we were not able to use a traditional 2D layering approach: in stereo, each and every droplet can be seen in relationship with all the others. Therefore, we had to develop close-to-complete simulations for each shot and run them as a whole. Such simulations would, in a few cases, run non-stop for over a week, leaving little room for any kind of error. For the longest shot, we were able to run only three repetitions of the entire simulation.

What was so unique about your water simulation? What did you need to achieve with the water?
We had to create a parallel water pipeline: Maya, Naiad, Softimage, Arnold (we were not able to render the large files with RenderMan), and Nuke.

What extra challenge did the stereo throw your way?
Stereo requires perfect alignment of the two images. Only then can one reliably reproduce the live-action cameras in CG and get a correct depth match between CG elements and live-action elements. Not being able to achieve such a perfect match forced us to re-create larger portions of the live-action water in CG.