Eight Cars, Six Cameras, Full Circle: Shooting Mercedes-Benz at 20K for One of the Largest Screens in the World

By Bryant Frazer / March 19, 2018

How The-Artery, Astronauts Guild and VR Playhouse Pulled Off Their Biggest 360 Project to Date, Rolling Red Weapons at 8K

For 360-degree video on a grand scale, the gold standard right now may be the big screen at Atlanta’s Mercedes-Benz Stadium. The massive stadium space is crowned by a multifaceted dome which is ringed on the underside by a first-of-its-kind halo display running 1,075 feet in circumference — a total of 61,900 square feet of screen space.

How do you leverage a display like that? If you’re Mercedes-Benz, you commission a gorgeous, uncompromised promotional piece showcasing some cars. We’re talking about a full, 360-degree environment shot live at Willow Springs International Raceway, 90 minutes north of L.A., with smoke and dust whirling seamlessly across the full image as the cars wheel around on the track — not a digitally enhanced job composited from multiple plates in post. Extra difficulty? To fit the really big screen, the project demands a final resolution of 20,160×1080. No, there are no extra zeroes in that raster: the project required a 20Kx1K deliverable. And director Vico Sharabani, founder and executive creative director at The-Artery, relished the challenge — not least because the project was the first of its kind. “It’s not a 360 video, and it’s not VR,” he says. “It’s experiential — a cylindrical panorama. That’s a totally different ball game.”
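The numbers behind that 20,160×1080 raster check out against the screen’s published dimensions. A quick arithmetic sanity check (the ~58-foot screen height below is derived from the circumference and area figures in this article, not stated directly) shows a pixel pitch of roughly 1.6 pixels per inch in both axes:

```python
# Sanity-check the 20,160x1080 raster against the published screen
# dimensions (1,075 ft circumference, 61,900 sq ft of screen area).
# The ~58 ft height is derived here, not stated in the article.
CIRCUMFERENCE_FT = 1075
AREA_SQFT = 61900
RASTER_W, RASTER_H = 20160, 1080

height_ft = AREA_SQFT / CIRCUMFERENCE_FT            # ~57.6 ft tall
px_per_inch_h = RASTER_W / (CIRCUMFERENCE_FT * 12)  # ~1.56 px/in
px_per_inch_v = RASTER_H / (height_ft * 12)         # ~1.56 px/in

print(f"screen height: {height_ft:.1f} ft")
print(f"pixel pitch: {px_per_inch_h:.2f} x {px_per_inch_v:.2f} px/in")
```

The near-identical horizontal and vertical pitch suggests the deliverable was specified to keep pixels square across the halo.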

The production brought The-Artery together with creative collective The Astronauts Guild, immersive-content production company VR Playhouse, Hollywood 360 gear specialist Radiant Images, and Red Digital Cinema. Well in advance of the actual shoot, the team got its game together, figuring out how to create a piece that would be more than a science project — it had to be a pixel-perfect presentation up to the exacting standards of high-end commercial cinematography and then some. “We were doing something that some people call signage,” Sharabani says. “But I didn’t want it to look like a big sign. I wanted it to be a high-quality film. Bringing that quality to a 20K deliverable was a big challenge.”

Seeing Red … at 8K

The first important decision was which cameras to use. Working in conjunction with The Astronauts Guild and VR Playhouse, Sharabani considered a number of options but eventually elected to go for the big image in a big way: “We decided not to cut corners, and we went with the best quality cameras we could possibly get.” They settled on six Red Weapon cameras with Helium 8K S35 sensors shooting at 60fps, all of them mounted on a Sense 9 360-degree rig from Radiant Images. That rig would deliver the kind of resolution and image quality Sharabani was looking for.

The next task was figuring out the precise optical configuration of the project. Early on, Leo Vezzali, who executive-produced the project at VR Playhouse, began running previs scenarios, simulating the location shoot as well as what the footage would eventually look like when enlarged to the stadium screen and viewed from different angles. “Based on camera orientations and layout from Astronauts Guild, we started testing different lenses and seeing what they would look like in a virtual environment,” Vezzali says. “We had CG models of Mercedes driving in a circle, matching the creative. We put a virtual camera rig in the middle and shot it from that standpoint, and that’s how we started to understand where the problem areas would be as the cars came across stitch lines.”

For Scott Connolly, chief creative technologist at The Astronauts Guild, that kind of careful advance work is a priority on complex jobs. “On really complicated, hyper-resolution projects, we focus on working as a team to make sure every piece of the workflow is handled in prep,” he says. “This job was handled and everything was ready to go several days — several weeks, really — beforehand. It’s all about meticulous planning.”

The previs guided creative decisions in very practical ways, including the precise configuration of the camera rig and the choice of lenses for the shoot. “It starts from really picking the director’s brain and figuring out exactly what he or she wants this to be,” explains cinematographer Evan Pesses, head of advanced imaging at The Astronauts Guild. “Then we go into previs and work it out on a computer, where there are infinite options and opportunities. That scales down into our testing phase, which teaches us what the computer can’t. For example, we may think the cameras can be 37mm away from each other, but in actuality we have to build a custom right-angle cable in order to sync them all and keep them at that distance. So at the computer phase we figure out we’re aiming for X, and then in the testing phase we figure out the real-world properties are Y. Let’s see how that works.”

The previs indicated a need for glass with a wide enough field of view to suit 360-degree cinematography, but with optics precise enough to resolve imagery that would hold up on the giant screen. “Normally in VR, you make a 360 spherical rig and the wider the lens, the closer the crossover can be,” Pesses explains. “But at the same time, the closer the crossover, the less resolution we actually use [on the cameras]. We had to find lenses that had proper, acceptable crossover, but were also optically strong enough that, when we blew this image up to 60 feet tall by 1,100 feet around, it resolved that high, hyper-resolution quality.” Cooke S4/i 14mm lenses fit the bill. Pesses says it was challenging to get hold of six matching 14mm Cooke S4/i primes for the shoot, but 14mm was clearly “the magic number.”
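The crossover trade-off Pesses describes can be roughed out from lens and sensor geometry. The sensor width used below is an assumption (Red’s Helium 8K S35 sensor is approximately 29.9mm wide; the article specifies only the 14mm focal length), but it shows why 14mm works for a six-camera ring:

```python
import math

# Rough horizontal field-of-view check for six 14mm primes on a 360 rig.
# Sensor width is an assumption (Red Helium 8K S35 is ~29.9 mm wide);
# the article only specifies the 14mm Cooke S4/i lenses.
SENSOR_W_MM = 29.9
FOCAL_MM = 14
NUM_CAMERAS = 6

hfov = 2 * math.degrees(math.atan(SENSOR_W_MM / (2 * FOCAL_MM)))  # ~94 deg
spacing = 360 / NUM_CAMERAS                                       # 60 deg
overlap = hfov - spacing                                          # ~34 deg/seam

print(f"horizontal FOV: {hfov:.1f} deg, seam overlap: {overlap:.1f} deg")
```

Roughly 34 degrees of shared coverage at each seam gives the stitcher room to find common features without resorting to ultra-wide glass that would waste sensor resolution — the balance Pesses calls “the magic number.”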

Staging the Scene … and Avoiding the Stitch

With cameras and glass selected, the previs continued as Sharabani and crew worked to understand how to make the most of a unique viewing proposition. For one thing, viewers inside the stadium would not see the full 360-degree expanse of the screen; only a portion of it would be visible from any viewing position in the stands.

“Instead of seeing the complete 360 video, you see just a sliver of it, and that is crucial to understanding how to handle it,” Sharabani says. “Especially when we’re framing cars, there are multiple considerations. If they come too close to the camera, you’ll have stitching problems, and they will be cropped. If they are too far, their presence on screen is diminished. So I treated previs as a precise, scientific exercise. It was very accurate. I went to the stadium to see the vantage points and get a feel for what the audience experience was going to be. It was a very interesting process, both creatively and technically, before we even got to the set.”

For The Astronauts Guild, the awareness of stitch lines was a constant factor. “We wanted to make sure everything was going to stitch together for us,” Connolly says. “The previs was so specific because we had four different set-ups, and we had to be very specific about where the cars were in relation to the stitch lines. The client was Mercedes, so we couldn’t have an optical stitching error that couldn’t be repaired on the car. At that point we’d have to remake the car from scratch — and that’s just not possible. So we were able to figure out with the previs how we could change the configuration of the rig as well as choose the specifics of the lenses.”

As an example, for the last set-up, which was to feature eight cars all driving toward the center of the scene, the six-camera rig wasn’t going to be ideal. Instead, the crew planned to shoot twice, each time using four cameras to shoot four cars in a 180-degree scene. “We lined up all eight cars and shot one direction with four of them coming right down the centers of the lenses, and then we flipped 180 degrees and shot four coming in the opposite direction,” says Connolly. “Cameras 9 and 10 on the axis allowed us to stitch those two 180-degree views together seamlessly.”

Light Tests and Preserving Magic Hour

Prep became less virtual and more physical as the shoot drew closer. Pesses remembers spending time with Sharabani doing light studies with still and VR cameras. “The cars would go around and around, and we’d only use a certain slice of that footage,” Pesses says. “So the key was figuring out that the black car should be against the open sky and the silver car should silhouette against the backlight coming from the mountains for good contrast. It was about making these beautiful, high-end cars look beautiful and high-end in every direction, all the time.”

“For us,” adds Connolly, “the number-one most important thing is that we be as invisible as possible so that Vico gets to rehearse these very complicated moves as much as he can during the bad light — so that when we get to the good light, everyone has rehearsed so we know exactly what we’re doing.”

Sharabani was careful to investigate the best possible approaches to every moment. “There is a particular direction of light that is most complementary for cars, but the beauty light does not extend through 360 degrees, so there were many challenges,” he notes. One step he took was to select a location with a mountain to the west that would hide the sun about an hour before sunset actually took place, extending magic-hour lighting conditions. The planning paid off, as Sharabani actually had time to direct two full sets of shots — one in mid-day and one in magic hour. “It was very interesting to be able to over-deliver in a job that was so ambitious and innovative,” he says.

Smoke thrown up as the cars whipped around the camera rig was captured beautifully in the 360-degree panorama.
The Astronauts Guild

On-Set Workflow, Including 360-Degree Playback

On-set workflow for shots that had six cameras rolling simultaneously at 8K and 60fps relied on a tested combination of speed and expertise. “We start by getting super-fast drives, which is something we always budget for in one of these situations,” Connolly says. “And then we’re working with one of the best DITs in the industry, who’s really used to handling that many mags — remember, each reel is potentially six mags. And then, to be honest, we just had a really high-powered computer system with fast input properties and brought them all in a full reel at a time.”
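Those “super-fast drives” are easy to justify with a back-of-envelope data-rate estimate. The REDCODE compression ratio and raw bit depth below are illustrative assumptions (the article gives no recording settings), but even a conservative ratio puts each camera in the hundreds of megabytes per second:

```python
# Back-of-envelope data rate for six Red Helium cameras at 8K/60fps.
# The 10:1 REDCODE ratio and 16-bit raw depth are assumptions for
# illustration; the article gives no compression settings.
W, H = 8192, 4320          # Helium 8K full-frame raster
BYTES_PER_SAMPLE = 2       # 16-bit raw bayer (one sample per photosite)
FPS = 60
COMPRESSION = 10           # assumed REDCODE ratio
CAMERAS = 6

per_cam_mb_s = W * H * BYTES_PER_SAMPLE * FPS / COMPRESSION / 1e6
total_gb_hr = per_cam_mb_s * CAMERAS * 3600 / 1000

print(f"per camera: {per_cam_mb_s:.0f} MB/s")
print(f"all six cameras, per hour: {total_gb_hr:.0f} GB")
```

Under these assumptions the rig generates on the order of 400 MB/s per camera — multiple terabytes per hour across all six — which is why offload speed and a DIT used to juggling six mags per reel were budgeted for up front.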

Because pre-production was so extensive, there were no disruptive surprises during the shoot. But the crew had to make sure the client was comfortable on the day. That meant figuring out how to enable 360-degree viewing on site. The Red cameras’ ability to record ProRes proxies at the same time as .r3d raw master files made that possible. “We did some on-set viewing,” says Connolly. “We’d run the take and pull all six mags, and the first thing the DIT did was remove the ProRes proxies from the mags and give them to DJ [Derin Turner, VR Playhouse head of production and 360/VR supervisor], who was on set. DJ would do a rough stitch that he would eventually put in a headset to show the clients. They could see footage maybe 30 minutes to an hour after a given take.”

“That was a very nice usage of VR — not as a deliverable but as a confidence monitor,” Sharabani says.

Post Work at 20K: Breaking Us in Two

It required some ingenuity to get the project through post-production, as well. According to Vezzali, the camera-original .r3d files were opened in Redcine-X, converted to sRGB color space (via the appropriate LUTs) and transcoded to OpenEXR sequences at full 8K resolution. RLE compression was used to reduce file size without compromising quality before bringing the footage into Foundry Nuke for stitching.

“We ended up doing all our [stitching] work inside Nuke and Cara at full-resolution,” Vezzali says. “Because of the way the camera solver and stitching works, we couldn’t just limit it to our [final] vertical resolution. We were literally having to work at the full-aperture 8K for the camera solves. Nuke and Cara leverage the full aperture of the sensor to understand where the common features are in 360, so you can’t just crop it out right away, though that would have been brilliant. We ended up doing it in full-res and rendering a cropped, half-res proxy just for preview.”

Even playback was difficult because of the amount of data involved. “We were looking at a lot of it frame by frame, then using RAM-caching to QC 300-frame sections,” Vezzali says. “We would send it to Vico at The-Artery via Dropbox, and they would look at it in the [Autodesk] Flame on their end and give us feedback.” The-Artery’s color-correction was going to take place in Blackmagic DaVinci Resolve, and Vezzali knew working resolution would be limited to 16K because of GPU processor limitations at the time. Accordingly, VR Playhouse split the stitched image into two files, each rendered out of Nuke at a more hardware-friendly 10K.
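The split VR Playhouse describes — halving the 20,160-pixel panorama so each piece fits under a 16K GPU limit — can be sketched as follows. This is a toy illustration only; the actual split was rendered out of Nuke, and any seam-handling details are assumed:

```python
# Toy sketch of splitting a 20,160px-wide stitched frame into two
# halves that each fit under a 16K GPU texture limit, as VR Playhouse
# did out of Nuke. A frame is modeled as a list of pixel rows.
GPU_LIMIT = 16384

def split_frame(frame):
    """Split each row at the midpoint, yielding left and right halves."""
    width = len(frame[0])
    mid = width // 2
    left = [row[:mid] for row in frame]
    right = [row[mid:] for row in frame]
    return left, right

# A few zeroed rows stand in for the full 20,160x1080 frame.
frame = [[0] * 20160 for _ in range(4)]
left, right = split_frame(frame)
assert len(left[0]) == len(right[0]) == 10080 <= GPU_LIMIT
```

At 10,080 pixels each, both halves clear the 16K ceiling comfortably, which matches the article’s “hardware-friendly 10K” renders.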

The-Artery finished the project on a Flame system as usual. “Nuke was a great solution for VR Playhouse to work on the stitches,” Sharabani says. “On the other hand, our ability to work in real time on Flame, viewing on a large screen and seeing oddities or making very precise cleanups, was enormous. You can’t do that in Nuke, but in Flame it’s very natural. And remember, when you look at 360 video nowadays, you’re looking at 4K resolution. This is five times that — and it’s being projected on a screen that is 60 feet high and 1,100 feet long, so you see everything. Things that would be very hard to detect in a normal 360 video are magnified to extremes. So it was very important to finalize the image in the Flame.”

Success Story

Despite the creative, technical and conceptual complexity of the job, the project was a success thanks to clear thinking about both production and post from everyone involved starting at the very front end of pre-production. “When Vico and I talked originally about post-production at the beginning of the process — and we’ve done this on other shows as well, at different scales — it was about consistently trying to find the sweet spot in terms of the balance of tools,” Vezzali says. “Let’s figure out the most efficient and intelligent way we can manage the entire process upstream. What is your creative concept, what are you looking to accomplish, and what is the end deliverable? The rest of it is creating an appropriate pipeline to maximize fidelity and quality all along, at every stage, without compromising.”

Asked if there were any surprises when he finally saw the finished product at full scale, Sharabani says nothing was unexpected. “It was a moment of great pride and satisfaction,” he allows, “but we were so well-educated and the process was so detailed that I wasn’t surprised by anything. It was definitely a ‘wow’ moment, even for us, but it was more like, ‘Everything we worked for came out exactly how we planned it.’ It was at the intersection of creativity and innovation and technology, and I really get a kick out of those projects.”

Finally, as Pesses notes, the quality of the work on screen is the bottom line. “People generally don’t care how tough it is for you to do things,” Pesses says with a chuckle. “It’s either good or it’s not. It has to be good no matter how difficult it was.”