Arrested Development at Shapeshifter

For season 4, Arrested Development shot on the RED Epic. In a typical television workflow, the original camera files would have been converted to an uncompressed format such as DPX. But Dakota Pictures, the production company behind Arrested Development, was looking for a way to improve the efficiency and flexibility of the process. The greatest challenge was related to what I call "the Netflix phenomenon," in which all the episodes are delivered at the same time. That distribution model allows a more flexible creative process, which in turn requires a more flexible post process.

Shapeshifter has a lot of experience creating round-trips between our Resolve and the facility's many Avid systems. (In addition to audio mixing and visual effects, Shapeshifter has 20 Avid editorial systems, including Media Composer, Symphony and DS systems.) This made us a perfect choice for completing the final ten episodes of Arrested Development's fourth season.

In an interesting aside, I actually started at Shapeshifter as a training consultant on Resolve when the company first bought it two years ago. I've been working on Resolve since Version 6, way before Blackmagic Design bought the company, so I have as much experience on it as anyone. Ultimately, Shapeshifter hired me as a colorist.

Most of Arrested Development was done on a Windows-based Resolve, a system built on an HP Z800 workstation with an expansion chassis and three GPUs. We have a nearly identical Mac system as well, but I chose Windows because the workflow was an Avid AAF round-trip and our Symphony systems are also Windows-based.

We had six weeks total on the 10 shows, and we had to deliver the last five shows over six days. It was a pretty intense schedule. By that last week, I was using both systems, one show on the Windows and one on the Mac, and I'd toggle back and forth between the two as needed.

VFX were coming in from several places, including Shapeshifter, Gunslinger and Dakota Pictures. Each episode had about 50 visual effects in it, so VFX were a big part of the show and really dictated the flow of the finish. My assistant Alexander Schwab normally works the night shift, so in many cases he'd color correct any VFX I hadn't seen yet, or apply color to any shots that had changed when I wasn't there. Towards the end, I'd do the primary grading and he'd address any notes, changes or VFX. In many instances, by the time the VFX all rolled in, I'd be involved in another show, and Alexander's work kept me from being interrupted.

Typically, this type of comedy contains about 22 minutes of content for a 30-minute time slot. Because of the Netflix model, these episodes were much longer, between 31 and 36 minutes. For a show of this length, we hope to have two days per episode for color correction, and wherever possible, we gave the Arrested Development episodes two days each. Because I was running two systems, and because each show was in a different state of finishing even during the hectic last days, I got at least 10 hours on each show.

Another challenge was how the editors created handheld effects. The production chose to shoot on the RED Epic at 4K resolution because they knew they'd have the flexibility to zoom in for tighter shots or create the look of a handheld camera. In the round-trip, most of the effects the editors created transferred quite well from the AAF out of the original Avid Media Composer to Resolve, but some required finessing in the Symphony. In those instances, we changed the workflow: we conformed the show in Resolve as soon as it came in, which ended up being the only way we could link conveniently and successfully to all the media. In addition to the RED Epic, the production also used GoPros and several other smaller camera formats, plus TIFFs, which is how the VFX were delivered. Each episode had three to five different formats, and being able to easily relink to the color corrected media was critical.

Our understanding of the AAF round-trip process helped a lot. We do it all the time and have sorted out pretty much every issue that can arise - and there are so many potential pitfalls that you simply have to learn them. When you follow the more typical process of rebuilding the conform in the edit bay, doing a mix-down and sending that for color correction, you skip over the tricky relinking steps. But in this case, rebuilding the conform was what was causing the difficulties in post to begin with.

Once the conform was completed in Resolve, we did a low-res render, and our Symphony editors David Berlinsky and Chris Povall would split-screen that against our reference; any camera move or VFX that didn't work, they would recreate in the Symphony, relinking to the original media. Once created, the shot would go back to Resolve as DNxHD media. On average, three to five shots in each episode needed to be recreated in the Symphony; the rest was done in Resolve. Once color was completed, I would render the media out as DNxHD for the Avid, finishing the round-trip back via AAF. The final assembly of the show was done in our Symphony so we could add titles and make whatever other minor changes were needed. Our editors here are also very comfortable with color correction and minor effects, and there were times when they would fix a few things along the way. There was tweaking being done all the way through the process, but the vast majority of the color correction was done in Resolve. All of the sessions were supervised by Post Production Supervisor Lincoln Sevier and an editor.

Supervision was important for many reasons. Because the production couldn't gather all the actors together for more than a couple of days, each episode focuses on one character, and several characters have more than one episode. So in one episode you may see another character in the background, and then three or four episodes later the action is from that background character's point of view. You'll see the same party or festival, but from another perspective. The first character might even wander by in the background, and you might not catch it. That's why none of these shows can possibly be fully absorbed in one viewing. There's a lot of that going on, and it required a lot of guidance from the producer to make sure I knew what was what.

I was also given the directive that any blow-up larger than 160 percent needed to be sharpened, and they were doing a lot of these; they would extract part of the image to create those handheld effects, for example. But we discovered that, because we were linking to the original 4K RED media, we usually didn't have to sharpen the shots. They looked terrific. The only sharpening we did was when they blew up parts of the frame that weren't really in focus. The important thing is that by working with the original media, we not only made the process more efficient but got quite a bit better quality as well.
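The 160 percent rule above is simple arithmetic on the crop: the blow-up percentage is the delivery width divided by the width of the region being extracted. A minimal sketch of that check, in Python - the function names and values are mine for illustration, not part of Resolve or any production tool:

```python
# Hypothetical sketch of the "sharpen any blow-up over 160%" directive.
# A blow-up happens when a cropped region is scaled up to the delivery size.

SHARPEN_THRESHOLD = 160.0  # percent, per the directive in the article


def blowup_percent(delivery_width: int, crop_width: int) -> float:
    """Scale factor, as a percentage, when a crop is resized to delivery width."""
    return delivery_width / crop_width * 100.0


def needs_sharpening(delivery_width: int, crop_width: int) -> bool:
    """True when the blow-up exceeds the 160% threshold."""
    return blowup_percent(delivery_width, crop_width) > SHARPEN_THRESHOLD


# A full-width 4K RED frame (4096 px) scaled down to a 1920 px HD delivery
# is under 100%, i.e. no blow-up at all -- which is why linking to the
# original 4K media made sharpening largely unnecessary.
print(blowup_percent(1920, 4096))    # 46.875
# A tight 1000 px extraction scaled up to 1920 px is 192% -> sharpen.
print(needs_sharpening(1920, 1000))  # True
```

This also shows why the original-media workflow paid off: cuts that would have been large blow-ups from an HD intermediate are modest (or no) blow-ups from the 4K source.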

I was also told that for any location that shows up in multiple episodes - and there are many - I had to make sure the look matched the previous or following episodes. To make it more complicated, I didn't do the first five episodes, which means I was matching somebody else's color without knowing which tricks or tools they had used. As a result, I was constantly referring back to finished episodes throughout the entire color correction process. By the end, I had ten color corrected episodes as references that I was constantly going back to, to make sure that if viewers watch the 15-episode season in a different order - which will happen - the continuity of color is still there. I often needed to manipulate the metadata, such as changing the ISO and other settings in the RED files, to get the latitude for the color matching.

It was so much fun to see how the characters have aged over seven years, and how the writers handled that seven-year gap. They were very conscious of it, and it's incredible to see how well they managed it. They also included a great number of flashbacks, which are integrated into the episodes and watermarked as a gag, as if the footage was ripped off from a real show and isn't fully licensed. This season of Arrested Development is full of little things like that for the viewer to watch for. There are dozens, if not hundreds, of hidden gags and spoofs. Still, every day, I'm finding more.

Thanks also to Debra Kaufman for coordination and additional editing on this piece. Follow her on Twitter @MobilizedDebra