Friday, September 27, 2013

Julian Smirke, First Assistant Editor on Star Trek Into Darkness, was the guest at last night's Boston Creative Pro User Group. He talked about being a first assistant editor, the process of editing a movie, and how scenes are developed. He also showed example clips from the movie.
The following are my notes from the meeting:

Julian opens with “I'm a first assistant editor, most of you are probably wondering what that means.” As an assistant editor, his task was primarily to assist one of the two film editors, Maryann Brandon. “It's a constant learning experience learning how to shape a performance, shape a story.”

He shows a picture of his workspace at Bad Robot; he likes to work standing up at an elevated desk, though he also has a high chair to sit on for breaks.

They worked on Avid Media Composer - he said most large Hollywood movies edit on Avid. “There was a period of time five or six years ago when there was a push to Final Cut, and they both have pluses and minuses, but Avid is such an awesome system.” With so many new features being added, he said "I’m still learning it.” [Note: Avid was a sponsor of the event]

They were running 12 to 15 Avid systems at Bad Robot with 86 terabytes of storage. The movie was encoded in DNxHD 115 and the completed project was about 150+ GB. When they started, Media Composer was at version 5.5.3, but they upgraded to version 6 during the film. He noted that in the past editors have been reluctant to upgrade during a movie: “In the past you wouldn't usually do that on a big feature, you stuck with what you've got. As technologies have gotten better and better, we've gotten more confident.”
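A quick sanity check on those numbers (my own back-of-the-envelope math, not anything Julian showed; the runtime and the nominal bitrate are my assumptions): DNxHD 115 runs at roughly 115 Mbit/s, so the picture for a roughly 132-minute cut lands right around the figure he quoted.

```python
# Rough storage estimate for a feature edited in DNxHD 115.
# Assumptions: ~132-minute runtime, video-only bitrate of 115 Mbit/s.
BITRATE_MBPS = 115            # DNxHD 115: nominal ~115 megabits per second
RUNTIME_MIN = 132             # approximate theatrical runtime (assumed)

bytes_total = BITRATE_MBPS * 1_000_000 / 8 * RUNTIME_MIN * 60
gigabytes = bytes_total / 1_000_000_000
print(f"~{gigabytes:.0f} GB for the picture alone")  # ~114 GB
```

Add dailies, alternate takes, audio, and renders on top of that and a "150+ GB" project is easy to believe.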

The movie was shot in several formats, and he said that Avid handles that with no problem. Formats included 35mm CinemaScope film, 65mm 15-perf (IMAX), 65mm 8-perf, and VistaVision, which was used for an IMAX extraction. An ARRI Alexa was also used, along with a lot of RED camera footage, mostly for video of people on screens. A Sony F65 was used for some VFX plates. That said, most of the movie was shot on film, meaning that dailies were telecine’d and then sent over to the editors. He said that the processing house synced it as best they could, but the editors still needed to make sure there were no problems with the sync. He never trusts that the timecode sync has worked, because of the conversions the footage has gone through, so he always uses the slate.

Within Avid they kept the dailies separated by camera (i.e., by format), ultimately organizing the film into reels that match the reels of the final film.

He showed some 65mm footage, which is close to a 4:3 aspect ratio, and he explained how they have to decide how it is going to be framed in CinemaScope, in some cases dynamically moving the picture to fit the frame. You could see the footage move up and down in the frame as the framing was changed.
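To put a rough number on that framing decision (my own arithmetic, using commonly cited aspect ratios rather than anything from the talk): 65mm 15-perf IMAX is about 1.43:1, so extracting a 2.39:1 CinemaScope frame at full width keeps only about 60% of the image height, which is why the vertical reframing matters so much.

```python
# Fraction of a 1.43:1 IMAX frame that survives a 2.39:1 CinemaScope
# extraction when the full width is kept and only top/bottom are cropped.
IMAX_AR = 1.43    # 65mm 15-perf (IMAX) aspect ratio, approx.
SCOPE_AR = 2.39   # CinemaScope projection aspect ratio

height_kept = IMAX_AR / SCOPE_AR   # scope height / IMAX height at equal width
print(f"{height_kept:.0%} of the frame height is kept")  # 60%
```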

Julian then played back a sequence that was a mix of IMAX and CinemaScope footage. Some filled the entire screen, and some was letter-boxed. He said that if you had seen it in an IMAX theater you would have seen the same change in aspect ratios. “J.J. had the idea that all the outside stuff was IMAX and the inside was CinemaScope, and that's how it was projected in IMAX.” He added that the best way to see an IMAX movie shot on IMAX is to see it in an IMAX theater projected on film.

There was some discussion of film vs. digital and what digital resolution you need to match the resolution of film. Julian started work editing film, and still has a love for it. He didn’t really get technical, or have a real answer as to what sensor resolution is needed to match film, but he also pointed out that most movies are projected at 2K at the moment.

Perhaps the most interesting part of the discussion was when Julian showed how one scene evolved in the editing process: an attack on the Federation leadership meeting in a conference room. He explained that one task of the assistant editor is to put in temp sounds and audio to make a temp mix “to help it play better.” For the first time, they were able to do a 5.1 mix as they were working, which had the added benefit that the audio they produced carried through to the final mixes. Though a lot of it was ultimately replaced or enhanced by the sound editors, he said that parts he’d done actually made it through to the final movie.

He showed the original edit of the scene, with temp audio, some previz, and extra beats, including Kirk helping a woman who was injured and seeing Pike killed. He talked about coming up with the sounds, such as the jump ship hover sound, using a variety of other sounds mixed together. He said he has a lot of old sound libraries; “none are that great, but when you put them together you can get some quite good stuff.”

He showed a later edit, and then said that when they showed it to J.J., he thought the sound was too muddy. “I had this idea that when Kirk runs to the side, the windows are intact; let's make everything subdued and muffled. Maybe there's a way we can play off that?” He then showed a later edit, where the editor decided to remove some of the extra parts in the scene, making it tighter and better.

They spent close to a year on post production on the film.

Questions from the audience:

What traits are needed to be an exceptional first assistant editor? "One thing that's really important is attitude; a friendly personality. An ability to adapt and change to whatever is thrown at you, especially on a big movie, because a lot of things can be thrown at you. You have to know enough technical stuff to be able to do the job."

What jobs helped him break into the industry? "Nothing!" He PA'd in production first and didn't like it; then he started as an editing PA. He never went to film school; he learned everything on the job. “Sitting in a class is just not me,” he said, adding “and I know I'm talking to a lot of students here, so I'm sorry.”

He added that he spent a lot of time unemployed, trying to get into feature films. He said he didn't want to work on TV.

When the color grade comes back, do you import it back into Avid? Yes. One thing he does is export a QuickTime of the original edit and then play it back at 50% opacity over the movie after the corrected files come back, just to make sure nothing's been messed up. Doing this, he caught a one-frame mistake in reel 6: a VFX shot was missing one frame in the middle of the shot.
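Julian's check is done by eye in Avid, but the underlying idea, comparing the approved cut against the conformed media frame by frame, is easy to sketch in code. This sketch is purely illustrative (the per-frame checksums are made up, and this is not how Avid works): it finds the first frame where two renders diverge, which is exactly how a single missing frame in the middle of a shot shows up.

```python
def first_divergence(reference, conformed):
    """Index of the first frame whose checksum differs between two
    renders, or None if they match frame for frame."""
    for i, (a, b) in enumerate(zip(reference, conformed)):
        if a != b:
            return i
    if len(reference) != len(conformed):
        # One render ran out of frames early: a dropped tail.
        return min(len(reference), len(conformed))
    return None

# Hypothetical per-frame checksums: the conformed reel lost frame "f3",
# so every frame after it shifts one position earlier.
reference = ["f0", "f1", "f2", "f3", "f4", "f5"]
conformed = ["f0", "f1", "f2", "f4", "f5"]
print(first_divergence(reference, conformed))  # 3
```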

Julian said he'd like to become an editor, but added “Being an assistant editor, it's a really exciting and challenging job and I couldn't love it more.”

Thursday, September 26, 2013

Taking The Movies Out of The Movies | Prolost
I wasn't a fan of The Hobbit, but I thought that was because I felt like I'd already seen the movie. Evidently, others didn't like it because of the high frame rates?

So apparently, what we’d be “fools if we didn’t learn from The Hobbit” is that we can charge more money for stuff people don’t actually like.

Having a clear plan. Build a crowd of followers who might like a product when it’s launched. Build a network of other professionals in your city. Have an online presence with the goal of ‘industry recognition’ and ‘brand awareness’. All of these are valid reasons to use Twitter. Without a targeted goal you are aimless.

“The latitude test was shot on a backlit-latitude chart. I had 13 shots of detail. We took that EOS C300 camera data back, processed it at the studio and were amazed by how good it looked on their big projection screen.”

Of the 702 occupation types studied, however, photographers (and, fortunately for us, writers) seemed to be dodging this bullet. With #1 being the least probable to be replaced and #702 the most probable, photographers rank at #91.

In post, I imported the clips into my MacBook Pro and opened them up in Adobe Camera Raw. What I saw was, finally, what I shot—it was dense. I could manipulate the image like I was in a dark room mixing chemicals—but now I was mixing temperature balance, tint, contrast, exposure, and film curves.

New features in Premiere Elements 12 include a guided edit system that helps walk the editor through a project, highlighting the features and functions of the application. There is no more having to sit through hours of footage to find the one usable shot.

In the civil group, those who initially did or did not support the technology — whom we identified with preliminary survey questions — continued to feel the same way after reading the comments. Those exposed to rude comments, however, ended up with a much more polarized understanding of the risks connected with the technology. Simply including an ad hominem attack in a reader comment was enough to make study participants think the downside of the reported technology was greater than they'd previously thought.

This all sounds like great news for the consumer and, to the outsider, it might seem that subsidised and protected film production in Europe is in rude health. Yet there’s a persistent paradox that will not go away; let’s call it the illusion of choice. While all of the experimentation in new platforms and release windows has unquestionably increased the volume of films available, volume and choice are not the same things.

A 2010 study by the University of California found that human sensitivity to non-linear alarm sounds, such as ones made by groundhogs to warn about predators, is being employed by film composers to unsettle and unnerve. In films like Hitchcock's 1960 classic Psycho, straining strings and overblowing brass are mimicking the noise of panic in nature.

How'd they do that? The deal's over, but SonyAlphaRumors reported a dealer in Europe was offering the Sony NEX-VG900 for $1,545, which is half what it's being sold for by most dealers in the US. I'd have bought one for that price!

On its own, FCPX cannot edit and output a 3D master. However, coupled with Resolve in the middle to do the 3D heavy lifting, it performs rather well. Here's how Matt Brading did exactly that on his latest short film.

The Z-Drive is a unique direct drive universal follow focus that can be used in an infinite number of positions. It can be used on the operator side in a frontal downward position for shoulder mounted use. It can be used in an angled backward position for use with a whip if you are standing behind the tripod or dolly. It can be used at any angle up, down or in between on the operator or assistant side.

We recommend using the Z-Drive with our Tornado grip when working handheld. The Tornado grip connects to the Z-Drive follow focus via a standard whip port. The unique 60 degree curved shape of the Z-Drive and the comfortable horn shape of the Tornado grip make a mechanical follow focus handgrip perfect for single operator shoulder mounted work.

We designed the EasyGimbal for the GoPro Hero 3 because this camera offers a lot of performance in a very small form. For maximum stability, we decided to go with a 3-axis design over a simpler 2-axis design. Unlike 2-axis designs, having all 3 axes stabilized allows for optimal results even when walking around with the gimbal.

Upcoming Events

Join us for this month's Pub Night on Wednesday, 9/25/13, from 6-8pm, with pizza and beer! This event is free. See parking details at bottom.

Shooting Abroad with Award-Winning Cinematographer Zach Zamboni
Zach Zamboni returns to Pub Night to share his vast experiences filming in literally every corner of the world. Zach will talk about his most recent travels with feedback on what it's like shooting with the new Sony F5 and S-Log on the fly, in documentary situations. He'll cover his adventures from both a technical and non-technical perspective, and he is looking forward to questions on all aspects of travel and cinematography!

First Assistant Editor Julian Smirke visits us from Los Angeles to discuss working on STAR TREK INTO DARKNESS. Julian will screen footage and share the importance of his role in the editorial and storytelling process.

Tuesday, September 24, 2013

For professional use, the fact you can't see how much recording time you have left, or even format a card, is a completely ludicrous omission and causes issues. Issues that take more time, more work, more stress, more stuff you shouldn't have to deal with in a camera. Blackmagic need to sort this out; it can't be difficult.

My computer is a 2011 MacBook Pro. It was the first to have Thunderbolt. 4 GB of RAM, 2.0 GHz quad-core. It easily runs Avid Media Composer and Motion, and has just enough GPU to run Resolve. When I say “just enough,” the software plays the 2.5K footage at the blazing speed of 1 fps. Not exactly an editing powerhouse, but it gets the job done... it just takes a bit longer.

In the first video, London-based filmmaker and director James Tonkin shows the audience Blackmagic Design's DaVinci Resolve 10. He focuses on Blackmagic Pocket and Cinema Camera footage within Resolve.

As is usually the case, I had to turn over the User Manual (also in beta) to production much sooner than the software was actually finished in order to create the final layout. Because the DaVinci engineering team is so ambitious, this means a few new features were slipped in at the last minute that either aren’t in the manual, or are easy to miss. Here, then, is a short list of some great new features you should know about, lest they slip beneath your radar.

When I first became a cinematographer, there were two companies in the gel manufacturing business, Rosco and Lee. A few years later, a new kid on the block emerged, GAM Filters. They had been big on party colors, aka theatrical colors, and jumped into the color correction gel business as well. Back in the day, CTO was the only way to warm your lights up in increments. You had: 1/8, 1/4, 1/2 and Full CTO.

Color Me Mini | Coat of Arms
A short promotional piece about color grading a commercial. Note that they are using Magic Bullet Looks!

Lacocque also softened some edges and added a minor vignette to help focus the viewer. Ranged saturation, HSL, curves, pop, and contrast round out the overall look. And finally, whenever pushing a color look, Lacocque uses "auto shoulder" or another broadcast safe filter to ensure there aren't too many highlights that could cause issues when viewed.

In other words, should the shooter trim out all head / tails / extra roll material so the editor can just get the meat and get going quicker. Now for those of us who have been around it’s obvious we get EVERYTHING. But in this age where the tools are completely accessible to all, not everyone goes through traditional editorial training, thus very simple questions like this are not encountered.

To start our test, we set up the D16 on a tripod and lit a color chart with a daylight balanced Kino Mini. First we dialed in the 100 ISO settings using reference images of similar charts from other cameras so we could understand where other cameras like the Alexa place their black, white, and middle grey values. 100 ISO is by far the easiest because it uses the least amount of gain; in our case it used zero gain.

My favourite app for audio recording on iOS has just had a major overhaul. Rode Rec version 2.8 has an all new interface for iPad which puts all the essential controls on one screen – including the waveform display. You can also now export audio as an MP3 and even edit the audio right in the app (although I imagine most filmmakers will want to edit audio in their NLE or audio editing software instead).

The PL-mount variant of the Tokina 16-28 Cinema will be first to ship in the Japanese market, and is currently scheduled to arrive in late-September. The EF-mount variant is currently expected to follow from late November. Pricing for either version is set at around ¥580,000 (approx. US$5,900).

However, the approach does have drawbacks. Although the cameras are compatible with an iPhone, they're best suited for devices that support Near Field Communication. Pairing them with a Samsung Galaxy S4, for example, was a snap, but making them work with an iPhone—or any other device that isn't NFC compatible—requires fiddling with Wi-Fi settings.

Monday, September 23, 2013

I have to admit, I've wanted to get a quadcopter and fly a video camera for the last couple of years, but the cost - and the thought that I'd probably make a mess of it and just waste the money - caused me to hold back. A couple of weeks ago they went and dropped the price of the DJI Phantom by $200, and when a friend said they were going to get one, well, I had to jump on it.

Adding the prop-guard (an optional accessory)

First things first: the box says "ready to fly," and yes, you don't have to do a lot of construction, but we did find that there was no manual in the box, and there are some confusing bits and pieces to sort out before starting. Even attaching the battery to the charger was a little more complicated than expected, partly because there are two attachment cables, and one of the connections was very tight!

When you go to the website there's a bunch of manuals and instruction videos, and it's a bit overwhelming. Where are you supposed to start? Where do you stop?

Even the videos were almost too much to absorb. It was like you needed to take a class before you could get going!

Some of the manuals are a bit opaque:

If in GPS ATTI. Mode, place the aircraft in an open space without buildings or trees. Take off the aircraft after 6 or more GPS satellites are found (Red LED blinks once or no blinking). If in ATTI. Mode, you can skip this step.

We tried updating the software on one of the copters, and that turned into a confusing hour (partly because the software's instructions are generalized and don't completely apply to the Phantom). We got through that after some futzing and guessing. After a couple of hours we pretty much got it all set up, but it was still a bit of work.

Getting ready for the first flight

That's the bad news.

The good news: the first flight was fantastically easy. Really. I've flown a couple of remote-controlled helicopters over the years. One was a fairly large unit that belonged to a friend, and the other a very small, inexpensive "indoor" flying copter. I think that the number of seconds of controlled, stable flight in both cases could be measured on the fingers of one hand. So I was a little concerned about how long it would take to learn to fly this thing.

I started up the engines, pushed up on the control, and up it went. And when I released the power control to the middle position, it just hovered there. Then I tried left and right, and back and forth. And that's what it did. It really was very easy to fly. There are still lessons to learn, and I need to build up some flight time to really get confident with it, but this thing is simple. After only two flights, I could even get it to fly in circles around me.

Finally, there's FPV systems, which allow you to see what you're shooting. Those may be more important than a brushless gimbal, though with a wide angle lens like the GoPro or Sony ActionCam, you only have to vaguely point the camera in the right direction and you're probably capturing something!

I'm not going to post any video yet; you've surely seen much better stuff already.