Friday, December 10, 2010

Last night was the latest Boston DSLR Meetup at E.P. Levine in Waltham. A capacity crowd turned up and filled the studio to hear Rich Harrington, a video editor, producer and co-author of multiple books, including From Still to Motion [Amazon $31.49].

I've seen Rich speak before, and he certainly believes in packing in the information. I suspect he drinks three or four cups of coffee before he begins, and once he starts, he doesn't stop, taking questions only in 30-second breaks along the way.

Pizza and socializing before the meeting starts

He's a man of strong opinions too, quite happy to tell people when they are wrong. Noting the variety of people in the audience - from still photographers to videographers, producers and editors - and the need to cover many different topics, he suggested that if you found yourself bored during the talk, just wait two minutes.

Rich Harrington getting ready while Dan Bérubé stands in for the screen

There was a lot of equipment and theory covered, but I was intrigued with his focus on the importance of pre-production planning. He said that too often the focus is on the production phase, when in fact better pre-production planning will save time, money, and produce better shots. He recommends going out and shooting stills of locations, planning your lighting, and he even recommended some iPhone apps for working out where the sun will be when you shoot!

He's also a strong supporter of editing natively, and said that if you're not using Media Composer 5, Premiere Pro 5, or Vegas Pro, then please leave your wallet on the table for him, as you clearly don't care about money. Interestingly, he does still use Apple Color for final color correction.

The meeting concluded with some showings of movies made by attendees. Unfortunately, I had another appointment and couldn't stay for all of that.

Definitely a great meeting, and it will be interesting to see what's in store for the coming year!

Large Sensor Cameras
High Definition magazine features an article about the new large sensor cameras from Panasonic and Sony (you can read it in a flashy online viewer.)
HighDefinition: Digital Issue 45

Createasphere Webcasts
Createasphere has a series of webcasts coming up featuring creative teams from the TV series GLEE, Dexter and Boardwalk Empire discussing the challenges they face.
Createasphere: The Best of TV Webcasts

Canon 7D used to shoot Community stop-motion episode
NBC used a 7D for an upcoming Christmas episode of “Community”. Broadcast tonight, Thursday, Dec. 9th, 2010, they’ve gone old school and done a Rudolph-esque claymation episode. You can see a behind-the-scenes video at PetaPixel.
PetaPixel: Canon 7D Used to Shoot Stop-Motion Episode of NBC’s “Community”

Due to their small, unintimidating footprint and excellent low-light performance, DSLR cameras are finding their way onto movie sets everywhere. In this free seminar, Shane Hurlbut, ASC and Jacob Rosenberg discuss how their movie Act of Valor was shot using Canon EOS 5D Mark II DSLR cameras and how Adobe Premiere Pro was used on set to natively edit the DSLR files in real time. They will explain the workflow they used to edit the film and how Adobe helped knock down roadblocks in post-production.
Produced by Videography and sponsored by Adobe.
Registration: How DSLR Cameras and Adobe Premiere Pro Help Movies Get Made

Pitching your film project effectively is one of the most important elements in helping with outreach, collaboration and fundraising. Join us for this exciting pitch panel with selected experts, who will help deconstruct your pitch and teach you how to be successful in pitching your film projects. If you would like to pitch your film or work in progress, please submit a short description of the project and a link to a video sample to jen[at]filmmakerscollab.org by Friday January 14th.

Network and socialize with fellow DSLR filmmakers at our holiday gathering of the Boston DSLR Meetup and Boston Final Cut Pro User Group! Come screen your shorts, meet some great DSLR shooters, filmmakers and digital storytellers, and talk about movie making!

COLLABORATE & START A PROJECT
And, for December, we will begin our mission to get members collaborating together in crews and starting a project - stay tuned for updated details!

DSLR Workflows – From Pre-Production to Post
Richard Harrington, Rhed Pixel
Join Richard Harrington, a director and editor, as he shares practical workflows for DSLR projects. Learn essential planning techniques, including planning for storage, synchronization, and gear selection. Rich will also demystify post-production, with a particular emphasis on native editing. Learn how to transcode less and edit faster (no matter which NLE you choose).

A few weeks ago I attended the Public Television Quality Group's Boston Workshop. This two-day event covered everything from shooting and editing to delivering the program to PBS. But you didn't have to be a PBS producer to get something out of this experience.

Emmy award-winning SMPTE Fellow Mark Schubin started off the event with a presentation entitled Things You Can and Can't Fix in Post. Beginning with the statement "You can fix anything in post, only if you have enough time and money," Mark covered a wide range of topics from lenses to lighting.

He demoed using a polarizing filter to cut reflections, and he pointed out that it's not always a good idea to filter during production, since most optical filtering can be closely emulated by digital filtering. He also stressed that the camera operator is the most important element in the equation.

Here are some of the themes of the talk:

Acquisition starts the chain and affects everything that follows. If you want to get the greatest improvement in what you’re doing, and you don’t currently have a lighting director, hire a lighting director.

[image quality] is affected more by operator actions than by camera characteristics, and sharpness is affected by contrast as well as resolution. A really good operator with a really bad camera is going to do a much better job than a really bad operator with the world’s best camera.

Lighting is not just adding light; sometimes it’s subtracting light

Aperture significantly impacts the sharpness of the picture. We get less sharp as we go to smaller apertures; that’s caused by diffraction. And we get less sharp as we go to wider apertures; that’s caused by the lens aberrations.

How do you get to the sharpest sweet spot? You can add or subtract lighting, you can change the gain of the camera, and you can go to negative gain on certain cameras if you have too much lighting. You can use the shutter, again if you have too much lighting; that also reduces motion blur, but that's not necessarily so good because it can introduce motion judder. And you can use neutral density filtering, but one of the things you have to watch out for is possible glass flaws...
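Schubin's diffraction point is easy to put numbers on. As a rough illustration (mine, not from the talk), the diffraction blur spot - the Airy disk - grows linearly with f-number, so on a small video sensor you run out of sharpness surprisingly quickly as you stop down. A quick Python sketch, assuming green light and a hypothetical 1/3" sensor shooting 1920-wide HD:

```python
# Rough Airy-disk diameter: d = 2.44 * wavelength * N  (N = f-number).
# Assumes green light (550 nm); a common rule of thumb, not from the talk.
WAVELENGTH_MM = 550e-6  # 550 nm expressed in millimetres

def airy_disk_mm(f_number):
    """Approximate diameter of the diffraction blur spot, in mm."""
    return 2.44 * WAVELENGTH_MM * f_number

# A 1/3" video sensor is roughly 4.8 mm wide; at 1920 pixels across,
# each photosite is ~0.0025 mm. Once the blur spot covers more than a
# couple of photosites, stopping down further costs you sharpness.
pixel_pitch_mm = 4.8 / 1920

for n in (2.8, 4, 5.6, 8, 11, 16):
    blur = airy_disk_mm(n)
    print(f"f/{n}: blur {blur * 1000:.1f} um = {blur / pixel_pitch_mm:.1f} photosites")
```

By around f/11 the blur spot spans several photosites, which is why the sweet spot on small-sensor cameras tends to sit only a stop or two down from wide open - and why his list of ways to avoid stopping down (lighting, gain, shutter, ND) matters.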

There are different types of resolution:

There’s temporal resolution, which is frames per second

Spatial resolution; resolution across the picture

Dynamic resolution, for things that are moving, and static resolution, which is what people typically measure

Chroma resolution, or resolution in color, and luma resolution, which is resolution in black and white

Non-dimensional resolution; lines and pixels

And then there’s linear resolution, which you may be familiar with from computer stuff (dots per inch) or, from photography, line pairs per millimeter

Sharpness and things that end in -ness - sharpness, brightness - those are things that people perceive. They are subjective functions, not objective functions. Resolution is something you can measure.

Sharpness is tremendously dependent upon the amount of contrast that you are getting.

When you buy a camera that has a 2/3” imager […] absolutely nothing in the camera is 2/3”. The reason that we call it that is that back in the old days, when we had tubes, and the tubes were round, we measured the tubes by the outside diameter. So in a 2/3” tube, the tube was 2/3 of an inch around - that’s 17mm - but the target area on the tube was only 11mm […] which is less than 1/2 inch.
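The arithmetic behind that quote is easy to verify. A quick sketch (my illustration; the 11mm target figure is taken from the quote above):

```python
# "2/3-inch" names the old pickup tube's outer diameter, not the sensor.
INCH_MM = 25.4

tube_od_mm = (2 / 3) * INCH_MM  # outer diameter of a "2/3-inch" tube
target_mm = 11.0                # active target area, per Schubin's quote

print(f"Tube outer diameter: {tube_od_mm:.1f} mm")  # ~16.9 mm, i.e. roughly 17mm
print(f'Target: {target_mm / INCH_MM:.2f}"')        # ~0.43", less than 1/2 inch
```

So a "2/3-inch" camera actually images on well under half an inch of glass - which is why his follow-up point, that smaller lens formats demand better lenses, matters.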

Going to a smaller lens format means you need a better lens.

To sum up: Acquisition affects everything that follows, so problems should be fixed there.
Operator actions affect the picture more than most camera characteristics.
Sharpness is affected by contrast as well as resolution.
And contrast is affected by diffraction and lenses.

At the end he was asked about the PBS Technical Operating Specifications, which require that submitted HD material come from a three-chip camera, not a single-sensor camera. Since he'd mentioned single-sensor cameras in his talk, he was asked whether producers should seek a waiver, ignore the TOS, or whether Mark Schubin has some special privileges. His reply:

I, as Mark Schubin, don’t have special privileges, and everything that I’m involved in is three-chip. I am not doing any single-chip stuff for PBS. As for the other stuff, I would refer you to the TOS session and maybe you can raise that - I think it’s a good point to be raised... and on the QT, they probably can’t tell. So I’d say if you’ve shot single sensor, you don’t have to tell them.

The PMW-F3 launch event will include a screening of some of the first footage acquired with the camera, and a panel discussion with the production teams involved. You'll also have a chance to demo the camera.

Zoom Q3HD Video Recorder
Looking to improve the quality of your video's audio (though maybe not the video itself)? Zoom has combined what looks like the mics from the H1 with a video camera to create the Q3HD (think of it as the love child of a Flip and an H1, perhaps). Price is $299, and it's just started shipping.

It's an intriguing option, perhaps, for musicians who want a simple way to make a video of themselves with better audio. I'm just not convinced that to improve the audio you always want the mics on the camera...
Zoom Q3HD: Product Page
B & H: Zoom Q3HD Handy Video Recorder [$299.00]

iPhone as Audio Recorder
Speaking of the Zoom H1, Photo Cine News has a short article about using the iPhone as an audio recorder in place of buying a separate device like the H1. I'm partial to the idea, though the other day I was doing just that - using the iPhone as an audio recorder for an interview - when I got a phone call. Messed up the recording.
Photo Cine News: iPhone Audio Recorder as Zoom Alternative

Overall I love the idea of the Monitor X, and for those using a lightweight Steadicam Merlin or other stabilizer it's a must-have product, as it's light enough to balance on these rigs. It's also great when shooting on a tripod at eye level: you can keep both eyes on the event, using the Monitor X to help keep accurate framing and focus. I also love that JAG35 included two 1/4"-20 mounts, allowing you to use it with articulating arms on a cage or shoulder rig. I hope that JAG35 takes it one step further by adding an anti-glare coating.

Sunday, December 05, 2010

It's annoying when programs do things they aren't supposed to - like crashing while saving, or a feature that won't work when your project gets too complicated. But I've had pretty good luck with Final Cut...until last week, when I was editing the ATC "making of" music video. Everything was going fine - I'd even done some test exports - until I made some changes, did another export, and suddenly something very odd happened: a still frame from a completely different clip was being rendered in place of a couple of clips at one point in the timeline.

It didn't happen while playing back in Final Cut; only when I exported the clip using the Export Using QuickTime Conversion option.

In the past I've found that odd things like this can sometimes be "fixed" by adding a "non-filter" to a clip, i.e. forcing the editor to do some processing on the clip before exporting it. I tried moving the clips to another track, as well as putting another clip underneath the clip. All I seemed to do was cause the odd rendering effect to move to another part of the sequence!

I even tried copying and pasting to another Sequence, and exporting to Compressor (which takes about four times as long) and still the problem occurred.

Finally, I did a simple export of the entire movie as a QuickTime Movie, which saves it in the format of the Sequence rather than recompressing to a format of your choosing. This finally worked, and then I just imported the clip into another Sequence and exported it again with the settings I wanted.

It was troubling that I couldn't figure out why it was misbehaving, especially as the project was short and not very complex.