The hype is real. Everyone wants to shoot 3D all of a sudden. A smallish horror film ($7M) I'd been prepping in 2D may now go 3D. So I'm making inquiries. I have never shot 3D but, like the rest of the world, find its immersive potential suddenly manifest in Avatar.

The film takes place in the woods, and the director and I had discussed a look like 28 Days Later, a low-res video look. All of which lent itself to a smattering of formats and a bunch of cameras, from RED to EX3s and the like. We had envisioned shooting on location with, at times, a very small crew.

I'm realizing that to have control over the image we'd need a proper two-camera rig with a large crew and lots of electronics--what with three data streams, proper monitors, remote control over focus, zoom, IO (interocular), convergence, etc. So the whole tromping-around-the-woods-with-a-little-camera idea seems to have gone out the window. The little Panasonic P2 camera seems worth checking out--might be OK for certain shots.

Is there a way to preserve that sort of freedom?

Would dimensionalizing work? Or be expensive and not very satisfying?

Where in LA could you point me to play with a 3D rig? Shoot a test, etc?

Certainly would like to play with the SI2K 3D rig.

Much obliged for your thoughts

Byron Shah
DP Los Angeles

Byron Shah writes:

>>"Is there a way to preserve that sort of freedom?"

Consider shooting parallel, mainly, with small camera heads side-by-side, but have a mirror rig handy for close-ups or larger lenses. Stereo doesn't have to be big-rig. I can point you to a modern stereo VistaVision camera which is Steadicam-able if you want.

Parallel is the main method for shooting IMAX 3D, and both "G-Force" and Tim Burton's upcoming "Alice In Wonderland" are parallel shows. Briefly put: when shooting parallel, convergence is at infinity and quite naturalistic; shooting converged allows reverse parallax for a potentially stronger stereo effect, but at the cost of having to operate convergence separately. Tying convergence to focus is IMHO like tying the brakes to the steering in a car. It's true that there's some correlation between the two, but making it a mechanical relationship is a mistake.

The standard fixed interocular of a parallel rig can be too strong for close-ups, so a mirror rig is needed to collapse the interocular to less than the physical camera and lens width.
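For intuition, here is a toy sketch (my own illustration, not from the thread; the function name and numbers are hypothetical) of why a fixed interocular becomes too strong up close: for parallel cameras, the sensor-plane disparity of a point grows in inverse proportion to its distance from the rig.

```python
# Toy illustration of parallel-rig geometry: for parallel cameras, a point at
# distance Z lands at sensor positions differing horizontally by d = i * f / Z,
# where i is the interocular and f the focal length. Disparity blows up as Z
# shrinks, which is why a fixed eye-spacing interocular is too strong for
# close-ups and a mirror rig is needed to collapse it.

def disparity_mm(interocular_mm: float, focal_mm: float, subject_mm: float) -> float:
    """Sensor-plane disparity for a point at subject_mm from parallel cameras."""
    return interocular_mm * focal_mm / subject_mm

# Eye-spacing interocular (65 mm) with a 35 mm lens:
print(disparity_mm(65.0, 35.0, 10_000.0))  # subject at 10 m -> 0.2275 mm
print(disparity_mm(65.0, 35.0, 600.0))     # close-up at 0.6 m -> ~3.79 mm
```

The close-up disparity is more than sixteen times the distant-subject disparity for the same rig, which is the whole problem in one number.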

BS: "Would dimensionalizing work? Or be expensive and not very satisfying?"

You can consider it for specific effects, like mixed stereo base shots, to avoid depth problems in close-ups or retinal rivalry in lighting, for aerials, or to repair on-set failures. But it's likely to be too expensive to rely on ($5-15M for an entire feature at high quality). Shooting in the woods will place you on the higher end of that scale. As with VFX, anytime you can shoot, you should shoot, but be prepared to spend some money in post. There's no reason for it to be an either/or proposition for a feature. Contact me directly if you want the really long explanation with illustrations.

BS: "Where in LA could you point me to play with a 3D rig? Shoot a test, etc? Certainly would like to play with the SI2K 3D rig."

Paradise FX, 3ality Digital, Pace HD, Keslow, etc., etc. I'd say that if you haven't shot 3D before, you need to test, test, test. The optimal mise en scene and editorial strategy for a 3D film _can_ be quite different from 2D.

Tim, you are very experienced in this area and I hate to differ... and I could be wrong.

I think these were both shot mono and dimensionalized. I spoke with our friend and former employer Hoyt Yeatman about G-Force after a panel last year. I was wondering why he didn't shoot stereo. It will be a great medium for him because he is just so good and technical. He said that the 3D rigs did not offer snorkel lenses and he would have been too restricted shooting 3D. I also think Disney brought up the stereo aspect late in the game, after the success of the Hannah Montana concert film. I did not see G-Force in 3D.

I believe that because of the challenges of stereo match-moving on Alice, the choice was made to dimensionalize in post.

I really worry about the pain and headaches of a flurry of green-lit low-budget stereo.

Fixing bad stereo is a huge pain. The Foundry's Nuke and Ocula have a nice selection of fix-it tools now. I suspect this will be a nice new market for compositors: fixing bad stereo from not-so-great rigs, or from the lack of an experienced stereographer advising on the pitfalls.

That's the bright side.

My live action compositing friends have really had a tough year.

Best,

Jeff Olm
Stereo FX and Colour

Disclaimer: I did not work on these shows, and I just use The Foundry's tools; I don't sell them.

Jeff Olm writes:

>> I think these were both shot mono and dimensionalized.

Not mutually exclusive. They were "Dimensionalized" (a trademark of In-Three) as parallel. If you do it in post you get to choose. Or do both.

>> Fixing bad stereo is a huge pain.

Sometimes it's frankly easier to just toss one eye and rebuild it from scratch. Or, like in "Avatar", just _don't_ fix it at all. :-)

Tim Sassoon
SFD
Santa Monica, CA

Thanks Tim.

I believe In-Three has been doing most of the work for these Imageworks shows. So we were correct on the term. Thanks for the disclaimer.

>> Not mutually exclusive. They were "Dimensionalized" (a trademark of In-Three) as parallel. If you do it in post you get to choose. Or do both.

I don't work for them, but they did do G-Force:
http://www.in-three.com/filmography.html
And they read the list; it would be great to hear their comments when they can.

>>Or, like in "Avatar", just _don't_ fix it at all :-)

I've only seen it once. I haven't done my technical viewing pass on it yet, but will do soon. The 17-year-old kid next to me with his dad took off his glasses and cranked his iPod. He couldn't handle the stereo.

I did enjoy it with my son in IMAX, but was pretty distracted by the ghosting from my off-angle seat. Poor seating was all that was available, due to chatting with a hockey dad in the hallway before we secured proper seats. But that's an issue to consider when attending stereo films.

Like Journey 3D, another PACE show I worked on: framed actors will be on the screen plane, tied to focus, and the background will give you the depth.

Works well.

It makes DI roto easier, which is a huge issue if the characters are not on the screen plane: you have to offset the roto.

10 stereo moments. Not 9. Count them: 10. It's like the commandments.

Best,

Jeff Olm
Stereo FX and Colour

Disclaimers -

I did not work on Avatar, but talked about it. I don't work for IMAX, but I work with them. I don't work for Pace, but I work with them. I don't work for In-Three, but they are in my hood, so I keep an eye on them.

Earlier in this thread I think Tim described best practices for doing a 3D film, whether low-budget or high-end.

I have issues too with slaving depth to focus (even if it allows for easier rotoscoping later). All you need to do is take off your glasses in some scenes in Avatar to realize that what you couldn't put your finger on (that mild feeling of discomfort) was actually the constant zig-zagging of the convergence plane... kinda like drinking too many single malts: smooth and easy, but do it over time and you can get drunk by the end of the evening.

2D-to-3D conversion, another pet peeve, is best left undiscussed, though I think only the rodents were rendered in a CG stereo rig and the plates were shot 2D and converted?

>> Not mutually exclusive. They were "Dimensionalized" (a trademark of In-Three) as parallel. If you do it in post you get to choose. Or do both.

We did both on G-Force, actually, since the mono plate allowed us to "do whatever the heck we wanted to it". Some shots are the former and some the latter, and other shots are hybrid multi-camera parallel/convergent (where the foreground elements were converged but the background stayed parallel).

In-Three did a bunch of shot work on G-Force, but Imageworks also has its own nicely fleshed-out stereo pipeline that handled most of the shots.

This, plus mixed stereo base, is a significant advantage of 3D conversion (or animation): it's not something one can shoot live-action in binocular stereo without being pulled over by the Physics Police. When I said "G-Force" and "Alice" were parallel shows, I meant that all the straight-conversion shots, and shots with no need to be anything else, were done parallel, with "G-Force" under Sony Imageworks stereographer Rob Engle's leadership and "Alice" following the same pattern. I worked on both.

The reason I mentioned these shows in my original remarks was as current exemplars to reassure a live action DP that parallel shooting is a respectable option, not to haggle over their details.

Note that if one has stereo live action which one isn't satisfied with (i.e. what should have been done with quite a bit of "Avatar", but wasn't), one can throw away part or all of the one eye and rebuild it to taste. This is/will/should be a part of standard VFX practice from here on out.

>> ...(what should have been done with quite a bit of "Avatar", but wasn't), one can throw away part or all of the one eye and rebuild it to taste.

Just to note, this was in fact done on a large number of shots in Avatar.

Mitch Gross
Applications Specialist
Abel Cine Tech

Mitch Gross writes:

>> Just to note, this was in fact done on a large number of shots in Avatar.

One of the problems of shooting in stereo is that some very logical physical effects that don't bother us in real life become objectionable when we see them second-hand in a synthetic stereo space.

The glint of lights on reflective surfaces is a particular problem for stereo filmmakers, as the glints appear to be located not at the object itself, but at the sum of the distance from the lens to the object and from the object to the light. That distance is projected along the ray path that the camera sees between the lens and object, the angle of incidence of course equalling the angle of reflection. If you hold a pencil perpendicular to a mirror, its length appears to be doubled.

Since the glint is always more distant than the supposedly solid object, contradictory depth cues tend to make reflective objects look like Escher drawings, which hinders our "reading" the scene and diverts our attention from the story. This effect is accentuated in close-ups for a number of reasons,
which serves to sink the glint even deeper into the object. Nice crisp little eye lights which in 2D serve to reveal the shape of the eye and give it life can do the exact opposite in stereo.
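To make the geometry above concrete, here is a minimal sketch (my own illustration, not from the thread; the function name and distances are hypothetical) of the virtual-image rule: a specular glint reads at the camera-to-surface distance plus the surface-to-light distance, so it always sits deeper than the surface that is supposedly solid.

```python
# A minimal sketch of the virtual-image rule described above: a specular glint
# is perceived not on the reflecting surface but at the camera-to-surface
# distance plus the surface-to-light distance, measured along the camera's ray.
# It therefore always reads deeper than the surface itself, producing the
# contradictory depth cues the thread describes.

def apparent_glint_depth(camera_to_surface: float, surface_to_light: float) -> float:
    """Perceived depth of a specular glint (same length units for both args)."""
    return camera_to_surface + surface_to_light

# A crisp eye light 0.3 m from an actor's cornea, camera 2 m away:
# the glint reads at 2.3 m, noticeably behind the eye it is meant to enliven.
print(apparent_glint_depth(2.0, 0.3))  # -> 2.3
```

The closer the camera gets, the larger the light-to-surface leg becomes relative to the total, which is one way to see why the effect is accentuated in close-ups.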

All I'm saying is that one would expect those kinds of things to be fixed in a zillion-dollar movie when they become noticeable. We've routinely fixed them in films like "U23D", whose entire budget was probably less than the prepro on "Avatar". Additional tip: much can be learned by watching a 3D movie without glasses.