Archive for the ‘moviemaking’ Category

Before VFX: Blockbuster movies without visual effects. The site at the following link has a collection of behind-the-scenes photos taken prior to visual effects, revealing green-screen shots, actors festooned with CGI motion-tracking rigs, and so on.

Discovered via NoFilmSchool, which I subscribe to and heartily recommend for makers and enthusiasts of movies and videos etc.

It even has some shots from John Carter, in which I was a film extra, though sadly none of “my” scenes. I wish I could re-cut it, not only for my bits 🙂 but also to let its climate-catastrophe message be more dramatically expressed; some of the “cutting-room-floor” scenes were truly emotional. Regardless, “all the world’s a stage” 🙁

I serendipitously discovered a forum website for Michael Bay, the Executive Producer of Transformers and Producer of Pearl Harbor, amongst others.

It has a Film-Making and Movie Discussion forum. What I’ve skimmed through so far suggests it’s more for film fans than film makers, but it does give production news/snippets/oddments and draws attention to trailers and making-of movies, so who knows, maybe there’s more to be found in there.

This is a dark sci-fi film (good, I like those) with observations on the post-catastrophe balance between security and freedom/rights. The cast includes numerous famous actors, encouraged into roles beyond their usual types.

The film was drawn to my attention by someone (Matthew Roberts) with whom I was discussing the process of critical/constructive feedback on movies and the consequent reworking. Apparently the above movie was a case in point: an early (and unripe) version of it was initially screened at Cannes, resulting in some negative feedback but also support.

The moral of this story (about that film): feedback can be priceless. The consequent partial re-write/shoot/edit of that film, subsequently released on DVD, arguably elevated it to “one you have to see”. Thanks Matt, I’ll check it out.

I want to shoot a brief film exercise, ideally in misty early-morning conditions. How do I go about finding a mist-forecast? Aha! (and Oho!) Find an Aviation Weather Forecast, because pilots and airports care about visibility. The following seems simplest and best for my purposes (given I am near Alton in Hampshire, UK):

Slowing the shutter from 180 degrees (1/60 sec) to 360 degrees (1/30 sec) did not affect the smoothness of the pan; however, it did cause an irritating blur in the (panning) image.

So when the background is the subject, don’t do that!

On the other hand, when following a moving object and holding it stationary in-frame, a long shutter time can produce a pleasing background motion blur, diminishing background clutter and suggesting speed. Might be a problem (I guess) if the object itself includes movement, such as flapping wings.
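The shutter-angle figures above assume a 30 fps frame rate; as a quick sketch of the arithmetic (my own illustration, not from any camera manual):

```python
def shutter_time(angle_deg: float, fps: float) -> float:
    """Convert a rotary shutter angle to an exposure time in seconds.

    A 360-degree shutter exposes for the whole frame interval (1/fps);
    smaller angles expose for a proportional fraction of it.
    """
    return (angle_deg / 360.0) / fps

# At 30 fps: 180 degrees -> 1/60 s, 360 degrees -> 1/30 s (as above)
print(shutter_time(180, 30))  # 0.01666... (1/60)
print(shutter_time(360, 30))  # 0.03333... (1/30)
```

The same formula shows why shooting at 25 fps shifts the "180-degree" time to 1/50 sec.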

What’s the best speed for a pan?

Typically 3 to 5 seconds

Don’t pan over too great an area, especially of a nearby object, especially if the middle portion is uninteresting.

Begin and end with a few seconds of static (i.e. locked-off camera) shots.

Cutting from a static shot sometimes works better when the first few frames (up to half a second) are static, before the pan begins.

Tip:

For a stills camera: “Use a shutter speed between 1/8sec and 1/125sec depending on the subject’s speed and distance,”

For a video camera, I tried a range of shutter speeds from 1/25 to 1/60 and it made little difference to motion smoothness; the main factor was the chosen fps (on a laptop, 30 fps recording gave better smoothness than 25 fps recording).

As an assignment for an on-line video course, I shot some B-Roll footage in an interesting garden. While shooting, I was “hassled” by a lovely poodle called Ivy. She appeared so much (uninvited) in front of the camera I decided to give her a starring role! Hence it is now “her” garden.

This light-hearted and whimsical journey through an English country garden and back, occasionally accompanied by our fluffy hostess, Ivy, is my response to a film-course assignment, simply to record some B-Roll footage. It sort of acquired a life of its own, partly because some of the clips fitted together, like pieces of a jigsaw puzzle (that no-one designed), into some sort of mini-stories, and partly because the selected musical accompaniment turned out to reflect the various moods that emerged from the initial rough-cut, so in the end it became the editing backbone. Serendipity.

It was shot (on a Sony EX3) over a few hours, during which the (typical British) weather varied while I occasionally made way for handymen and joined in the moving about of furniture etc. So not an entirely controlled situation then…

Please excuse the occasionally shaky camera shots of Ivy, not originally intended for use (she just kept getting in the way, demanding attention), but I couldn’t resist…and now the video even bears her name!

The musical accompaniment is what I believe to be titled “Introduction et Etude Brillante” (“Réveil des Fées”), which I purchased from the Vimeo Store under the title “Introduction er Etude Brillante”, which I assume to be a typographical error (until anyone advises otherwise). It’s by Giovanni Sgambati. I didn’t realise at first it was also titled “Réveil des Fées”, but that’s great, because the owner of the garden has a thing about mystical fairy worlds, as you will see from her various statues etc.

Oops, this is one post I left in “Draft” too long. It was about the weekend before last…

Spurred on by Den Lennie’s tutorials on shooting B-Roll, I grabbed the camera (EX3) and filters etc. to have a “play” in the garden, shooting stuff to edit together into a pleasant sequence of some sort.

The intention was to present the floral aspects of the property in an elegant easy-going fashion with occasional quirks like my girlfriend. While shooting, the dog (a toy poodle) kept pestering me for attention, because obviously the only important thing in the world is playing ball. It seemed best to “go with the flow”, so I assigned said canine a principal role.

This turned out to be a 4-hour shoot (with interruptions) of about 150 clips, total duration about an hour. It took another 4 hours at least (with interruptions) to ingest, catalogue and convert the clips (into MXF, for Sony Vegas), and probably about 8 hours of editing, plus a little further shooting etc. In an ideal world there’d be no need to grade, but in reality some tweaks were necessary for continuity, especially since the lighting (sun/cloud) conditions were very changeable.

Hopefully I’ll get it finished soon, along with the rest of my backlog, which now includes a Diwali corporate event and wrangling/editing my own version of a music video in good old faithful Final Cut Pro 7.

I attended, working on one of the camera units. Had a great time, learnt lots, at all sorts of levels. Even how to make good use of the Movie Slate application on my iPhone! Link: http://www.fstopacademy.com/

I think ISO is linear, so if the camera is 320 ISO, they imply equivalent ISOs by simple division:

1080p: Clear=>320, ND1=>40, ND2=>4.5

720p: Clear=>400, ND1=>50, ND2=>6.25

1080i: Clear=>640, ND1=>80, ND2=>10

Alternatively, for ND1 filter you can leave the app’s ISO setting as Clear (no filter) and instead adjust the app’s Correction Factor to -3 EV (though it’s maybe better reserved for simulating lighting variations e.g. due to weather, as in the Exposure Value Table further below).

I guess from this one-off case that EV is logarithmic, since 2^-3 = 1/8, as per ND1.

That guess was later confirmed by further web research (further below), stating that EV is an “additive system”, i.e. operates in the logarithmic domain, base 2.
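As a sketch of that arithmetic (my own illustration; I’m assuming ND1 is a 3-stop filter, i.e. 1/8 the light, and ND2 a 6-stop filter, i.e. 1/64, which matches the 720p and 1080i rows above by simple division):

```python
def equivalent_iso(base_iso: float, nd_stops: int) -> float:
    """Effective ISO to dial into a light-meter app when an ND filter
    of the given strength (in stops) is in front of the lens.
    ISO is linear, so each stop of ND halves the effective ISO."""
    return base_iso / 2 ** nd_stops

def ev_correction(nd_stops: int) -> int:
    """Alternative approach: leave the app's ISO at the 'Clear' value
    and apply this EV correction instead (EV is additive, base-2 log)."""
    return -nd_stops

# 720p base 400 ISO: ND1 (3 stops) -> 50, ND2 (6 stops) -> 6.25
print(equivalent_iso(400, 3))  # 50.0
print(equivalent_iso(400, 6))  # 6.25
print(ev_correction(3))        # -3, the ND1 correction used above
```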

Caution: being an ISO/EV newbie, I can only hope this is all correct!

Nevertheless, when I tried my naive settings they worked just fine – I was able to use the iPhone Light Meter to obtain a sensible camera configuration for a good exposure level and (given the ND filters) the kind of shot I want (e.g. degree of DOF). When tested on the camera, they all worked out as expected. Cool!

The app can also “log” readings – in the form of jpg images of the screen and overlays, including geographical location – to a Dropbox account. For example, when I clicked the [Log] button, a jpg file appeared on my MacBook in the folder [/Users/davidesp/Dropbox/Photos/Pocket Light Meter].

<<The full name for Exposure Value, or EV, is the Additive Photographic Exposure System. Exposure Value has two equivalent definitions. The first defines how much light will be admitted to the film by the combination of lens aperture and shutter speed. The second defines how much exposure is required by the combination of subject luminance (e.g., how bright it is) and film speed. Setting a combination of aperture and shutter speed on a camera with an EV that equals the EV for the subject luminance and film speed should result in a properly exposed photograph.>>

(The article continues at length. For example, the “Additive” element reflects the fact that this system operates in the logarithmic domain. The article also distinguishes luminance from illumination, and explains units such as point-source intensity in candelas, flux in lumens, light illuminating a surface in foot-candles, light radiated from an area in foot-Lamberts, and luminance in candelas per unit area (square foot or square metre).)

An EV (Exposure Value) table is presented. I guess (?) this is useful for the iPhone app, where EV can be shifted up/down by a control, to estimate what would be needed should the lighting conditions vary:
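For reference, the EV in such a table relates aperture and shutter speed via the standard definition EV = log2(N²/t) at ISO 100. A minimal sketch (my own, not from the app):

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """Exposure Value at ISO 100: EV = log2(N^2 / t), where N is the
    f-number and t the shutter time in seconds. One EV step is one
    stop; brighter scenes have higher EV."""
    return math.log2(f_number ** 2 / shutter_s)

# The 'Sunny 16' rule (f/16 at 1/125 s, ISO 100) lands near EV 15,
# a typical table entry for full sunlight.
print(round(exposure_value(16, 1 / 125), 1))  # ~15.0
```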

Movie★Slate is a slate and clapper board: traditional movie-making tools for syncing picture with sound, and for photographing shot/production info at the start and end of shots. Movie★Slate also provides an easy way to log footage and take notes as you shoot, saving you time during capture and edit.

My cameras are old DV units or are consumer models with no LTC support. Can MovieSlate’s optional PRO Sync module still help me sync a multi-cam shoot?

Yes, through additional software available from VideoToolShed. Here’s how:

Set MovieSlate to output timecode through one of the audio channels and connect from the headphone jack to your camera’s AUX/MIC audio.

Shoot your footage with MovieSlate running and sending sync through the headphone jack. The LTC audio signal will be recorded on one channel of your DV tape. (Please note the obvious: if this cam is handling your main sound then you will not have stereo audio.)

Create a virtual 3D set in your computer, with the freedom to place any number of virtual cameras at any position, angle or height desired. Each camera features full Pan/Tilt, Dolly, Zoom, Roll and Crane control, with options to limit focal length to a specific zoom range or set of prime lenses, and to limit minimum and maximum heights (due to equipment limitations or physical ceilings), to accurately portray the actual range of your equipment.

FrameForge is not about your mastery of the pencil but about how you can explore and see what your actual equipment will see.

Camera-Mapping: take a still image and convert it into 3D geometry for use in an animation. This powerful technique is used extensively by visual-effects studios for feature films, commercials and television shows. It’s especially useful for faking helicopter flyovers, at just a fraction of the cost of hiring a real helicopter.

The Port of London Authority (PLA) owns and operates Richmond Footbridge, Lock and Weir, situated between Teddington and Richmond, which offers a wonderful location for any type of film and television production, as well as still photography.

Permission to film on or by the Thames requires a filming licence issued by the PLA’s Corporate Affairs department.

“We don’t usually allow filming on buses that are actually in service. However, outside peak commuter hours you can hire a bus that will look like the bus on the route you wish to film, complete with driver.”

<< Even with the help of stalwart first assistant Jason Gaudio, the editing team did not want to risk upgrading their NLE software in mid-project despite the fact that Media Composers have been able to playback 3D sequences directly from the timeline ever since version 3.5. >>

(my italicization)

<< So when they wanted to view the 48 terabytes of footage on their Avid Isis storage system holding both left and right eye tracks, they had to run both dailies footage and cut sequences through a QuVIS Acuity 3D playback platform. >>

Michael is “the Senior Consultant for National Education for the Folger Shakespeare Library”. His course material provides a great overview of project planning (to deliverables), roles (Cinematographer, Director…), actor abilities/assessment, filming/camera technique and more. The other UNITs cover for example the stage area terms.

Some new movies, not just legacy ones, are being converted from 2D to 3D (stereo). This step is being planned as part of production. I don’t know why they can’t just shoot in stereo (cost? maturity? conservatism?) but that’s how it is.

The method: a technical and manual rotoscoping pipeline (production line) where images are masked to create layers and artistic judgement is applied to the appearance of individual objects. As one would imagine, no simple “magic solution”. However, beyond those basics they have their own patented 2D⇒3D inference algorithms operating on individual objects, even at sub-pixel level.

Not quick or cheap: “for a 100-minute or 120-minute 2D-to-3D conversion, you would need about 300 to 400 artists phasing in and out of production over about four to six months.” Clash of the Titans was so-processed in under half that time – possibly explaining some negative press (mentioned in the article) regarding the quality of its 3D.

Dimensionalization is a method developed by In-Three of converting 2D content to stereoscopic 3D content.

There are various approaches to creating 3D content: capturing 3D using dual camera rigs, rendering 3D using dual virtual camera rigs within a computer graphics environment, and creating 3D by converting 2D content with processes such as Dimensionalization.

Dimensionalization is trademarked because it describes a patented process which gives unique depth, shape and perspective to each individual object at a pixel or even sub-pixel level. Throughout our process there are a multitude of special and unique techniques that our experienced stereo team has developed and continues to develop, so you can be confident that we bring the tools and the skill to any conversion project.

The Dimensionalization process is covered by a number of U.S. patents. These patents make In-Three a leader in the development of intellectual property surrounding the conversion of two-dimensional films to stereoscopic experiences.

Heard about it at http://www.stagetools.com/previs.htm, which said: People are using existing multimedia tools for previsualizing projects… “One exception to this trend is PowerProduction Software’s StoryBoard Artist drawing package, which helps automate the drawing of storyboards by non-artists. The software comes with a collection of pre-made characters, props and backgrounds that can be viewed from various camera positions and animated.”

I checked it out and it looks to me mainly aimed at contemporary scenes, not, for example, English period drama. Within its own context it looks extremely slick (hence quick) to use. Nothing that can’t be done by more general tools, but just plain handy, all there and convenient; less technical fiddling to distract from the creative process. Quick & simple is what you want when the previz needs to be adaptable (e.g. is part of a dialog, or things turn out differently than expected) rather than a fixed plan.

Of the three products I saw at that website, the StoryBoard Artist product (http://www.powerproduction.com/artist.html) seemed most appropriate. It has a timeline for soundtrack etc. Also “Multi-angled, multi-positioned characters with overheads and expression”. And “Non-linear linking storyboarding for DVD and iTV prototyping”. Or indeed uncertainty…