In a conversation with a UK film professional, I was told that, in order to simplify the technical side of projection, all modern films are projected slightly sped up at 25fps, although they were filmed at 24fps.

I can't believe this, as I haven't really experienced overly video-like motion in cinema exhibitions. As far as I'm informed, 25fps would produce TV- or video-like flicker/judder/motion.

Two explanations are possible:

There is no video motion because cinemas project 3:2 pulldowns at 25fps.

Or the 24fps camera-induced shutter flicker is preserved even at 25fps.

Somewhere else I've read that cinemas really do alter the frame rate, but in a good way: stepping up in multiples of 24 (48fps, 72fps), preserving the original flicker/shutter gaps but double- or triple-exposing each frame to produce a steadier stream of light for the audience's eyes.

Any projectionists here who can confirm 25fps frame rates in projection?

Coda: Legend has it the same is true for TV, with films presented sped up, although I see 3:2 pulldowns all the time... no video motion in films on TV, ever!



Strikes me that the '25 fps looks cheap' argument is rather subjective, a bit like the 'vinyl is so much better than CD' debate.
– iandotkelly♦ Jan 3 '12 at 14:16

3 Answers

This is one of those interesting questions that get more complex (and harder to answer) the more you learn about them. Unfortunately, I cannot answer definitively how theater projectors work; I'll explain why I don't think that question can be answered. I'm also going to reference 100fps.com a bit.

First, I don't think the question can be answered now, because projector hardware is somewhat stuck. Look at how few screens in your area converted to 3D in the past few years. Yes, most theaters likely have 1 or 2 screens capable of 3D, but the expense of converting them all just isn't supportable. Because of costs like this, most theaters are using projection methods which are likely very out of date; so even if newer/better projection methods exist, you're likely still seeing older methods. But this question likely couldn't be answered in the past either, because of differing video standards: American cinema companies developed one set of film standards, while European companies developed another. This carries into broadcast encoding technologies for TV, and pretty much every corner of video.

One great bit of discussion can be found here. It looks at the question of how the human eye perceives video images, and some of the trade-offs necessary to make movies look fluid. Note in particular this observation:

The fact is that the human eye perceives the typical cinema film motion as being fluid at about 18fps, because of its blurring.

This makes the main page at 100fps.com, which focuses on interlacing (and image de-interlacing), more relevant to the question of how cinema can get away with such low frame rates. Because cinema blurs the images in each frame (by combining time slices), it gives the illusion of smooth motion.

As near as I can tell, most theaters in the US run at 24 frames per second, with 2 or 3 exposures (shutter openings) per frame. The higher exposure rate prevents the eye from seeing the black of the shutter, while the motion blur within each frame makes movement look smooth. By comparison, most theaters in Europe show 25 frames per second, still with 2 or 3 exposures per frame. This low frame rate (by modern video game standards) works because the images are not crisply rendered computer images, so motion blur makes them look better.
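As a rough sketch of that arithmetic (assuming an idealised shutter that simply flashes each frame 2 or 3 times; the numbers and the perception threshold are illustrative, not measured):

```python
# Minimal sketch: the flicker rate the audience sees is the frame rate
# multiplied by the number of shutter flashes per frame.
def flicker_rate(fps: int, flashes_per_frame: int) -> int:
    """Shutter openings per second."""
    return fps * flashes_per_frame

for fps in (24, 25):
    for flashes in (2, 3):
        print(f"{fps}fps x {flashes} flashes = {flicker_rate(fps, flashes)}Hz")

# 24fps x 2 = 48Hz and 24fps x 3 = 72Hz: both sit above the region where
# most viewers stop perceiving flicker, even though only 24 distinct
# images are shown per second.
```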

First: thanks for your elaborate answer! Second: I am in Europe, and I don't see sped-up movies here (I think). Any evidence that theaters here really project 25fps? I know that most 16mm made-for-TV drama is shot at 25fps and presented like this on TV (although interlaced, before the HD revolution started), but movies are shot at 24fps here as well; it doesn't make sense to project at 25fps what was made for 24fps.
– isync Jan 4 '12 at 17:55


Did you ever see the TV series Connections? A LOT of things don't make much sense until you learn they make a frightening amount of sense. You might like en.wikipedia.org/wiki/Frame_rate for hard FPS numbers. Recording speed (24 or 25) likely depends on where the camera was manufactured; playback speed depends on where the projector was manufactured. Converting would only take adding or dropping 2-3 flickers per second, and extending two single frames by 1/48 of a second (or three by 1/72); you likely wouldn't notice. Speeding playback up by about 2.4 seconds per minute wouldn't be really extreme, either.
– Scivitri Jan 4 '12 at 23:42

I would really like to see a side-by-side comparison in a cinema, switching between 24 and 25fps, and see/feel whether it is the same as switching LG TruMotion, Philips HD Natural Motion, etc. on/off on your flat-screen TV. It might be that a 1fps speed-up (cinema) doesn't eliminate flicker the way interpolating an artificial frame (TVs) does. Could be the camera shutter baked it into the film... link
– isync Jan 5 '12 at 15:30

@Scivitri The standard for movie film is 24fps wherever you are. The only place where this varies is TV, not movie theatres using film. Europeans play film movies at 25fps on TV (to be compatible with TV standards built around the 50Hz mains electricity signal). The US doesn't, as TV and electricity run at 60Hz and there is no easy way to keep 24fps film in sync.
– matt_black Apr 26 '14 at 13:15

TV presentation of movies in Europe will always run at 25fps to synchronise with the TV frame rate (anything else would require expensive conversion or incur terrible flicker as the slightly different frame rates interfered with each other). The difference is barely noticeable, but the movie will play with a 4% shorter run time and, for a non-digital soundtrack, a 4% higher pitch for the sound. In the USA, where TV is 30fps, more complex solutions such as 3:2 pulldown are required.
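To put numbers on that 4% figure, here is a minimal sketch (the two-hour runtime is a hypothetical example, and the pitch conversion assumes standard equal-temperament arithmetic):

```python
import math

# PAL speed-up: 24fps film played back at 25fps.
speed = 25 / 24                      # playback speed factor, ~1.0417

runtime_factor = 1 / speed           # 0.96 -> runtime shrinks by 4%
pitch_shift = 12 * math.log2(speed)  # ~0.71 semitones sharper

film_minutes = 120                   # hypothetical two-hour feature
print(f"Speed factor: {speed:.4f} (+{(speed - 1) * 100:.2f}%)")
print(f"New runtime:  {film_minutes * runtime_factor:.1f} min "
      f"(saves {film_minutes * (1 - runtime_factor):.1f} min)")
print(f"Pitch shift:  +{pitch_shift:.2f} semitones")
```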

There is no good reason for a projector in a movie theatre to do this, so why would they? Digital projectors, however, have no physical limitation on the frame rate they use, and it would be easy to project at different rates if there were a good reason to do so. But you wouldn't see big benefits unless the rates were much faster, and many of these benefits would only appear if there were more data to project (it isn't entirely obvious that projecting the same frame twice leads to observable gains, for example, and TV sets that upconvert 25 or 30fps signals also introduce new and sometimes annoying artefacts in the perceived image). Some modern movie directors (I believe Peter Jackson and James Cameron are pioneering the idea) are planning to shoot digitally at 48fps, which should look a lot better than current movies if projected at 48fps.

Yes, Jackson is opting for 48fps, Cameron for 60fps. Let's hope Jackson means 48fps total, spending half of the 48 on each eye, meaning we keep "movie motion"; Cameron's idea, even when split in two for 3D, would mean "TV motion". Except the movie motion comes primarily from the shutter, of which I technically don't know enough. There was a reason why Trumbull dropped 60fps when Showscan merged into IMAX...
– isync Jan 5 '12 at 15:18

We don't have 3:2 pulldown; the film (audio and all) is simply sped up by 25/24, or about 4 percent. The same thing is done for television, where 2:2 pulldown is then applied to convert it to the interlaced TV format. At least that is how things are done in Australia.
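As a minimal sketch of what 2:2 pulldown means here (illustrative frame/field bookkeeping only, not any real broadcast API):

```python
# 2:2 pulldown: each sped-up film frame becomes exactly two interlaced
# fields (one with the odd scan lines, one with the even), so
# 25 frames/s -> 50 fields/s, matching 50Hz interlaced TV.
def two_two_pulldown(frames):
    """Map each progressive frame to an (odd, even) field pair."""
    fields = []
    for frame in frames:
        fields.append((frame, "odd lines"))
        fields.append((frame, "even lines"))
    return fields

film_frames = ["A", "B", "C"]  # hypothetical consecutive film frames
for field in two_two_pulldown(film_frames):
    print(field)
# ('A', 'odd lines'), ('A', 'even lines'), ('B', 'odd lines'), ...
# Compare 3:2 pulldown, which alternates 3 and 2 fields per frame to
# fit 24fps film into 60 fields/s NTSC.
```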

The reason for the frame rates is a historical one. Basically, it is easier to rely on the AC supply as a timing source than to create your own. The US uses a 60Hz AC supply, so it is easier there to use multiples of 6 for timing (hence 24, 48, 60fps). Countries which followed the European electrical standards use a 50Hz AC supply, so it is easier to use 25, 50, and 100fps. Because of this, film and television standards in both areas developed around what was easiest to display (and thus made TVs and projectors cheapest to build).

There is also a second reason why 25fps is preferred: the light globe in the projector may flicker at 100Hz (twice the 50Hz mains frequency), and running at 24fps would produce a noticeable variation in image brightness. 25fps allows the shutter to stay synchronised with the AC supply.

This would only apply to older film projectors, though, as newer equipment can use high-frequency drivers for the light source, as well as stepper motors and microprocessors for control, allowing any frame rate you want. Digital projectors are restricted only by the response time of their base medium (DLP or LCD panel) and their data-processing capability.
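To make the lamp-flicker point above concrete, here is a minimal sketch (assuming an idealised sinusoidal 100Hz lamp and an instantaneous two-blade shutter; real lamps and shutters are messier):

```python
import math

def brightness_at_openings(shutter_hz, lamp_hz=100.0, openings=12):
    """Sample a sinusoidal lamp's brightness at each shutter opening."""
    return [
        0.5 + 0.5 * math.sin(2 * math.pi * lamp_hz * n / shutter_hz)
        for n in range(openings)
    ]

# 24fps with a two-blade shutter opens 48 times per second. 100/48 is
# not an integer, so each opening catches the lamp at a drifting phase
# and the brightness wanders from frame to frame (a low-frequency beat).
print(["%.2f" % b for b in brightness_at_openings(48.0)])

# 25fps with a two-blade shutter opens 50 times per second. 100/50 = 2
# exactly, so every opening lands on the same lamp phase and the image
# brightness stays steady.
print(["%.2f" % b for b in brightness_at_openings(50.0)])
```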