In digital terms, what is the theoretical resolution of 35mm film and 70mm IMAX film? Just curious . . . I know the best we can see is 4K because the bottleneck is the digital projector, which is currently only 4K.

In terms of print projection, 35mm is sub-4K, and depending on the actual print and projector circumstances, sub-HD. A typical cinema projection of a distribution film print will offer no better resolution than an HD digital projection, probably less.

It's easier to just talk about the resolution of the original negative and avoid bringing in the resolution of various printing and projection methods, etc. Red has tested Super-35 and generally found it to be, if I recall, 3.2K or 3.5K, something like that.

You could therefore say that if a 24mm wide piece of film negative resolves 3.2K, for example, a 36mm wide piece of film (VistaVision) would resolve 4.8K, a 52mm wide piece of film (5-perf 65mm Super Panavision / Todd-AO) would resolve 6.9K, and a 70mm wide piece of film (15-perf 65mm IMAX) would resolve 9.3K. However, this ignores some real-world issues, like the fact that older medium-format lenses used on large format movie cameras have a lower MTF than modern 35mm cine optics (the larger negatives don't need lenses with high MTFs: if you have more millimeters overall, you don't need to resolve as many lines per millimeter...)
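The scaling argument above can be sketched in a few lines. This is just the proportionality claim from the post, not a measurement: it assumes resolving power per millimeter is constant and takes the quoted Red Super-35 figure (3.2K over 24mm) as the baseline.

```python
# If resolving power per millimeter is constant, horizontal resolution
# scales linearly with negative width. Baseline: 3.2K over 24mm (the
# Red Super-35 figure quoted in the thread).
BASELINE_WIDTH_MM = 24.0
BASELINE_RES_K = 3.2

def resolution_k(width_mm: float) -> float:
    """Horizontal resolution (in 'K') for a given negative width."""
    return BASELINE_RES_K * width_mm / BASELINE_WIDTH_MM

formats = {
    "Super-35 (24mm)": 24.0,
    "VistaVision (36mm)": 36.0,
    "5-perf 65mm (52mm)": 52.0,
    "15-perf IMAX (70mm)": 70.0,
}
for name, width in formats.items():
    print(f"{name}: {resolution_k(width):.1f}K")
```

Running it reproduces the 4.8K / 6.9K / 9.3K figures in the post.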

If you really want to be crude, you could say that you lose maybe half the resolution of the negative once it is printed through dupe elements and thrown onto a theater screen, which is why 2K projection seems on par with the best 35mm print projection, and 4K projection would be similar to 70mm projection, but it therefore also means that IMAX digital projection should be at least 6K...

Now don't confuse measurable resolution with optimal scanning, mastering, and archiving resolution -- if 35mm film really resolves 3.2K, then you really should be scanning it at more like 4K to 6K to avoid aliasing, which is why most people round things off to 4K as being ideal for posting 35mm photography... but perhaps 6K would be better for scanning, and then you should finish at 4K.

Kodak always told me that original 35mm camera negative after 2000-2001 had equivalent resolution (based on line pairs per millimeter and MTF) to right around 6K. But they also admitted that, after going through a traditional IP -> IN -> release print stage in the lab, it goes down to 2K. Given the weave and lack of pin-registration in most projectors, plus the less-than-optimum focus, you could easily wind up with well below 1080 resolution.

I agree with Mr. Mullen that 4K is optimal for the real world. Note also that many (if not most) Imax films involve a 35mm uprez process, bumping everything up to about 6K for film out. They're also using CRT recorders for Imax recording, which have their own issues.

Real Imax productions, like the 65mm sequences shot for Dark Knight, are very hard to beat, especially when you're talking a Vision Premiere print struck right off the camera negative. That's about as good as it gets for photochemical acquisition, better than any existing digital camera or projector on earth. It's a good goal to shoot for.

Since digital images can resolve only about 0.707 of their pixel count, the best you get is a visual resolution without aliasing of:

1828x0.707 = 1292

988x0.707 = 698

These correspond to tests showing that viewing 35mm film in theaters yields about 1280x720, +/- 25% or so, of visible resolution from test charts read off the movie screen.
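The 0.707 figure (essentially 1/sqrt(2), often cited as a Kell-type factor) applied to the raster above works out as follows. The 1828x988 raster is assumed here to be the projected image the poster is working from:

```python
# Alias-free resolvable detail of a sampled image is taken here to be
# ~0.707 of its pixel count, per the post.
KELL = 0.707

def resolvable(pixels: int) -> int:
    """Alias-free resolvable lines for a given pixel count."""
    return int(pixels * KELL)

print(resolvable(1828))  # 1292
print(resolvable(988))   # 698
```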

The peak resolution of the film does not matter, since the resolution of film varies with exposure and density and can be much lower in the highlights and shadows than Kodak's specs would make you think; their measurement methods do not reflect what can be seen at projection contrast as projected.

If you take the image area and use a typical f/1.7 projection lens having 20 lp/mm (40 lines/mm) at high MTF you get:

0.864 in x 0.467 in = 21.9mm x 11.8mm

21.9 * 40 = 876 lines per image width (* 1.4 = 1226 pixels needed)

11.8 * 40 = 472 lines per image height (* 1.4 = 660 pixels needed)

Those values are close to what you can see off a 4th generation print at high MTF under good conditions.
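The lens-limit arithmetic above can be sketched directly: 20 lp/mm at high MTF means 40 distinguishable lines per millimeter; multiply by the aperture dimensions, then by ~1.4 for the sampling overhead discussed later in the thread. The aperture and factor values are the poster's, not mine:

```python
# 20 lp/mm -> 40 lines/mm; ~1.4 pixels needed per resolvable line
LINES_PER_MM = 20 * 2
SAMPLING_FACTOR = 1.4

def lines_and_pixels(dim_mm: float) -> tuple[int, int]:
    """Resolvable lines across a dimension, and pixels needed to sample them."""
    lines = round(dim_mm * LINES_PER_MM)
    return lines, int(lines * SAMPLING_FACTOR)

print(lines_and_pixels(21.9))  # image width:  (876, 1226)
print(lines_and_pixels(11.8))  # image height: (472, 660)
```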

In today's theaters there is no projectionist keeping the projector in focus throughout the show, as there was in the past. A 35mm projection with an f/1.7 lens needs its focus adjusted every 8 minutes or so to maintain maximum lens resolution; otherwise the focus can drift even lower than 20 lp/mm at high MTF.

35mm film is a bit better than SD, which is why VistaVision and 65/70mm were developed.

Those would be lines of actual visible resolution at high MTF, not scanner pixels that don't relate to the image itself, when the print is made by contact or in a good optical printer. Both contact and optical printing have their own problems, as only someone who has worked with them can know. It's much harder to print large film sizes because it's harder to hold the film flat enough over that larger area.

==

In previous times the projection lens was a 4-element design, not a 6- or 7-element one as used today, and would have even lower than 20 lp/mm resolution at high MTF in the corners of the frame because of the curved field. Older theaters were longer and had smaller screens, but the 4:3 aspect ratio projection had more image height than today's 1.85:1 widescreen prints. Some theaters still use older projection lens types, which acted as a low-pass filter to reduce the grain in the films of the past.

Using figures that represent less than 2% MTF as "resolution" gives an impression of much more visible resolution than one would expect to see. An MTF value of 30% is a more useful figure, I think, as far as being sure that the audience can see details of that contrast.

Another factor is that moving images printed film-to-film did not have a fixed raster like digital cameras and projectors, so the eye averages grain from frame to frame. When you watch movie film projected at speed, your eyes pick out details in the grain and blur as the frames weave by, so you can see a bit more detail in moving images than in a freeze frame.

If someone tells you that their projection lens can project 180 lp/mm at high MTF off a film print onto a movie screen for more than a few seconds without adjusting the focus, their experience should be considered suspect. The film cannot stay in the focal plane within those limits.

The dye bleed in film stocks at higher exposure levels can reduce the 80% MTF resolution to 10 lp/mm or less. The resolution of film varies with exposure, density, processing, and the contrast of adjacent image points (edge effects). If you step contact print without an air gap you get Newton's rings (I think Kodak just changed the surface finish on their color print stocks to reduce Newton's rings; very smooth print stocks show them more).

To convert film resolution line values to the raster pixels needed, you have to multiply by at least 1.414 (such as scanning at 6K to try to capture 4K off the film), because of the anti-aliasing filters needed in a scanner and the general aliasing issue with all digital images that use a grid pattern.
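A minimal sketch of that sqrt(2) oversampling rule, assuming the target is a 4K (4096-pixel-wide) capture off the film:

```python
import math

# Oversample by sqrt(2) (~1.414) so the scanner's anti-aliasing
# filtering doesn't eat into the detail you're trying to capture.
OVERSAMPLE = math.sqrt(2)

def scan_pixels_needed(film_res_pixels: int) -> int:
    """Minimum scan raster width to safely capture a given film resolution."""
    return math.ceil(film_res_pixels * OVERSAMPLE)

# Capturing "4K" (4096 px) of film detail calls for roughly a 6K scan:
print(scan_pixels_needed(4096))  # 5793
```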

Well, I am all for digital, but I have to say that I am not very happy with 2K projections. Sitting at the ideal position in a cinema, which is normally in one of the middle rows, a 2K projection is, to my eye, not sharp enough. As soon as titles blend in it gets even worse. The problem is not the 2K footage but the 2K projection. 2K footage would benefit well from being projected in 4K, smoothing out the stair-stepping and making the pixels on the screen less visible. I am really looking forward to cinemas adopting better projectors.

Back to the question of the topic starter: at the end of the nineties I worked in high-end scanner development and saw an awful lot of 35mm scans. Our scanner had an optical resolution of 5080 dpi, resulting in a 4360 x 3720 px image (Super 35, 2.35:1). But you could see the film grain very well at that magnification. From that experience I would say that 35mm is not able to give you any more resolution beyond 4K.
The highest resolution I ever got was from a 4 x 5 still image shot under perfect studio conditions: very fine grain and detail up to the full resolution the scanner was able to deliver.

Good point. I've seen plenty of 2K screenings at my local cinema (not sure what kind of projector they use), but another one near me just upgraded to the Sony 4K projectors (that Sony is giving away like candy) and the difference is definitely noticeable, even though both are running 2K DCPs. I'd kill to be able to send some true 4K footage through those things.