Someone must answer this!

OK, this has been bugging me for some time, and recently it came to my attention again and now I am just dying for an answer, stupid as it might seem. Doesn't really fit any of the other forums so I'm posting it here.

It's about film quality. There is something in the film quality of a movie or regular TV show that is much different from that of, say, a newscast or (for all 4 of you gals out there) a soap opera. I can't really say WHAT it is, but surely everyone must know what I'm talking about! So, my question is, what is it that accounts for this qualitative difference? Surely something about the film/recording method, but what exactly?

The lens used...
The lighting...
Whether it is analog or digital...
The properties of the film used (light sensitivity, color saturation etc)...
The developing techniques...
The shutter speed...
The list goes on and on and on.

Originally posted by one_raven The lens used...
The lighting...
Whether it is analog or digital...
The properties of the film used (light sensitivity, color saturation etc)...
The developing techniques...
The shutter speed...
The list goes on and on and on.

Hm, if it comes down to so many variables, why are there only 2 or 3 recognizable types? Industry standards? High JNDs?

Originally posted by hypnagogue Come to think of it, that's probably not true-- I can usually pinpoint what decade a movie or TV clip comes from by noticing differences in film quality. Or is that just due to film degradation?

I know exactly what you mean. When I was about twelve, I remember telling my dad that I could tell whether a local news report was live or videotaped. He was quite surprised and tested me; I was right. Really, this was quite obvious, since the live broadcast images were crisper and brighter than the recorded ones... due, I'm sure, to the limitations of recording quality in the late 60s/early 70s.

Without googling...I think that the following is approx. true:
Very early TV used movie film and cameras, and then several variations on film size and shutter speed, each of which yielded unique image characteristics, e.g. graininess, range of contrast, effective number of shades of gray [I'm really guessing on this one, but this is what I perceive], and probably others. I know that for a short time in the very early years of TV, something called a kinescope was used. I think it used movie cameras to re-record the TV image through a lens, resizing the picture in the process, or something like that. This method produced a very distinctive grainy, low-contrast image. Anyway, due to the type of film used, almost all of this material was lost long ago. I have heard some big star from the early years of TV lamenting this fact.

Then we see the early days of video, which are really obvious in the programming of the late sixties. I'm not sure exactly how to describe it, but I think what I see is a greater range from white to black, but with fewer shades of gray in between. In other words, video black is blacker than film black, and the same with white, but there are not so many steps in between as in film. This is most of the stuff that was airing while I grew up – the early pre-color sitcoms like Bewitched, I Dream of Jeannie, or Gilligan's Island; as opposed to re-run shows from the 50's like Father Knows Best, The Nelsons, Leave It to Beaver, and so on.
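Just to put a rough sketch of numbers on that range-versus-steps idea (everything below is invented purely for illustration, not measured from any real film or video), here is a toy Python comparison: one gradient that never quite reaches pure black or white but keeps every intermediate tone, and one that spans the full range in only a handful of coarse steps.

def quantize(values, steps):
    # Snap 0.0-1.0 gray values to a fixed number of evenly spaced levels.
    return [round(v * (steps - 1)) / (steps - 1) for v in values]

# A smooth ramp of 256 gray values: 0.0 = black, 1.0 = white.
ramp = [i / 255 for i in range(256)]

# "Film-like" (hypothetical numbers): compressed toward the middle, so it
# never hits pure black or pure white, but all 256 intermediate tones survive.
film_like = [0.05 + 0.90 * v for v in ramp]

# "Early-video-like" (hypothetical numbers): the full black-to-white range,
# but only 16 distinguishable steps along the way.
video_like = quantize(ramp, steps=16)

print("film-like:  %.2f to %.2f, %d distinct tones"
      % (min(film_like), max(film_like), len(set(film_like))))
print("video-like: %.2f to %.2f, %d distinct tones"
      % (min(video_like), max(video_like), len(set(video_like))))

The only point is that range and number of steps are separate properties, which is roughly the distinction I'm describing above.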

From there we see early color and the increasing quality of video as a date stamp. Also, in addition to the factors mentioned by one_raven, there appear to be popular or indicative styles in the sets used or the location shots chosen [for example, using the back lot of Universal Studios as opposed to traveling to Texas for a location shot], filming techniques, sound quality, etc., on top of the effects of time on the quality of the recorded information, be it film or magnetic tape. The number of plays takes a toll also.

Anyway, these are my observations and guesses. I know some of this is definitely true but I have never researched the history in detail.

I have made a bit of a hobby of trying to distinguish between these differences for most of my life. I have never heard anyone else mention it!!!

Edit: My guess is that most TV programs went from B&W movie film to the standard B&W TV film of the 50's, then B&W video, followed by color film, and then color video.

Originally posted by hypnagogue It's about film quality. There is something in the film quality of a movie or regular TV show that is much different from that of, say, a newscast or (for all 4 of you gals out there) a soap opera.

The difference, in terms of the question as you have asked it here, is that movies and regular TV shows are filmed, while newscasts are live and soap operas are videotaped.

Regarding film alone, there is no limit, as people have pretty well pointed out, to how much the quality of a film image can be manipulated to emphasize different characteristics. Different "looks" become popular and seem to stay in style for about ten years. Film reached an all-time low in the late 60s/early 70s, with a cheap and sloppy look that is just about unbearable to watch.

Thanks for all your info, Ivan. Very insightful! I have such a hard time putting my finger on exactly what accounts for what I can so obviously distinguish as different.

Originally posted by Ivan Seeking Then we see the early days of video, which are really obvious in the programming of the late sixties. I'm not sure exactly how to describe it, but I think what I see is a greater range from white to black, but with fewer shades of gray in between. In other words, video black is blacker than film black, and the same with white, but there are not so many steps in between as in film. This is most of the stuff that was airing while I grew up – the early pre-color sitcoms like Bewitched, I Dream of Jeannie, or Gilligan's Island; as opposed to re-run shows from the 50's like Father Knows Best, The Nelsons, Leave It to Beaver, and so on.

That is a great way of explaining it. I totally agree there. I recall seeing a very early episode of The Twilight Zone (possibly the first) where they used something like that TV format you describe, but even more exaggerated away from the traditional B&W film look. It looked very, very odd! Almost like a home recording. It actually made a huge difference in the viewing experience, especially for a mystery-drama-ish show that is intended to give you the heebie-jeebies. The whole time I found it difficult to get into the episode because I couldn't get over how it looked as though someone had put it together using low-budget/unprofessional filming techniques. It wasn't something blatantly bad like graininess or blurriness (the picture was very clear and crisp, in fact, probably due in part to the brightness) -- just something that somehow gave me a very un-cinematic, un-transporting kind of feel. Funny how subtle film qualities like that can do so much.

Originally posted by Ivan Seeking I have made a bit of a hobby of trying to distinguish between these differences for most of my life. I have never heard anyone else mention it!!!

For me it really stands out whenever I watch an old program, although it winds up being the kind of Seinfeldian observation that you always experience in everyday life but kind of put in the file drawer and forget about.

The funny thing is that even when I watch things from the early/mid 90s now, I can easily tell the film quality apart from modern shows, whereas I used to think of those as modern-looking, or at least as having nothing conspicuously noticeable about their film quality. It makes me wonder to what extent film quality is really changing/improving, and to what extent the film quality of older material just changes/degrades over time.


Originally posted by one_raven The lens used...
The lighting...
Whether it is analog or digital...
The properties of the film used (light sensitivity, color saturation etc)...
The developing techniques...
The shutter speed...
The list goes on and on and on.

Actually most of these apply to still photos.

I know what you mean, Hypnagogue; I've wondered the same myself. I would say that it has to do with the type of camera used for the cinematography and maybe a few other things. If no one else knows, I know a film director I could ask, but he doesn't work much with TV.

Film is not interlaced, while TV is. For example, PAL television uses 50 fields per second interlaced, which is equivalent to 25 full video frames per second. Also, soaps usually do on-site recording. After a while this will sink in, and you can tell very easily whether you are watching a soap opera, a newscast, or whatever.
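To illustrate the field/frame arithmetic with a rough sketch (the tiny four-line "picture" below is made up purely for the example), in interlaced video one field carries the odd scan lines and the next field carries the even scan lines, and the two weave together into a single full frame, which is why 50 fields per second works out to 25 frames per second:

PAL_FIELD_RATE = 50                     # fields per second (PAL)
PAL_FRAME_RATE = PAL_FIELD_RATE / 2     # two fields weave into one frame -> 25 fps

def weave(odd_field, even_field):
    # Interleave two fields (lists of scan lines) into one full frame.
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)    # lines 1, 3, 5, ... come from one field
        frame.append(even_line)   # lines 2, 4, 6, ... come from the other
    return frame

odd_field = ["line 1", "line 3"]    # in real interlaced video, this field...
even_field = ["line 2", "line 4"]   # ...and this one are captured 1/50 s apart

print(weave(odd_field, even_field))  # ['line 1', 'line 2', 'line 3', 'line 4']
print(PAL_FIELD_RATE, "fields/s =", int(PAL_FRAME_RATE), "full frames/s")

Because the two fields are captured a fraction of a second apart, interlaced video also renders motion differently from film's single exposure per frame, which is part of that "live" look people associate with video.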


Originally posted by zoobyshoe And to motion pictures. One_Raven is quite right. I've taken both still photography and motion picture photography courses and the same techniques for manipulating the qualities apply to both.

By this I take it you are referring to my inability to stick with one field of knowledge, which causes me constantly to meander from one subject to the next, ending up never accomplishing anything in particular in any one field. Yes, it is kind of amazing.

Originally posted by zoobyshoe By this I take it you are referring to my inability to stick with one field of knowledge, which causes me constantly to meander from one subject to the next, ending up never accomplishing anything in particular in any one field. Yes, it is kind of amazing.

Originally posted by hypnagogue um.. jack of all trades, master of none?

No no! I was looking for the positive slant...especially given Zooby's way with words.

Really though, I make a living as a generalist. I can't be an expert at every new piece of technology rolling off the line. In fact, no one is... often including the manufacturer's experts. I managed to take a physics degree and a jack-of-all-trades background and turn it into a real job! Electrical-mechanical-engineer-programmer-designer-industrial-physics consultant.

You always add consultant on the end as a generalist!

I would love to hear Zooby's interpretation of his combined set of skills.