Since this is a general 'chit-chat' room, here is a topic for everyone. I constantly read opinions in the forum about various programs being 'great HD pictures', followed by a post which says it was the 'worst HD I have ever seen'. For example, I have been told I run my sharpness up too high, although that is the way I like it. Since I am not a video quality expert, perhaps one or more of the pros in this group could take a little time here to tell me what to look for when I watch an HD show. What do you guys watch for in terms of color purity? Edge definition? Sharpness? Contrast? Color intensity? Brightness? Black purity? Does anyone want to take a shot at an HD primer for those of us who are not in the video business? It would really help the folks out there who are still shopping for the right HD set to know what they should look for (other than the proverbial 'whatever you like best') when comparing various sets.

The main reason people observe differences in HDTV quality has to do with the source of the program. The bulk of programming we have today was actually shot with a variety of film cameras and converted to videotape. Unfortunately, we see only the end product, after there has been much processing, including many compression and decompression schemes that add to quality degradation. Don't forget that the GIGO (garbage in, garbage out) concept is at play too: if the movie was poorly lit or badly focused, the transfer can be no better than the source. Any further processing will take its toll on the image quality as well.
To determine true HDTV, I choose to qualify the video in only these categories:
1. Scan frame size of 16:9. This will include any letterboxed movies.
2. 1080i or 720p.
3. The video source was maintained component digital on all tape formats.

If the video was maintained component digital, then the color integrity will remain, with sharp edges on the chroma as well as more intensely saturated color, since the digital component color space is greater than that of the same picture in analog composite video.

Picture detail- The maximum horizontal resolution of HDTV is 1920 pixels; however, not all images will have this level of detail yet still be classified as "HDTV". The degree of detail loss in the HDTV editing, dubbing, and tape formats used affects the PQ of one program over another, in addition to the source quality variance. For example, take a high-resolution film and transfer it to D5 videotape for the base master. This format will record and preserve 1920 pixels in the 1080i scan rate HDTV format. Then the distribution dub is done to HDCAM, because that is what the cable operator or DBS service happens to request. Automatically, the horizontal resolution drops to 1280 pixels in the 1080i format. All else being equal, the station airing the D5 will have slightly better PQ than the one airing the HDCAM. This also assumes you have a way to see the difference. The reality is that most people have HDTV monitors that top out at between 1000 and 1300 pixels of resolving power, so you probably won't see the difference between HDCAM and D5. What you may be seeing are other signal losses caused by compression of the video signal. It is my opinion that comparing one movie to another and blaming the poor focus or bad lighting of one on compression is just wrong. I have seen many do this.
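To make the weakest-link point above concrete, here is a quick back-of-the-envelope sketch. This is my own illustration, not from any broadcast spec; the pixel counts are just the ones quoted in the post, and the display figure is a hypothetical number from the 1000-1300 range mentioned:

```python
# The effective horizontal detail you see is limited by the weakest
# link in the chain: source, tape format, and display resolving power.
# Numbers are illustrative, taken from the post above, not authoritative specs.

def effective_h_res(chain):
    """Effective horizontal resolution is the minimum across all stages."""
    return min(chain)

d5_chain    = [1920, 1920, 1100]  # film transfer -> D5 master -> home display
hdcam_chain = [1920, 1280, 1100]  # film transfer -> HDCAM dub -> home display

print(effective_h_res(d5_chain))     # 1100: limited by the display, not the tape
print(effective_h_res(hdcam_chain))  # 1100: also limited by the display here
```

With a 1100-pixel display, both chains bottom out at the same number, which is exactly why most viewers could not tell D5 from HDCAM; only on a display resolving more than 1280 pixels would the dub's loss become visible.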

Sharpness adjustments- This picture adjustment has a precise calibration that can be made with a proper test pattern. That said, no one can argue with a personal preference to see your picture with an excess of sharpness added; just understand that the excess is an artifact, not the way the picture was designed. Heck, my father-in-law chooses to watch his TV with the color saturation all the way up. The picture is iridescent, but he likes it that way. Too much sharpness will add a white edge to black text on a gray background; the proper sharpness adjustment is to back off to the point where the white edge just disappears.

However, that caption does raise the question: where exactly are we supposed to debate the controversial topics? If the answer is the Hardware, Programming, and Recorders forums, then what exactly is this forum for?

Not that this thread was controversial to begin with, but I'm just curious.

I was just looking for some opinions on what you guys look for in your PQ, so I might be able to look at my own setup more critically. Don's suggestion to set the sharpness with black letters on a gray background until there is no white edge was exactly the sort of thing I was looking for. I am well aware that the source quality, personal taste, and the quality of our home systems have a LOT to do with it.

At the same time, I know people have their sets ISF'd, and they are not just getting it set up so that it 'satisfies the eyes and taste of the guy doing the calibration'. There must be some basic standards that you all look for, and then matters of personal taste allow you each to deviate from the baseline. I am in the business of designing satellite communications links and satellite hardware, I am not a video producer. My job is just to get the data to the receiver with a low level of bit errors. I just thought it would be fun to 'chit-chat' about what you guys look for first when you see a picture on a new set, or on your own set. If that isn't what this room is for, then I guess I really did miss the point.

The things that make a good image, that are related to the display and not the source material, are:

1. The gray scale is correct
2. The color balance is correct
3. The geometry and convergence are correct
4. There are no artificially imposed artifacts from sharpness or SVM
5. The brightness and contrast are correct, and the display can hold black at black well.
6. If it's being scaled and/or deinterlaced, those steps add no obvious artifacts.

Get all of that set up correctly (which a good calibration will do for you, except for the second part of #5) and the image will be as good as your set and the source material allow.

No one has yet said anything about compression artifacts. I have found those to be very noticeable and annoying picture impairments at times. The edges of fast-moving objects should not become busy with small blocks. Blocks should also not be noticeable during dissolves or rapid cuts. Gradual gray-scale or color-intensity ramps should not have visible contour lines due to an insufficient number of bits being carried. These can be very visible picture defects if too much compression is used, or in some cases with concatenated compression/decompression cycles. There is also the problem that not all compressors perform equally well, even at the same output data rate.
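The "contour line" defect on gradual ramps is easy to demonstrate numerically. A rough sketch of my own (the bit depths are just illustrative): quantizing a smooth gray ramp to too few levels turns it into visible steps, i.e., banding:

```python
# Illustration of banding/contouring from insufficient bit depth:
# a smooth 0..1 gray ramp quantized to fewer levels becomes visible steps.

ramp = [i / 255 for i in range(256)]  # smooth 8-bit-style gray ramp

def quantize(values, bits):
    """Snap each value to the nearest of 2**bits evenly spaced levels."""
    levels = 2 ** bits - 1
    return [round(v * levels) / levels for v in values]

smooth = quantize(ramp, 8)  # 256 levels: steps far too small to see
banded = quantize(ramp, 4)  # 16 levels: each step is a visible contour line

print(len(set(smooth)), len(set(banded)))  # distinct levels: 256 vs 16
```

At 16 levels the eye sees each jump as a band across the gradient; heavy compression (or repeated compress/decompress cycles) produces the same effect by throwing away the low-order detail that made the ramp smooth.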

Here is a simple test for HDTV quality. Find the distance from your screen at which DVD looks best. Now look at HDTV and move three times closer to the screen. It should look just as sharp as the DVD, only three times bigger.
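The arithmetic behind that rule of thumb, as I understand it (my own numbers: DVD's roughly 720 horizontal pixels versus HDTV's 1920, which is actually a ratio of about 2.7 rather than exactly 3; the screen width and viewing distance are hypothetical):

```python
# Why "move closer" works: what the eye resolves is the angle each pixel
# subtends, which shrinks with distance and with pixel count.
import math

def pixel_angle_arcmin(screen_width_in, h_pixels, distance_in):
    """Angle one pixel subtends at the eye, in arc-minutes."""
    pixel_width = screen_width_in / h_pixels
    return math.degrees(math.atan2(pixel_width, distance_in)) * 60

SCREEN = 48.0  # hypothetical 48-inch-wide screen

dvd = pixel_angle_arcmin(SCREEN, 720, 90.0)               # DVD viewed from 90"
hd  = pixel_angle_arcmin(SCREEN, 1920, 90.0 * 720 / 1920) # HD at ~1/2.7 the distance

# Per-pixel angles match, so the HD image looks just as sharp up close.
print(round(dvd, 2), round(hd, 2))
```

Because the per-pixel angle is the same in both cases, HD viewed from roughly a third of the DVD distance looks equally sharp while filling about three times as much of your field of view.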

Frank

Guess I should put in a good word for live OTA HDTV broadcasts. As Don points out above, there's considerable 'shaving' of higher-resolution details when certain tape equipment is involved, so live HDTV broadcasts should deliver the best quality. I've found that's usually the case; clearly, there's a range of equipment and lighting conditions involved with various live broadcasts that influence quality. It's crude, but I use a recollection of live HDTV images to help judge others.

Also mentioned above is the limitation of some equipment in displaying the highest HDTV resolutions. So far it appears that most HDTV programming, because of the tape limitation, is within the range of most home hardware. I find it surprising, though, this far along in HDTV delivery, that there are no widely available independent test reports listing the resolution (and other) capabilities of gear. I hope to be proven wrong, but the videophile magazines seem to avoid this critical area of testing. The if-it-looks-good-buy-it philosophy is prominent, even if you wind up buying a 'looks-good' set with half the detail-resolving ability of other models. (A December post by forum member LB indicates this is a possibility, depending on how much credence you want to give his model comparison data.) -- John

Frankly, I've been disappointed with CBS PrimeTime HD broadcasts. These are shows that have been shot in 35mm and transferred to HD 1080i for broadcast. I find the picture to be dark, with reddish overtones, and lacking in detail. Live video on CBS HD, on the other hand, has been terrific. The Tonight Show on NBC HD is of reference quality (in spite of the lousy sound!), and WETA (PBS) consistently broadcasts stunning 1080i HD!

So let me get this straight. It seems that you and a lot of people here feel that "quality" = live video. You all seem to feel that a live video picture is the end-all and be-all and the basic reason for the existence of HD, regardless of what the program material itself is or what it's trying to present. As someone who has worked in the television and film industry for many years, I find it really disheartening that people are so taken with technology and the possibilities of that technology that it becomes more important than creating visual worlds, moods, and storytelling. We work very hard to create imagery that helps to tell a story on film, or whatever medium is used. The moods that are set are not made to convey reality; they are meant to create a world in which the story exists. If a show like "The X-Files" looked like reality, it wouldn't work. If a show like "Ally McBeal" looked like The Tonight Show, it wouldn't work. Art is about setting moods and conveying emotion. It is generally accepted in this town that the images we create on film, for the most part, successfully do that. It's not about showing extreme detail and having everything in the world in focus, and it's not about having smooth motion, although it seems that a lot of you feel that this is the only thing you want to see. Composition, artful lighting, letting you see only what the storyteller intends for you to see - that's what dramatic narrative programming is really about. Unintentional focus problems, washed-out colors, milky blacks, and extreme graininess when they're not intentional - these are the true measures of picture quality, not whether the screen looks like it's showing you a documentary about the Grand Canyon (appropriate for a travelogue, but not appropriate for, say, CSI).

It occurs to me as I read the discussion of live vs. film, lighting, etc. that this is a bit like the older discussions that audiophiles often get into. When it comes to sound reproduction, electric instruments such as organs and guitars are really hard to use to judge a sound system, since you don't really know what they were supposed to sound like live in the first place. I usually judge sound equipment I buy by how it sounds reproducing a piano, drums, or a violin, since I know what these sound like live and 'unamplified'. In terms of video, it would seem easier to judge my equipment during a live broadcast of an event I have seen in person, since I know what it should look like. Leno might fit this criterion (although I have never seen the show live), since I know what people in a room should look like. PBS nature/documentary shows (such as the 'Grand Canyon Railroad' from last year, which is my own 'personal best' HD experience) are another example. When I watch a film transfer, I know that the director may use lighting and other tricks to set a mood rather than create the clearest image (I think of the film 'Dick Tracy' as a great example). In that sense, although a film may actually take more effort to stage and record, it may not be the 'pure' way to judge the quality of the equipment reproducing it. There are really two different issues: the quality of the source material and the quality of the reproduction equipment. The best reproduction equipment should not 'color' or change the source material in any way, but it is often difficult if not impossible for me to know what the source material really should look like. I guess that is a major reason why 'personal taste' may not be the best way to select your equipment. I must admit that when I first posted the question I had not thought of all of these issues. It seemed like it would be easy to develop criteria to judge PQ.

Since I initially raised the point about HDTV quality and live ATSC broadcasts, I should clear up what I meant. Agree completely about all the points made about various techniques to bring artistic quality to productions, film or otherwise. Suspect most realize I meant the ability of HDTV transmissions to deliver enough picture details to convey what the director wanted to present. Live broadcasts, if everything is going correctly, so far seem to deliver the best quality (picture detail here). But if HBO uses technology that captures and delivers only a fraction of what a director has put on film and what HDTV is capable of, I'd say that's a significant loss of quality. -- John

Let me further explain: I believe that CBS is doing something wrong, because film transferred to HD on other channels looks great! Actually, X-Files, one of the darkest-lit shows being broadcast today, looks terrific in 480p (yeah, I wish it were 720p!), with a sense of depth and right-on colors. Film transfers on ABC/PBS/SHO/HBO meet my highest expectations. I fully realize the different look video provides, and the debate as film is slowly replaced in Hollywood will only heat up...

Again, my point is that CBS does not do a good job with its prime time HD broadcasts. And I'm scratching my head over this, because the few movies I've seen in HD on CBS look great.

First of all, networks have absolutely nothing to do with post production of the shows they air. Post production work is done by a number of video post facilities, contracted by the production company, primarily in Los Angeles, but occasionally in New York or elsewhere. All of these facilities use essentially the same equipment for film to tape transfers (there are only 3 companies that make telecine equipment) and basically the same techniques and equipment for other steps in the post process (show assembly, color correction, titling, duplication).
There are essentially no technical differences between the products produced by any of these facilities. What differs on a show-by-show basis is the work of the director of photography and, occasionally, the choices made in final color correction. If a show appears grainy, it is because it was shot that way, not because of any deficiency in the post process, and certainly not because of anyone at the network.

There is a significant difference between the path a feature film takes to television and that of a program made for television. In the latter, the product is usually post produced electronically, with the film being transferred to tape (either standard definition or HD, depending upon the requirements) and all post production from that point onward using those transfers as the source material. In the case of feature films, the film is transferred primarily from an interpositive, which is an element created from the original assembled negative with color correction done in the lab. Additional color correction is done during the film-to-tape transfer.

What many people don't seem to understand is the amount of time allowed during production for lighting in these two very different worlds. In features, the usual rate of production is about 2-3 pages per day. In television, it jumps to at least 8 pages per day, often closer to 9 or 10. This cuts the time allotted for lighting tweaks and other niceties by at least half. Essentially, we're trying to produce equal quality in less than half the time. Clearly, this can't be done. Even so, I think the photography in current dramatic television programs is uniformly at a much higher quality level than it's ever been.

CBS does nothing unique in its broadcasts. It simply takes the video masters that are delivered and puts them on the air. There is no uniformity to the "look" of a network. Diagnosis Murder looks nothing like CSI, and CSI looks nothing like Judging Amy, which looks nothing like The District. If you like what The X-Files looks like, that's fine, but that look is determined by the photography and some choices in post production, not by Fox as a network. And, BTW, the X-Files has the longest production schedule of any show I know of - so they actually do have more time for complex lighting setups than most of us.

Originally posted by mmost:
So let me get this straight. It seems that you and a lot of people here feel that "quality" = live video. You all seem to feel that a live video picture is the end-all and be-all and the basic reason for the existence of HD, regardless of what the program material itself is or what it's trying to present. As someone who has worked in the television and film industry for many years, I find it really disheartening that people are so taken with technology and the possibilities of that technology that it becomes more important than creating visual worlds, moods, and storytelling. We work very hard to create imagery that helps to tell a story on film, or whatever medium is used. The moods that are set are not made to convey reality; they are meant to create a world in which the story exists. If a show like "The X-Files" looked like reality, it wouldn't work. If a show like "Ally McBeal" looked like The Tonight Show, it wouldn't work. Art is about setting moods and conveying emotion. It is generally accepted in this town that the images we create on film, for the most part, successfully do that. It's not about showing extreme detail and having everything in the world in focus, and it's not about having smooth motion, although it seems that a lot of you feel that this is the only thing you want to see. Composition, artful lighting, letting you see only what the storyteller intends for you to see - that's what dramatic narrative programming is really about. Unintentional focus problems, washed-out colors, milky blacks, and extreme graininess when they're not intentional - these are the true measures of picture quality, not whether the screen looks like it's showing you a documentary about the Grand Canyon (appropriate for a travelogue, but not appropriate for, say, CSI).

Mike Most
Visual FX Supervisor
Los Angeles

You hit it right on the button! Over the years a few "filmed before a live audience" programs (e.g., sitcoms) have aired a "live" broadcast. In every case I have ever seen, the show was ruined. Cinematics (if I've used the correct word) plays an ENORMOUS role in any video entertainment. I'm not a video techie, just a guy who likes to go to movie houses (oops, theaters) and watch good TV, but I know what I like to see, and you just described it. HDTV is better, period, whatever the original format.

HDTV is the first video format capable of reproducing a visual medium without introducing its own coloration on the medium-sized home screen. If the original medium is film, then HDTV should be capable of giving you the original film look on your FPTV. If shot in high-res video, it should pass along that resolution to your FPTV screen. A few broadcasting companies have, indeed, used HDTV to modify the original look of a movie; HBO has made it policy to modify movies for HBO broadcast. They are not alone in this, but they are given credit as the proverbial bad guys in this endeavor. Or good guys, depending on your movie-to-HDTV politics.
In my opinion, HDTV should be used like a good Xerox machine: it should reproduce the original program without adding its own coloration to the work. Unfortunately, the process is tainted by people who do choose to use the transfer process to modify the original work. We can look at this as degrading the work or helping it. On the HBO policy, it seems the camp is split, with HDTV owners who are movie buffs in the "don't modify my movie" camp and those who are mainly interested in having the maximum area of their screen filled in the other. I also think it is accepted that when a broadcaster or transfer house maximizes the movie for screen area (the HBO policy), they add certain artifacts or degrade the picture quality in the process. This is in addition to changing the artistic look the movie was intended to show.
Personally, as a movie buff, I prefer the unadulterated OTAR look. I also view movies on a large screen, but when I travel and view DVDs on my notebook computer, I also prefer to watch in letterboxed widescreen, so screen size isn't really a deterrent to my viewing OTAR as long as the quality is adequate for the screen size and I can view close enough. HDTV and DVD on small screens, IMHO, accomplish this goal. On large screens, 80 inches and wider, HDTV is clearly superior to DVD from what I have seen.

I think Don and Mike are right on! In HD, I want to see what was originally produced...reproduced as closely as possible. If that means an image in one or another aspect ratio, then that's fine. If that means grainy film, that's okay. It's all part of the feel of the story, only part of the whole.

I moved to HD because I was tired of the frankly crappy picture quality of old analog sets. I'm even pleased with our local Fox station's digital picture quality, which is just plain old 480i, because even it looks better than their analog channel.

Keep in mind what this post was originally about: Looking for people's OPINIONS, and remember that each of us has an opinion. It's just that. Our opinions are formed based on our experiences, both good and bad. None of us has had the same exact experiences, so our opinions are going to vary. Our opinions can be altered by what we learn, and I don't think anybody using this forum can claim they're not here to learn something. I pick something new up with each post I read. But remember that there isn't a right/wrong answer to the question.

All that said, my personal praise to JMartinko for posting the question, and looking to educate him/herself. To mmost: you've made me learn some things about all the work that goes into post-production, so thanks for that. But what it boils down to, IN MY OPINION, is that this post is akin to asking how to judge fine wine, or good cigars, or what defines "great music". We're all gonna be looking for something slightly different, but I bet we all have the same thing in common: great video is something that MOVES you. It creates that ultimate experience in your head: no matter how the pixels are mapped, or how the images got from source to your screen, how much it cost, or whether the rest of the world agrees with you or not. To butcher someone else's quote...I don't know what great video IS, but I know it when I see it. IMHO. Keep up the good dialogue.