Originally posted by Darin Another reason would be if you had a high-end CRT display that could do multiple HD resolutions natively. If your projector could do both 720p and 1080i natively, then you'd want to output 720p for ABC and ESPN (and Fox later this year), but 1080i for the others. This would prevent any quality loss from converting to another resolution. ...

I agree with your statement, but the truth of the matter is that the majority of people have no clue what resolution programs are broadcast in. Frankly, a consumer shouldn't have to ascertain what resolution a particular broadcaster has decided to use. For the record, I am in the clueless group as far as what the networks are broadcasting in, but I do consider myself to be somewhat knowledgeable in the area of Home Theatre and consumer electronics.

Honestly, the whole HD roll-out is just a mess. If the standard supports multiple resolutions (like 18 of them) then the displays they are shown on should automatically switch to that resolution. An override would be available for people who wanted to customize. But what the heck do I know...

Originally posted by DCIFRTHS If the standard supports multiple resolutions (like 18 of them) then the displays they are shown on should automatically switch to that resolution.

The ONLY display technology in use today that doesn't have an inherent "native" resolution is CRT. A CRT can simply change the number of lines it draws in one scan to accommodate any number of resolutions. But unfortunately, most CRT HD sets available don't take advantage of this ability. The electronics within them are designed to scan at one or maybe two specific frequencies (usually 1080, and maybe additionally 480 for SD), and separate electronics in the set run the image through interpolation when the incoming signal does not match that resolution. Why they do this, I don't know, because the picture would look best displayed at its original resolution rather than interpolated to another. Practically all CRT-based computer monitors, even cheap sub-$100 ones, do multiscan, so why this ability can mostly only be found in high-end projectors is a mystery to me.

Anyway, all the other display types have a fixed pixel geometry: no matter what you do with the incoming signal, they only have a specific number of physical pixels to work with, so they MUST interpolate the incoming signal to match their native resolution if it isn't the same already.

So this is why most (all???) STBs don't simply output whatever resolution the original signal is: unless you have one of those relatively rare CRT sets that can do multiscan, the TV is going to convert it to a specific resolution anyway. All things being equal, it's better to do that conversion in the STB while the signal is still digital, and it also ensures the guide and other STB-generated graphics look better (they don't end up going through an interpolation).

That's why I said the ability to switch output resolutions would benefit high-end CRT owners, because this would let them override the automatic interpolation to a specific resolution, and get the best PQ possible out of the incoming signal. And those who wanted to take advantage of that ability would most likely know which networks are which. For everyone else, leaving it set to the native resolution of your TV is generally best. FWIW, to the best of my knowledge, ABC, ESPN, and in the fall, FOX, use 720p, while all the others use 1080i.

Originally posted by DCIFRTHS Where is the signal processed for things like brightness, contrast, 3:2 pull down etc. when a DVI connection is used? In the source device or the display?

Brightness and contrast are generally done in the display device. 3:2 pull down would not be done in the display device unless you were sending it a 480i signal. So if you have your STB set to 720p, then if any 3:2 pulldown was done to SD programming, it would have to be done at the STB.

Originally posted by DCIFRTHS I agree with your statement, but the truth of the matter is that the majority of people have no clue what resolution programs are broadcast in.

If the standard supports multiple resolutions (like 18 of them) then the displays they are shown on should automatically switch to that resolution.

My CRT set supports 480p & 1080i. So an STB outputting native resolution would work fine for most of my channels. ABC, on the other hand, outputs in 720p, which my CRT doesn't support. So if the STB outputs 720p, I'm going to get a blank screen. This would be a bad thing. Assuming I knew that ABC was 720p and that my CRT won't support 720p, I'd still have to manually switch the STB to output either 480p or 1080i to get a picture. If the STB just side-converts everything to one resolution (that my CRT supports), all I have to know is how to change the channel.

Plasmas will usually do 480p and 720p. So I'd be able to watch either Fox or ABC & ESPN-HD without having to know anything. PBS, NBC, CBS, UPN, HBO, etc. would give me a blank screen. And we're back to knowing what channel outputs what signal.

It's much easier for the masses to output one (supported) resolution at all times. Those with more knowledge and a compatible set can adjust the output resolution to their heart's content.

I think that the best all-around feature would be to be able to select whatever output you want, OR to enable a "native" mode option that would pass whatever signal the show is in. It looks like, for whatever reason, neither the HD-Tivo nor the Dish 921 has this option though.

Originally posted by Darin The ONLY display technology in use today that doesn't have an inherent "native" resolution is CRT. A CRT can simply change the number of lines it draws in one scan to accommodate any number of resolutions. But unfortunately, most CRT HD sets available don't take advantage of this ability. The electronics within them are designed to scan at one or maybe two specific frequencies (usually 1080, and maybe additionally 480 for SD), and separate electronics in the set run the image through interpolation when the incoming signal does not match that resolution. Why they do this, I don't know, because the picture would look best displayed at its original resolution rather than interpolated to another. Practically all CRT-based computer monitors, even cheap sub-$100 ones, do multiscan, so why this ability can mostly only be found in high-end projectors is a mystery to me. ...

While I think it's great that technology moves forward, and we are seeing all sorts of new display innovations, it's too bad that most of them are fixed-pixel designs: DLP, LCoS, LCD, etc. This is why I still say that an RPTV, if you have the room, is the best way to go. They still offer the best picture, and have become a bargain because of the competition from the new technologies.

My RPTV will display 480i, 480p (?), 960i, and 1080i, and it accepts 720p. The reason I put a question mark after 480p is that I am still researching that mode, and the details are sketchy. It is thought that all resolutions below 720p are converted to 960i on my particular set (Sony 57-WV700). Apparently 960i is very easy for the set to convert to, and gives a very solid picture. 720p is up-converted to 1080i. I don't like this because I believe a native 720p picture will look better than 1080i on account of the interlacing. It's a shame that the best-looking resolution (in theory) is more expensive to reproduce and therefore ignored by most (all?) manufacturers.

When I said there are 18 formats for HD, I didn't mean that the display device could or should display all of them. Since the networks have "standardized" on a few resolutions, what I would like to see is the native display of the ones that are being broadcast - without the end user having to figure out the resolution of the broadcast and then selecting the output on the STB. You brought up a really good point, and it's one that supports my whole "HD roll out is a mess" theory. There is conversion going on in at least one place in the signal path.

It bothers me that the HD roll-out has so many compromises in it. This was the perfect chance to do a standard correctly, but because of various (read: political) reasons, it is a compromise. On top of that, it is confusing for consumers.

Originally posted by Knative My CRT set supports 480p & 1080i. So a STB outputting native resolution would work fine for most of my channels. ABC on the other hand outputs in 720p, which my CRT doesn't support. So if the STB outputs 720p I'm going to get a blank screen. ....

Are you sure the display won't convert a 720p signal to something else?

Originally posted by Knative Plasmas will usually do 480p and 720p. So I'd be able to watch either Fox or ABC & ESPN-HD without having to know anything. PBS, NBC, CBS, UPN, HBO, etc. would give me a blank screen.

I believe that plasma displays run at a fixed native resolution, so regardless of what signal you feed them, the signal is converted to match the native resolution of the display. Am I making an incorrect statement?

Originally posted by Darin Brightness and contrast are generally done in the display device. 3:2 pull down would not be done in the display device unless you were sending it a 480i signal. So if you have your STB set to 720p, then if any 3:2 pulldown was done to SD programming, it would have to be done at the STB.

So what does the display device do with a DVI (digital) signal? Is converting it to analog for display the ONLY thing it does?

I also have a Sony set (46WT510), and it actually DOWNconverts 720p to 480p, and many Sonys do the same. And 960i is technically 1080i... all they do for 960i is put it in a 1080i frame, then vertically stretch the image so the black bars are off-screen. There is also a guy at HomeTheaterSpot who claims that 480 signals are always converted to 960 (and hence, 1080) on Sony CRTs. I'm not sure if I believe that, but if it's true, then these sets only have one scan rate.

There are sets that don't support 720p AT ALL (they won't do an internal conversion), but it's usually moot since virtually all STBs give you an option of what resolution they output. The fact that my set downconverts 720p is also moot, since my set never sees 720p... the signal is converted to 1080i in the STB before it ever reaches the TV. But I agree, the no-compromise solution would be for my set to be able to natively display 720 lines as well as 1080, and have the STB output whatever resolution the programming came in at. But I think the fact that so few displays actually can do multiple resolutions natively is why no STBs have the option to pass the signal in its native format. In fact, my STB has the output resolution selection as a physical switch on the BACK of the unit... it's meant to be a set-and-forget setting.

Originally posted by DCIFRTHS So what does the display device do with a DVI (digital) signal? Is the ONLY thing it does is convert it to analog for display?

Well, that's going to depend on the set, and I can't pretend to know what they all do, but I think generally, if it's an analog display (CRT), then it's converted to analog, then processed like any other input. If it's a digital display, then it stays in the digital domain, and any processing is done digitally.

Originally posted by DCIFRTHS Are you sure the display won't convert a 720p signal to something else?

Yep, I'm sure. I have a Zenith HD-DVR that outputs 480i, 480p, 720p, and 1080i. Pressing the button on the front of the STB cycles the output resolutions. And while I was being imprecise for the sake of convenience in saying that 720p will give me a "blank screen," what it does in reality is give me a signal that looks like an old VCR that should be outputting on channel 4 instead of channel 3: a vertical line down the middle of the screen, with both halves of the picture on the wrong sides, and wavy horizontal lines. 480i does give a blank screen, as the component inputs can't handle that signal.

My projector does 480p, 720p, and 1080i. But it loses sync for a couple of seconds when the signal type changes. So I leave the DVR at 1080i and don't worry about ABC too much. Alias still looks great.

My main point is that there are a large number of people spending a large amount of money on TVs who have no idea what they're really getting. I personally know a gent who thought that his "HDTV Ready" set would give him HDTV from every channel on basic cable. He had no idea that he'd need an STB and (for OTA) an antenna to receive true HDTV. Or that only certain channels were broadcasting in HDTV, and only at certain times. Or that he couldn't use the yellow video cable for HDTV. At least he'd upgraded from coax. I only found out when he was disappointed in the "HDTV" he was seeing after hearing me rave about my PQ. A quick trip to his house and a bit of explaining only started to set him straight. Telling him he'd need to spend another $500 or so on an HDTV STB didn't make him too happy.

The same type of thing happens to folks who think they're getting 5.1 DD from the red & white audio jacks. Their receiver supports it, so they must have it, no matter what connector they used, right?

Fortunately, every HDTV set-top box on the market lets you pick the output resolution and converts between them. Very few HDTVs can handle 720p natively, and many plasmas can't handle 1080i, so this is the perfect solution.

The only people left out are those with high-end equipment that can handle 720p and 1080i and display them natively. I don't know of any STBs that let you set it to "Output Source Format".

__________________DirecTV vs. Cable: If you can't tell the difference, why pay the difference?

Originally posted by feldon23 I don't know of any STBs that let you set it to "Output Source Format".

The Sony HD-200 and Zenith HD-SAT520 do (as I'd expect the Sony HD-300 and LG-branded version to also do): The native setting converts 480i signals to 480p and passes along 720p and 1080i signals directly to the display without conversion. The hybrid 1 and 2 settings are similar, but hybrid 1 converts all HD signals to 1080i, and hybrid 2 converts them to 720p. EZ DVI automatically detects the monitor type and converts all signals accordingly.
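The mode behavior described above could be sketched roughly like this (a minimal sketch with my own mode/resolution labels, not the boxes' actual firmware logic):

```python
# Hypothetical sketch of the native/hybrid output-mode logic described for the
# Sony HD-200 / Zenith HD-SAT520. Mode names and behavior are paraphrased from
# the post above, not taken from any product documentation.

def select_output(mode: str, input_res: str) -> str:
    """Return the resolution the STB would output for a given mode and signal."""
    hd = {"720p", "1080i"}
    # 480i is always deinterlaced to 480p in these modes.
    sd_out = "480p" if input_res == "480i" else input_res
    if mode == "native":
        # HD signals pass through untouched.
        return sd_out
    if mode == "hybrid1":
        # Like native, but every HD signal is converted to 1080i.
        return "1080i" if input_res in hd else sd_out
    if mode == "hybrid2":
        # Like native, but every HD signal is converted to 720p.
        return "720p" if input_res in hd else sd_out
    raise ValueError(f"unknown mode: {mode}")
```

The appeal of the native mode is visible in the mapping: a 720p broadcast leaves the box as 720p and a 1080i broadcast as 1080i, so a multi-rate display never sees an intermediate conversion.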

This is a valuable feature when you have a decent display. I have the Fujitsu P50 plasma display and it's absolutely annoying using it with the Echostar 6000 receiver where you have to switch manually between HD and SD output modes, and it only offers 720p or 1080i all-the-time choices for HD modes. I much more enjoy the behavior of the HD-SAT520. It provides easy access to all the advanced scaling features in the plasma that will only function on 480i and 480p signals.

__________________
¯\_(ツ)_/¯

I mean a few people would like it; but as a practical matter very few people will really be missing it.

The two Hybrid modes solve that 720p-only or 1080i-only issue for those displays also supporting 480i/480p. Pseudo-native.

I just would hate being stuck with some cheap scaler in the loop that I'm forced to use, short of manually toggling a setting; I could live with discrete codes to select the setting, but toggles are terribly annoying to use.

__________________
¯\_(ツ)_/¯

Originally posted by rogo I mean a few people would like it; but as a practical matter very few people will really be missing it.

I guarantee you that if you sat two otherwise identical displays side by side, with one showing a native 720p picture while the other displayed the same 720p source scaled to 1080i (much less 480p), the difference would be readily apparent. I just don't understand why more CRT sets don't include multiscan capabilities. As I said earlier, if the cheapest CRT computer monitors can do multiscan, why not HD televisions? The lack of a "native resolution" is one of the advantages CRT has over other technologies; it seems that by the time television makers recognize this advantage, CRT sets will be extinct. I can't help but wonder if that feature isn't purposely omitted to prevent CRTs from having an upper hand over the much more expensive (and probably more profitable) fixed-pixel displays.

Originally posted by smak Do the OTA tuners on the directv version mean that you would be able to use that box without directv, maybe buying it because you'll have directv in the future, but not wanting to have to buy another box when you do get the dish?

-smak-

But then where would you get your TiVo service from? Who would you pay for the TiVo service?

Since no new TiVos work (PVR functionality) without service, I would assume that they would not allow the box to become merely an OTA tuner (no PVR functionality).

And since this is a DirecTV box, I would assume that you would not be able to pay TiVo directly for the SA TiVo service.

Because of that -- I am almost 100% positive that the box will only work if you are subscribed to DirecTV.

__________________Do you really care what kind of TV or TiVo I have? Didn't think so...

Todd76 said Most consumers wouldn't understand a word of this conversation. And for that reason it's probably not a very important feature.

Only because nobody has been asked to write it in a format that is readable. Here's my shot at it, as a part-time technical writer:

Different HDTV models have different ways of producing a picture, using different technologies such as CRT, LCD, DLP, LCOS, or Plasma. There are 2 different HDTV formats/qualities, called 720p and 1080i.

These produce roughly the same picture quality but have some Pros and Cons.

Some HDTVs are designed for 720p and use a converter to play back 1080i.
Some HDTVs are designed for 1080i and use a converter to play back 720p.
Most HDTVs are designed for 1080i and cannot play back a 720p signal without help from your satellite/cable receiver.
And a fortunate few HDTVs natively display both 720p and 1080i in the best possible way.

Fortunately, the DirecTV HD TiVo can handle all these situations.


I hope that in a few years someone comes up with a progressive-scan monitor with 2160 lines. Then there would be no need for a scaler at all (or only a really simple one).

If it got a 720 signal, it would show each line 3 times; if it got a 1080 signal, it would show each line twice. Bam, no interpolation, just basic math. (Of course, some might argue that it should still interpolate, but even then you know every 2nd or 3rd line is correct. And 480 divides into it exactly 4.5 times, so that's not such a tough interpolation either.)
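The arithmetic behind that hypothetical 2160-line panel can be checked in a couple of lines (the panel is the poster's thought experiment, not a real product):

```python
# Back-of-the-envelope math for a hypothetical 2160-line progressive panel:
# how many panel lines each line of a common source format would occupy.

def lines_per_source_line(source_lines: int, panel_lines: int = 2160) -> float:
    """Replication factor when mapping a source onto the panel's line count."""
    return panel_lines / source_lines

for src in (720, 1080, 480):
    print(src, lines_per_source_line(src))
```

720 and 1080 come out as the exact integers 3 and 2 (pure line repetition, no interpolation), while 480 lands on 4.5, the "not such a tough" fractional case.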

This whole scan-lines thing makes my brain hurt. From what I saw shopping for my TV, many sets don't even have 720 or 1080 lines to begin with, so their "native resolution" is in fact interpolated too. For example, the butt-kicking top-of-the-line Pioneer Elite plasmas that there was a showcase on the other day: some of those use 768 lines of fixed pixels. What is the sense in that? It is always making up data by adding 48 lines of guesses to 720, or throwing out data by deleting 312 lines of a 1080 signal. And I never saw an explanation, so for all I know it throws all the incoming data in the blender, makes all 768 lines up, and we never get to see what was originally transmitted.

Another good use for native output from an STB is the ability to have a specific device do the scaling.

For example, the Samsung HLN617w DLP set will accept both 720p and 1080i signals. The display is a 720p-native display, but uses a Faroudja chipset for signal conversion, arguably one of the best available for signal conversion in the digital domain.

In my setup I have a Samsung SIRT165 STB, which has a physical output selector switch for a single output resolution, connected to my Samsung DLP. Nearly every HD program I watch is on CBS, which broadcasts a 1080i native signal. If I set the STB to output 720p (which most people would recommend because it is the native resolution of the display), CBS looks "horrible" compared to the following alternative: set the STB to output 1080i (the native resolution of the broadcast) and let the DLP do the side-conversion to 720p via the Faroudja chipset. This method looks far superior.

The problem then works in reverse. With the STB in 1080i and watching ABC (720p native), the STB performs the conversion (from 720p to 1080i) then the TV converts it again (from 1080i back to 720p) and ABC also looks 'horrible.'

Fortunately for me, I pretty much only watch CBS and NBC (1080i native), so I leave the STB set to 1080i, letting my DLP side-convert to its native resolution of 720p. With the switch on the back of the STB, I almost never even bother watching 720p-native shows.
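The trade-off described above boils down to counting rescaling steps between the broadcast, the STB's fixed output setting, and the display's native resolution. A minimal sketch (my own toy model, not any product's actual pipeline):

```python
# Toy model of the conversion chain: count how many rescaling steps a signal
# suffers between broadcast format, STB output setting, and panel native
# resolution. Each mismatch in the chain costs one conversion.

def conversions(broadcast: str, stb_output: str, panel_native: str) -> int:
    steps = 0
    if broadcast != stb_output:
        steps += 1  # STB rescales before the signal leaves the box
    if stb_output != panel_native:
        steps += 1  # display rescales to its fixed pixel grid
    return steps

# CBS (1080i) with the STB left at 1080i on a 720p DLP: one conversion,
# done by the display's (reportedly high-quality) scaler.
print(conversions("1080i", "1080i", "720p"))  # 1
# ABC (720p) with the same settings: two conversions, 720p -> 1080i -> 720p.
print(conversions("720p", "1080i", "720p"))   # 2
```

This is why a single fixed STB setting can't be optimal for both 720p and 1080i networks on a fixed-pixel display: one of them always eats a double conversion.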

Therefore, I would love to have native passthrough, but it wouldn't be a deal-breaker for me either (due to my viewing habits).