Re: what's up with 1080i setting but no 1080p?

Right, but what I am saying is that you are not going to get a 1080p-quality picture from 1080i. You cannot get a true progressive image from two interlaced fields, no matter what TV you have, because the raw data isn't there.

Re: what's up with 1080i setting but no 1080p?

longhornsk57 wrote:

Right, but what I am saying is that you are not going to get a 1080p-quality picture from 1080i. You cannot get a true progressive image from two interlaced fields, no matter what TV you have, because the raw data isn't there.

You are right in the sense that you are not going to get Blu-Ray quality, but it's due to more variables than just interlaced vs. progressive, and your flat panel will still show its native resolution.

Re: what's up with 1080i setting but no 1080p?

longhornsk57 wrote:

Yeah, since I'm super late to the HD game, I figured we were up to the point where service providers could do 1080p now, but it looks like it is still too much bandwidth. So yeah, you're right, only Blu-Ray and game consoles and stuff can do it. 1080i is still TONS better than that SD crap I was watching before, though, plus my phone actually outputs 1080p on everything I record, like with the video camera or movies/streaming, so that's pretty cool to watch on a big 1080p TV.

It has nothing to do with bandwidth. Even OTA broadcasts are 1080i. It's a network issue, not a provider issue. There is a reason only one provider (DirecTV) has any 1080p content, and those are strictly on-demand movies.

“Auto racing, bull fighting, and mountain climbing are the only real sports … all others are games.” - Ernest Hemingway

Re: what's up with 1080i setting but no 1080p?

Longhorn,

All modern TVs have a video processing chip inside them, usually called the scaler/deinterlacer. This chip is responsible for converting any incoming signal format into the display's native format.

The scaler/deinterlacer is the chip that is responsible for changing the 1080i/60 signal from the broadcaster into a progressive signal that the display panel (LCD, plasma, etc.) can display.

It converts interlaced video into progressive video by using motion-adaptive interpolation.

A 1080i signal does not have frames. It is a sequence of fields that alternate between the top field (scan lines 1, 3, 5, etc.) and the bottom field (scan lines 2, 4, 6, etc.). There are 60 fields per second in a 1080i signal. Each field is independent in BOTH space and time. That means that the pixels in each field come from a different part of the image captured by the camera (spatially independent) AND each field was actually snapped by the camera at a slightly different time (temporally independent). That is why a 1080i signal cannot be said to have "frames" because you do not get one continuous picture if you combine two adjacent fields together.
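
To make that concrete, here is a rough Python sketch of how 1080i/60 capture produces alternating fields. The camera.snapshot call and the names are hypothetical, purely for illustration:

    # Sketch: each field is 1920x540 and is snapped at a different instant,
    # so two adjacent fields never form one coherent 1920x1080 frame.

    FIELD_RATE = 60  # fields per second in a 1080i/60 signal

    def capture_fields(camera, seconds):
        fields = []
        for n in range(seconds * FIELD_RATE):
            image = camera.snapshot(time=n / FIELD_RATE)  # full 1920x1080 sensor image
            if n % 2 == 0:
                fields.append(image[0::2])  # top field: scan lines 1, 3, 5, ...
            else:
                fields.append(image[1::2])  # bottom field: scan lines 2, 4, 6, ...
        return fields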

Your TV takes the stream of fields and buffers them, and carefully examines each field to identify objects that are moving and objects that aren't. It then applies two different algorithms to the fields to create full frames using the information in the current field, as well as information in the previous and subsequent fields. The result is a 1080p signal at 60 frames per second, each frame consisting of half actual pixel information from the 1080i signal, and half algorithm-created information.
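
A heavily simplified sketch of that motion-adaptive idea (not any particular chip's algorithm; the field layout and threshold value are my own stand-ins):

    # Per-pixel motion-adaptive deinterlacing, heavily simplified.
    # Fields are dicts: absolute scan-line number -> list of 1920 pixel values.
    # prev_field and next_field have the opposite parity of cur_field, so they
    # contain real data for exactly the lines cur_field is missing.

    WIDTH, HEIGHT = 1920, 1080
    MOTION_THRESHOLD = 10  # hypothetical tuning value

    def deinterlace(prev_field, cur_field, next_field):
        frame = [[0] * WIDTH for _ in range(HEIGHT)]
        for y, line in cur_field.items():
            frame[y] = line[:]                      # half the frame is real 1080i data
        for y in prev_field:                        # the lines missing from cur_field
            for x in range(WIDTH):
                motion = abs(prev_field[y][x] - next_field[y][x])
                if motion < MOTION_THRESHOLD:
                    frame[y][x] = prev_field[y][x]  # static: weave in the real pixel
                else:                               # moving: interpolate spatially
                    above = frame[y - 1][x] if y > 0 else frame[y + 1][x]
                    below = frame[y + 1][x] if y < HEIGHT - 1 else frame[y - 1][x]
                    frame[y][x] = (above + below) // 2
        return frame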

The modern scaler/deinterlacer chips in TVs do this job fairly well, although there are wide variations in quality from different manufacturers.

A 1080p signal that you get from Blu-Ray is not 60 frames per second. It is typically 24 frames per second, which matches the film rate. 24 frames per second looks very "juddery" when played back because there is not nearly as much temporal information as a 1080i signal has. (A 1080i/60 signal has 60 updates per second of temporal information, corresponding to the 60 fields. This is 2.5x what a 1080p/24 signal has.)

Because a 1080p/24 signal has far less temporal information, its bandwidth requirements are actually lower than 1080i's. So the issue with providers not doing 1080p is not a bandwidth issue. The reason they typically don't do it is that not every HDTV out there can process a 1080p/24 signal. Early HDTVs (typically rear-projection CRT models) could display 1080i signals only, because they still used electron beam scanning, which has always been interlaced in consumer equipment.
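
Some back-of-the-envelope arithmetic (raw pixel rates only, ignoring how well each format compresses) backs this up:

    # Raw pixel throughput before compression, illustrative only:
    pixels_1080i60 = 1920 * 540 * 60    # 62,208,000 pixels/s (60 fields of 540 lines)
    pixels_1080p24 = 1920 * 1080 * 24   # 49,766,400 pixels/s (24 full frames)
    # 1080p/24 carries roughly 20% fewer raw pixels per second than 1080i/60.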

Re: what's up with 1080i setting but no 1080p?

If technology keeps going the way it is, SJ is going to need more RAM and a bigger hard drive. I read that twice and still can't remember everything he just said.

"If you find this post helpful and it solved your issue please mark it as a solution. This will help other forum members locate it and will also let everyone know that it corrected your problem. If they have the same issue they will know how to solve theirs"

*The views and opinions expressed on this forum are purely my own. Any product claim, statistic, quote, or other representation about a product or service should be verified with the manufacturer, provider, or party.

Re: what's up with 1080i setting but no 1080p?

Seriously, what method is used to encode/compress each stream? I am actually under the impression that the 1080i HD setting is nothing more than "upconverted" 720p happening at the STB. That is, assuming the encoding/compression method even results in true 720p exiting your STB.

Re: what's up with 1080i setting but no 1080p?

Each stream's resolution is encoded and compressed at the native resolution that the broadcaster provides. Many broadcasters natively provide 1080i signals (NBC, CBS, Discovery, HBO, several others). Some broadcasters provide native 720p signals (ABC, FOX, Disney, ESPN, a few others).

AT&T uses H.264 (AVC) to compress/encode their video streams. This is the same encoding used on most Blu-Ray discs, but AT&T uses far lower bitrates. The native bitrate that AT&T is using for HD streams is approximately 5.7 Mbps.

The STB will output whatever resolution is selected in the Menu -> Options -> System Options -> Aspect Ratio screen, regardless of the native resolution of the stream. For example, if you have selected 1080i output, the STB will output 1080i for all channels: incoming streams that are 1080i will be sent natively to the TV, and incoming streams that are 720p will be converted to 1080i, then sent to the TV.

The opposite happens if you've selected 720p output. 720p streams will be sent natively to the TV, 1080i streams will be converted to 720p and then sent to the TV.
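
The decision logic amounts to something like this sketch (not AT&T's actual firmware, just the behavior described above):

    # Sketch of the STB's output decision:
    def stb_output(stream_format, selected_output):
        # selected_output comes from Menu -> Options -> System Options -> Aspect Ratio
        if stream_format == selected_output:
            return stream_format      # sent to the TV natively
        return selected_output        # converted by the STB first

    stb_output("1080i", "1080i")  # -> "1080i", passed through untouched
    stb_output("720p", "1080i")   # -> "1080i", converted before output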

Some providers, notably DirecTV, have a selector in their STBs called "Native Mode", where the STB will send the stream to the TV in the native resolution, no matter what it is, and your TV can then handle it. This selection is unfortunately not available on U-Verse.

Re: what's up with 1080i setting but no 1080p?

The reason for 1080p output is not to receive 1080p channels, it's so that BOTH 1080i and 720p content get scaled properly. If you have 1080i output, then you're losing resolution on 720p channels, and if you use 720p output, then you lose resolution on 1080i channels. With 1080p output, you don't ever lose resolution.
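
Putting simplified numbers on "losing resolution" (a crude model of my own that just counts how many source lines survive at each instant):

    # Lines of source detail surviving at any given instant (simplified):
    def surviving_lines(source_lines, output_lines, output_interlaced):
        per_instant = output_lines // 2 if output_interlaced else output_lines
        return min(source_lines, per_instant)

    surviving_lines(1080, 720, False)   # 1080i content, 720p output  -> 720
    surviving_lines(720, 1080, True)    # 720p content,  1080i output -> 540
    surviving_lines(1080, 1080, False)  # 1080i content, 1080p output -> 1080
    surviving_lines(720, 1080, False)   # 720p content,  1080p output -> 720 (all of it)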
