Not here. Are you sure your box passed the 1080i to 720p conversion test? Are you sure your service provider isn't broadcasting 720p in 1080i? Remember, you're still processing the image twice as fast at 60 fps even if it's a 30-frame broadcast, and sharpness is not the only issue; there's also color rendering, artifacts, over-brightness and the texture of the image.
So there are a lot of variables for different providers, equipment and setups. I also see some extra detail in some 1080i broadcasts when I let them pass through native, but the smoothness, reduced artifacts, and texture of 720p outweigh that. When the side-by-side tests were performed with the right EQ and proper conversion, 720p won in all sequences and bitrates. This is what I see: "In the presentation of uncompressed sequences, the delegates reported difficulties in seeing differences between the three formats – even at a viewing distance of 3h. But when the compressed images were shown, the viewers did notice differences in the visibility of compression artefacts. Depending on the viewing distance and scene content, the artefacts became visible to a greater or lesser extent and, with few exceptions, the following were reported: The 720p format showed better image quality than the 1080i format for all sequences and for all bitrates." http://tech.ebu.ch/docs/techreview/trev_308-hdtv.pdf

All I know is that with OTA (which is nowhere near as compressed as cable/sat for the most part) 1080i looks much better overall than 720p. Flicker, or whatever you want to call it, is non-existent as far as I'm concerned. But you're more than welcome to believe what you want. I just don't think that six-year-old data is as applicable with today's sets as it was then.

Agreed on all points. I laugh when guys still talk about 'flicker'. Please, this is an artifact that was largely a function of NTSC and our old CRTs. But hey, some like to live in the past.

Another spin... The original source was 1080p and got converted to 1080i and 720p, then they displayed all of them. All they can come back with is "They should have...", "That's not true... not for all broadcasts", "Yeah, they're all lies"; this from the guy who posted no verifiable evidence, and whose photo of girls with lime hair, meant to prove something, actually proved everything I said. No evidence other than your twisted Hail Marys trying to overturn verifiable evidence presented from multiple sources. Your wishful twists hold no water. You've got no evidence; you got your asses kicked, plain and simple.
Go join the flat-earthers; even they had more of a case defending that the earth is flat than you did defending 1080i. Maybe you'll learn some of their tactics. Never in my life did I see a group of people who want so badly to believe their set is converting a 1920x1080i signal to full 1080p. Live in fantasy land; that may be the only place you can go to get something to back what you want to believe.

Look at it this way Sole. Aside from your sources being outdated, technically incorrect or operating under erroneous assumptions, most here are reporting precisely what is so easily seen, MORE detail in 1080i. It's so absurdly easy to see, there must be something wrong with your setup to insist this is not the case. Are we all crazy? When you consider the variety of displays we must all have, yet reporting the same results, don't you think it might be you operating under the wrong assumptions?

I know what my eyes see and you can post 1,000 references, but more detail is more detail.

A statement that is only true with near stationary frames. Once movement is introduced progressive scan is the clear winner. Too bad they could not/would not go with 1080p60 as the broadcast standard.

A statement that reflects no understanding of deinterlacing or framerates.

As I've had to repeat many times here, 25/30fps content sent as 1080i50/60 is no different from 1080p25/30. All you are doing is splitting the frames in half to create a 1080i signal (one frame = two fields) and combining them to recreate the original 1080p image.

With 50/60fps content, you halve the resolution to 1920x540p (one frame to one field, and a field has half the vertical resolution), which is still higher resolution than 1280x720p, and looks sharper on a 1080p native display.

If you are seeing anything different, your display hardware is not handling 1080i correctly.
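For readers who want the mechanics, the frame/field relationship described above can be sketched in a few lines of plain Python (toy dimensions here, not real 1080-line frames):

```python
# Toy "1080p" frame, shrunk to 8 lines of 4 samples for the example.
frame = [[row * 10 + col for col in range(4)] for row in range(8)]

# Interlacing 25/30fps content: one frame is split into two fields.
top_field = frame[0::2]       # even-numbered lines
bottom_field = frame[1::2]    # odd-numbered lines

# Weave deinterlacing: re-interleave the two fields into one frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

assert rebuilt == frame   # lossless round trip, equivalent to 1080p25/30
```

The round trip is lossless only because both fields were sampled from the same instant; with 50/60fps content each field is a different moment in time, which is why that case reduces to 540 visible lines per moment.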

Quote:

Originally Posted by Joe Bloggs

Isn't most live TV and soaps exceptions too?

Not that I have seen. That's typically 25/30fps. (depending on whether it's 50/60Hz)


Yes, they are 25 frames per second, but in general live TV/soaps (though not all) are 50Hz not 25Hz (ie. not like 25p).

Here's a link to a wikipedia article that shows things it says have been "filmized" (ie. given a 25p look in post), and ones that stopped being filmized, though it might not be totally accurate. I think the BBC would rather things are shot in 25p mode or interlaced mode, and not have the "film" effect (ie. 25p look) added in post.

The perfect standard would have been 960p. Then all the old 480i content would look as good as you could get it.

I'm not saying 1080i doesn't look more detailed than 720p for static material--all I know is this:

When I first got my Panasonic 50-inch plasma in April of 2006 it was right before the satellite companies started all the over compressing and down-rezzing.

HD TV looked fantastic! Once the down-rezzing started it hasn't looked as good since.

I went to OTA and it looked great--then when TV went digital the stations started having one or two sub channels.

Now I don't know if that had anything to do with it or not, but OTA didn't look as good then. When you have subchannels and the main channel has less bandwidth, is it supposed to look worse?

Now for motion sports I do think that 720p is better when things are in motion, but when there isn't much motion happening then 1080i is better--at least that's what I see on my 1366x768 Panasonic plasma TV.

I think FiOS is better than the satellite providers.

I think that the only thing that really looks great is Blu-Ray.

If it were up to me and I ruled the universe there wouldn't be a million channels showing garbage quality signals.

I'd say you don't need more than 100 channels at the most. I would make down-rezzing against the law! I would make interlacing against the law! I'd make subchannels against the law! And I'd implement a 960p/72 standard.

Many people would hate me because I would make them watch great quality TV!

Yes, they are 25 frames per second, but in general live TV/soaps (though not all) are 50Hz not 25Hz (ie. not like 25p).

I suppose it depends what you are watching. I don't watch soaps, but virtually anything else I've seen broadcast (including live content) has been film-type content and not video-type. Certainly the examples posted here of films, and drama are film-type content which should deinterlace into 1080p.

But it still remains that 1920×540p has higher resolution than 1280×720p and only requires upscaling in one dimension rather than two. (upscaling degrades image quality in most circumstances)
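The arithmetic behind that claim, with nothing assumed beyond the format geometry:

```python
# Raw sample counts: one 1080i field versus a full 720p frame.
field_1080i = 1920 * 540   # 1,036,800 samples per field
frame_720p = 1280 * 720    # 921,600 samples per frame

print(field_1080i > frame_720p)  # True: even a single field carries more samples
```

And on a 1920-wide native panel, the field needs scaling only vertically (540 to 1080), while 720p must be scaled in both dimensions.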

Quote:

Originally Posted by Artwood

The perfect standard would have been 960p. Then all the old 480i content would look as good as you could get it.

Are you saying that broadcasts should be 960p, or that the whole 1080p standard should have been replaced with 960p? (presumably 1440×960)

More resolution is always better, and I would have taken 1080p with its square pixels any day. Don't forget that 1080i was around for a long time before HD broadcasts or HD-DVD/Blu-ray, and SD content was anamorphic, so it would require complex scaling anyway, rather than simply pixel doubling 720×480 to 1440×960. Widescreen content would still require being scaled to 853×480 (1707×960) or 1024×576. (2048×1152)

And I would argue that modern video scaling algorithms are better than simply pixel-doubling, especially if you have a lot of extra resolution to work with. Pixel doubling is sharp, but it's also very aliased and makes the flaws in the source much more apparent.
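A minimal illustration of pixel doubling and why it aliases, using a toy 2x2 image in plain Python (the helper name is made up for the example):

```python
src = [[1, 2],
       [3, 4]]

def pixel_double(image):
    """Nearest-neighbour 2x upscale: every sample becomes a 2x2 block."""
    doubled = []
    for row in image:
        wide = [px for px in row for _ in range(2)]  # double horizontally
        doubled.extend([wide, list(wide)])           # double vertically
    return doubled

# Every edge in the source becomes a hard stair-step; an interpolating
# scaler would instead blend neighbouring samples for smoother gradients.
print(pixel_double(src))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```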

Quote:

Originally Posted by Artwood

Now for motion sports I do think that 720p is better when things are in motion, but when there isn't much motion happening then 1080i is better--at least that's what I see on my 1366x768 Panasonic plasma TV.

To be fair though, with a 1366×768 panel, you are not seeing the full 1920×1080 resolution, and if it's a 2006 panel, it is probably not deinterlacing 1080i into anything more than 1920×540p. (if it's even doing that—it may be trying to deinterlace to 1080 and doing it incorrectly)


Art, it's time for a full 1920x1080 display! The disparity you already see between 720p and 1080i will be even greater.

You are correct though, compression and multicasting have negatively impacted the pristine nature of HD that was not uncommon years ago. There is still some great broadcast HD to be found, but it's harder to find now than it used to be.

I can well remember the stunning HDNet broadcasts when Cuban began that channel!

The problem is you guys didn't provide any. This wasn't all about resolution; it was about color rendering, smearing and artifacts, and I provided all the test results and links. I have clearly proven, from demonstrable links and tests, that I have won this debate. You can twist and create false scenarios, and that's all you've done. Can all those links I provided be wrong versus some forum that couldn't lose a debate gracefully?

LOL! Getting the last word in doesn't mean anything. The test links and evidence are in. BTW, the Pioneer plasmas they used did pass the full 1080p conversion test. If you look back, I provided the 2006 list of TVs that fail the deinterlace test, and Pioneer is #1. So another one of your Hail Mary scenarios shot down by my verifiable links. Oh, and once again, the 720p signal beat 1080i. That's the evidence. I presented the evidence of Kell and other factors. Once again, 720p WON at all bitrates.

To better understand the issues about interlaced-versus-progressive scan, we consider in the following sections the idealised spectrum of the three HDTV formats: 1080p/50, 1080i/25 and 720p/50. In the following diagrams we apply on the three axes: cycles per vertical resolution (picture height), cycles per horizontal resolution (picture width) and a temporal resolution (cycles per second). We assume for the following consideration the impact of a vertical Kell factor and an interlace factor applied to the vertical resolution, although it can be assumed that the low-pass characteristic of the HDTV system will also cause a reduction in realisable horizontal resolution. Therefore we denote the Kell factor as Kv (v for vertical) and the interlaced factor with a capital "I".

The Kell factor Kv is defined as the ratio of the number of perceived lines to the number of total active video lines and usually has a value of 0.7 [1]. This factor was based on CRT measurements, and ideally would be measured in a non-CRT environment. The interlaced factor is field-rate dependent and is given in the literature [2] between 0.6 and 0.7. We use here a factor of I = 0.7 for 50 Hz field rates.

As we can see from Fig. A1, the 1920 x 1080p format has a larger horizontal and vertical resolution than the 1280 x 720p format. When a Kell factor of Kv = 0.7 is applied, both formats suffer from a reduction of vertical resolution.

In Fig. A2 we show the 1920 x 1080i format and the impact of interlace which results in a gradual reduction of vertical resolution with movement, caused by subdividing a single frame into two fields (interlaced). Fig. A2 (right) shows the 1080i/25 format with a Kell factor of Kv = 0.7 and in addition the interlace factor I = 0.7 caused by incomplete cancellation of the fields (interline twitter). Both factors further reduce the available vertical resolution of the format.

In Fig. A3 we show the idealised spectrum of the 1920 x 1080i format with Kell and interlaced factor compared to the 1280 x 720p signal with Kell factor....

From the considerations given in this Appendix we can conclude the following:

The spectral distribution of a 720p/50 and a 1080i/25 signal is basically similar in spatio-temporal volume;

The 720p/50 signal should provide better movement portrayal and the 1080i/25 system should provide more detail via the higher horizontal resolution;

Kell and interlaced factor both “reduce” the available resolution while the interlaced factor reduces the vertical resolution of the 1080i/25 signal. Considering all factors, a 720p/50 signal seems to have more advantages than a 1080i/25 signal;

.....
If 1920x1080p/50 format production is not available, and the programme content has very little movement (i.e. with movies), the highest potential viewer quality will be achieved for viewers with 1920x1080p/25 production and 1920x1080psf/25 delivery. This will deliver the best quality for 'drama'.

--
I don't necessarily agree that it would always provide the best quality for drama, since 1080psf/25 wouldn't give the best motion quality (ie. interlaced (50i) and 720p50 can both give better motion quality than 1080psf/25). But for existing content (ie. nearly all existing movies) that has already been shot in a 24p/25p format, I think it's correct that this should (likely) be the best way (ie. better than 720p50) to broadcast that content to consumers' TVs, given sufficient bitrate.

* Though none of the EBU's tests back around 2006 used 24p/25p content (ie. movies).
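Plugging the quoted Appendix's own numbers into its model shows why it calls the two formats "basically similar". A sketch, using only the Kv and I values given in the quoted text:

```python
Kv = 0.7   # vertical Kell factor, per the Appendix
I = 0.7    # interlace factor for 50 Hz field rates

# Perceived vertical resolution under the Appendix's model:
v_720p = 720 * Kv        # 720p suffers only the Kell factor: ~504 lines
v_1080i = 1080 * Kv * I  # 1080i suffers Kell AND interlace: ~529 lines

print(round(v_720p), round(v_1080i))
```

So even after both penalties 1080i keeps a small vertical edge on paper, which is consistent with the Appendix's conclusion that the advantage of 720p/50 lies elsewhere (motion portrayal and compression behaviour), not in a resolution blowout.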

I didn't go through all 49 pages of the discussion, so I hope you have the patience to answer this question about backing up your BR discs to, let's say, MKV files (x264):

Given:

1) you use the same BR Disk (movie) as source.
2) you use the same audio settings for all files (audio info will be the same quality and size on all files).
3) you use the best video quality settings possible for the resulting file size.
4) you have a 1080p native panel that scales 720p content beautifully.

Should a 5GB 720p movie look better than a 1080p one of the same size (and why)?

If yes, approximately what file size would the 1080p movie need to be to have better image quality than the 720p one?

And one last question regarding a real-life case (don't ask where the info is taken from):

Of these examples of the same movie, which one should look better (why)?
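For a sense of scale on the first question, here is the bitrate budget the 5 GB figure implies; the 2-hour runtime and 640 kbps audio track below are hypothetical stand-ins, not from the post:

```python
# Rough bitrate budget implied by a 5 GB file.
size_bits = 5 * 1024**3 * 8      # 5 GiB expressed in bits
duration_s = 2 * 60 * 60         # assumed 2-hour runtime
total_kbps = size_bits / duration_s / 1000   # ~5965 kbps overall
video_kbps = total_kbps - 640                # ~5325 kbps left for the video track

print(round(total_kbps), round(video_kbps))
```

Whether ~5.3 Mbps serves a 1080p encode better than a 720p one at the same budget depends on the source and the encoder settings, which is exactly what the replies below get into.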

It's hard to say really, because you are using CBR encoding rather than VBR, which is massively wasteful, and the "720p" file has a comparatively higher bitrate.
Depending on the downsampling used, a lot of high frequency detail is probably getting filtered out of the 720p image as well, which makes it easier to compress.
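The "wasteful" point can be seen with a toy allocation model; every number here is made up purely for illustration:

```python
# Three scenes of different difficulty share one fixed bit budget.
complexity = [1.0, 4.0, 2.0]   # hypothetical per-scene difficulty
budget = 7000.0                # total kilobits, arbitrary

cbr = [budget / len(complexity) for _ in complexity]       # CBR: equal split
vbr = [budget * c / sum(complexity) for c in complexity]   # VBR: proportional

# Crude quality proxy: bits spent per unit of scene difficulty.
q_cbr = [b / c for b, c in zip(cbr, complexity)]
q_vbr = [b / c for b, c in zip(vbr, complexity)]
# CBR lavishes bits on the easy scene and starves the hard one,
# while VBR holds the quality proxy constant across all three.
```

Real rate control is far more sophisticated than this, but the principle is the same: at a fixed total size, letting the bitrate follow scene complexity raises the quality floor.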


Thanks for the quick response, but that was just a real-life example, as you can see. I'm also interested in my initial two questions; can you comment on them?

Here they are:

Quote:

Originally Posted by leoajax

Given:

1) you use the same BR Disk (movie) as source.
2) you use the same audio settings for all files (audio info will be the same quality and size on all files).
3) you use the best video quality settings possible for the resulting file size.
4) you have a 1080p native panel that scales 720p content beautifully.

Should a 5GB 720p movie look better than a 1080p one of the same size (and why)?

If yes, approximately what file size would the 1080p movie need to be to have better image quality than the 720p one?

How much better is motion resolution for 720p versus 1080i material on a 1080p set? I'm sure a lot of it has to do with how good a processor the TV has. Would an outside processor help?

Is 720p better when it comes to motion but only at fast speeds? In other words, do you only see its benefits when watching NASCAR, hockey, or when someone passes the football and it is in flight?

Remember, all this gets reduced to real-world performance of the display technologies. Some, like DLP, have little or no inherent motion distortion to begin with. I watch a lot of football. NBC is 1080i, and the local OTA is filled with lots of ugly MPEG compression artifacts that negatively impact motion and detail. My local CBS is 1080i with little or no compression artifacts for the most part, with great detail, and motion is superb. ABC and ESPN at 720p have decent motion, as you would expect with a progressive signal, but the image is not particularly detailed and lacks the eye-candy effect of the 1080i broadcasts. On the other hand, NBC Sports Network at 1080i does a lot of hockey, and the puck motion is very good; on the content I have seen, compression artifacts are minimal.

My guess is that in the abstract we can argue about the technical merits of one approach versus the other, but in reality it, as always, comes down to implementation. We seldom if ever get the best a particular resolution is capable of from its source provider, and then we have to deal with the processing and display weaknesses or strengths of the individual display. So basically this argument is fine hypothetically, but in reality it all comes down to the junk they send and the junk we play it back with.

I have a 2006 Panny 1366x768 plasma panel. This older device will not take input at the panel's native rate. You would think that sending 720p would produce the best result when upscaled, but the processor in the panel works really well when sent 1080i; 1080p looks just as bad as 720p. Go figure! I am using a DVDO Duo to output to the plasma and a Radiance to the 92" DLP.

Last week I was at Costco and they were showing clips from the Ravens' recent Super Bowl win. The motion on one particular mid-size Sony panel was so bad the ball carrier's legs were smearing together. That might have been the Jacoby Jones kick return for a touchdown; I can't recall, as I lost the context. I was stunned, as I never remember seeing motion look that bad on a modern display. I saw that game at home on the big 92" Mits and it was a stunning broadcast with no motion or compression artifacts that I noticed.

Just a quick question: one day I will buy a home theater and have a projection screen, about 150" to 180".

Will it be better to rip at 720p, so I can save space?
Or should I just rip at 1080p now, so that in 5 years' time when I get my screen, I have them in the highest res?
Or does it not matter? It seems in these 720 vs 1080 debates the outcome is pretty much "not much different" unless you ARE really picky.

But those comparisons were for TVs with 40" or 50" screens.

I would like to know about projection screens.

I'm not interested in audio, as I will rip at the highest quality possible.

You need to factor in both resolution and bit rate. A 700MB file of anything looks fine on my tablet, but that's about it; it looks horrible on even a 40" TV. But that is just my opinion, and you have to decide for yourself what still looks OK. I rip only at full res and would like to think anyone with a 100"+ screen would want the same :-)

Well, last I checked, 4K is not available to everyone at that price tag, and there is not even enough content to view on it. So what you are saying is, Blu-rays in the next 5 years will be re-released in 4K?

You are planning for 5 years from now? I suggest taking a rest and moving on to other things for the next 4 years. Nobody can predict what things will be like in 5 years. Speculation is that 4K will be old by then, 8K will be quite new, and displays will be printed like wallpaper.


Yep, and OLED will be out any day for the masses, like they told us 5 years ago.

I have a question regarding the grid lines in 1080p, ie the space between pixels.

I've just upgraded from an Optoma HD70 (720p) to an HD25 (1080p) projector. I'm kind of disappointed because it didn't give me the image-quality jump I had expected. With a million more pixels, I was expecting a much smoother, less pixelated picture.

I can see that the pixels got much smaller, but the grid lines seem more prominent. They didn't seem to shrink much in width at all, which leads me to wonder whether the space between pixels gets reduced proportionately with the reduction in pixel size from 720p to 1080p.

If this space doesn't shrink by as much, then we are actually getting more space used up by the grid lines relative to the actual image. Going forward to 4K resolution, would it get worse, with 4x more grid lines than 1080p?
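A back-of-the-envelope way to frame that worry, under the (hypothetical) assumption that the gap stays a fixed physical width while the pixel pitch shrinks; the 1000 mm image width and 0.05 mm gap are made-up numbers:

```python
# Fixed-gap model: what fraction of each pixel pitch is eaten by the gap?
width_mm = 1000.0   # assumed projected image width
gap_mm = 0.05       # assumed constant inter-pixel gap

def gap_fraction(h_pixels):
    pitch = width_mm / h_pixels   # centre-to-centre pixel pitch
    return gap_mm / pitch         # share of each pitch lost to the gap

f_720 = gap_fraction(1280)    # ~6.4% of the pitch is gap
f_1080 = gap_fraction(1920)   # ~9.6%
f_4k = gap_fraction(3840)     # ~19.2%
```

So if the gap really were fixed, the screen-door fraction would scale with pixel count, as the question fears. In practice panel and projector makers shrink the gap along with the pitch (fill factor varies by imaging technology), so the measured effect should be smaller than this worst case.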