I've been pondering getting a video processor and was wondering something: if I'm watching a movie off cable in 1080i/60, and the processor de-interlaces it, does reverse pulldown, and outputs 1080p/24, shouldn't that be the same quality as the same movie on Blu-ray? I mean, setting aside Comcast compression issues, shouldn't the processor be able to get real close? Or is it just not the same?

Not really. See, you are only talking about one aspect of the image: resolution. Though you can do what you mention and output the same format as a Blu-ray, there are many other factors that come into play that affect the image even more. For instance, Blu-ray uses WAY less compression than broadcast, and that's a BIG problem for satellite/cable signals. Blu-ray also has a higher signal-to-noise ratio, etc.

I have done it, and yes, it does work. A lot of HD TV material is 24 fps. Depending on the video processor, it can automatically pick out the cadence of the source material. You have to have the VP output 24/48/72 fps to take advantage. I have a VP50 set to output 1080p/48 or 1080p/60. I do get lazy and just leave it at 48 fps most of the time. I get some motion artifacts this way, but I don't watch much TV on my HT projector.
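
For anyone curious what "picking out the cadence" amounts to: with 3:2 pulldown, every film frame becomes alternately three or two interlaced fields, so an exact field repeat shows up every five fields, and that repeat pattern is what a processor locks onto. Here's a toy sketch of that detection (a hypothetical illustration, not the VP50's actual algorithm):

```python
# Toy model of 3:2 pulldown and cadence detection. A 24 fps film frame
# becomes alternately 3 or 2 interlaced fields, so every 5th field is an
# exact repeat of the field two positions earlier; spotting those repeats
# is how a processor locks onto the cadence.

def pulldown_3_2(frames):
    """Turn progressive frames into a 60i field stream (labels only)."""
    fields, top = [], True
    for i, frame in enumerate(frames):
        n = 3 if i % 2 == 0 else 2          # alternate 3 fields, 2 fields
        for _ in range(n):
            fields.append((frame, 'top' if top else 'bottom'))
            top = not top                   # field parity strictly alternates
    return fields

def detect_repeats(fields):
    """Indices where a field exactly repeats the previous same-parity field."""
    return [i for i in range(2, len(fields)) if fields[i] == fields[i - 2]]

fields = pulldown_3_2(['A', 'B', 'C', 'D'])
print(detect_repeats(fields))   # -> [2, 7]: a repeat every 5 fields = 3:2 lock
```

A real processor compares field differences against a noise threshold rather than testing exact equality, and it has to keep re-checking the lock, since broadcast edits can break the pattern mid-stream.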

1. The movie is converted from 1080p24 to 1080i, and the provider sends the resulting 1080i to the cable box. I configure the cable box to produce 1080i output and use a video processor to convert it to 1080p24.

2. The provider sends the movie as 1080p24 to the cable box. I configure the cable box to produce 1080p24 output (and do not use a video processor).

What difference would you see in the final output?

Assuming the deinterlacing and cadence detection are correct, none. I've already stated this would be the same.

Quote:

Originally Posted by me

It may be in terms of resolution and framerate, but the list basically ends there.

The OP shrugged off compression as if it shouldn't be that big of a difference. I'm stating that yes, in fact, it will be. Cable bitrates are terrible relative to BD.

I use a Lumagen HDQ and watch movies set to 1080p/24. It works great with my DirecTV HD DVR. The picture is nowhere near what my Blu-ray player outputs, but in most cases it looks better and sharper than a DVD. It will depend on the channel you are recording or viewing the movie from. Some channels have more bandwidth and use a higher bitrate, which produces a better picture, but even the better channels vary in quality.

Huge difference. HD broadcast gets around 8 Mbit/s if you are lucky; Blu-ray can run up to 40 Mbit/s for video (48 Mbit/s for the whole stream). Blu-ray has a lot more data.
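
To put rough numbers on that gap over a two-hour movie (ballpark figures, not measurements of any particular channel or disc):

```python
# Rough arithmetic on what the bitrate gap means over a 2-hour movie.
# ~8 Mbit/s is a typical cable/broadcast figure; Blu-ray video can run
# up to ~40 Mbit/s. Both numbers are ballpark assumptions.

def gigabytes(mbit_per_s, seconds):
    """Total data at a constant bitrate: Mbit -> MB -> GB."""
    return mbit_per_s * seconds / 8 / 1000

runtime = 2 * 60 * 60                        # two hours, in seconds
print(f"cable:   {gigabytes(8, runtime):.1f} GB")    # cable:   7.2 GB
print(f"blu-ray: {gigabytes(40, runtime):.1f} GB")   # blu-ray: 36.0 GB
```

Roughly a 5x difference in data for the same runtime, before codec differences even enter the discussion.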

I know 1080i is stunning as compared to 480i, but I would like to mention one fundamental difference between 1080i and 1080p. That is that with 1080i the video processor only has 540 lines per frame to work with and 1080p is just that, the full 1080 lines. Deinterlacing is not an exact science and artifacts are to be expected when you go from 540 to 1080. Just my 2 cents.

No, it has 540 lines per FIELD; two fields make up a frame. So there are still 1080 lines per frame in 1080i. The difference is that the two sets of 540 lines are separated in time by 1/60th of a second. In a still image, or an image that does not move during that time, 1080i can be converted back to 1080p quite well. It's when something moves between fields that deinterlacing is difficult, if not impossible.
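
The static-image case above is just a "weave": interleave the two fields and you get the original 1080-line frame back exactly. A minimal sketch with toy 4-line frames (a hypothetical illustration, not processor code):

```python
# Weave deinterlacing in miniature: two half-height fields recombine
# into one full frame losslessly when nothing moved between them.

def split_fields(frame):
    """Top field = even-numbered lines, bottom field = odd-numbered lines."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave the two fields back into a full progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame += [t, b]
    return frame

still = ['line0', 'line1', 'line2', 'line3']
top, bottom = split_fields(still)
assert weave(top, bottom) == still      # static image: perfect round trip
```

With motion between the fields, weaving combs the image, which is why processors fall back to interpolating within a single 540-line field for moving areas; that's where the real resolution loss happens.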

The bitrate can make a difference in picture quality, but the bitrate does not really make a difference in the process of converting from 1080i to 1080p24. The process is the same regardless of whether the bitrate is higher or lower.

Thanks much for the replies guys, I think the DVDO Edge is going to be the ticket.

Converting film sources to 1080p24 should in theory be perfect, but in reality errors often occur, producing intermittent jerkiness or stuttering in the image. This is especially true with broadcast sources. Even if this artifact is infrequent, it can be pretty annoying depending on your tolerance for it.

The best performance at this I've seen so far is that of the Realta-based Denon 602ci.
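
That intermittent stutter usually traces back to cadence-phase breaks: broadcast material is edited in the 60i domain, so the 3:2 pattern can shift mid-stream, and a reverser still locked to the old phase briefly pulls the wrong fields. A toy model of the failure (hypothetical, not how any shipping processor actually works):

```python
# Why broadcast 24p extraction stutters: reversal assumes a steady 3:2
# phase, but edits in broadcast material shift that phase mid-stream.
# A fixed-phase reverser then samples the wrong fields until it re-locks.

CADENCE = [3, 2]   # fields emitted per film frame, alternating 3 and 2

def pulldown(frames):
    """Expand progressive frames into a 3:2 field stream (labels only)."""
    fields = []
    for i, frame in enumerate(frames):
        fields += [frame] * CADENCE[i % 2]
    return fields

def reverse_fixed_phase(fields):
    """Assume the stream starts on a '3' group; take one frame per group."""
    out, i, group = [], 0, 0
    while i < len(fields):
        out.append(fields[i])
        i += CADENCE[group % 2]
        group += 1
    return out

clean = pulldown(['A', 'B', 'C', 'D'])
print(reverse_fixed_phase(clean))        # ['A', 'B', 'C', 'D'] - correct

edited = clean[2:]                       # an edit drops two fields: phase shifts
print(reverse_fixed_phase(edited))       # ['A', 'C', 'C'] - B lost, C repeated
```

The dropped and repeated frames in the second case are exactly the intermittent judder described above; processors differ mainly in how quickly and cleanly they detect the break and re-lock.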

That model looks the business and at $2500 MSRP is certainly priced like it. I'm going to dig into the reviews for some hints at how the DVDO EDGE performs by comparison.

If the cheaper unit does introduce substantial errors like you describe, I'll have to decide which is worse: consistent telecine stutter or inconsistently converted 24p. Either way the motion performance won't be perfect.

"... we won't be stopping plasma production any time soon. We see it going on for another ten years." -- Kevin Lee, VP, Smart TV Partnerships (Samsung), 1/7/11