The topic reminded me of my test with PDPs. Just reminiscing about the results and how it shaped my thoughts. I find people in this argument tend to read and interpret from a defensive position. It is interesting. I also am interested in how many AVSers stick to the assumption that HVS does not vary widely. Not suggesting you do.

I'm not defensive; I have no stake in this fight. I'm just not a big fan of dismissing things that are improvements just because better improvements exist elsewhere. It's very weird that at AVS we had to endure more than a year of people insisting upscaled DVDs were some sort of major picture quality advance and now we can't get consensus that actual 4K is better than 2K. And why? Because someone has produced a chart explaining that your visual acuity prevents you from recognizing this?

I remember vividly looking at HDTVs in the early part of the 2000s. The difference between 720p sets and 480p sets was clear well beyond the range at which you could detect pixels. The picture simply looked better. I used to have a friend who ran a Magnolia and we'd hang out some and ask regular customers "Which of those is better?" from a distance at which no one could possibly be considered "in range" of a 42-inch set. The 720p sets (or 768, whatever it was) dominated.
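For what it's worth, the chart being invoked in these debates is usually just the one-arcminute acuity rule, and it's easy to run the numbers yourself. A rough sketch in Python; the function and the 1-arcminute-per-pixel figure are assumptions of the standard 20/20 model, nothing more rigorous:

```python
import math

def max_viewing_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
    """Farthest distance (in feet) at which adjacent pixels can still be
    resolved, assuming ~1 arcminute of visual acuity per pixel (20/20)."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_pitch_in = width_in / horiz_pixels
    one_arcmin_rad = math.radians(1 / 60)
    # small-angle approximation: distance = pixel size / subtended angle
    return pixel_pitch_in / one_arcmin_rad / 12  # inches -> feet

for label, px in [("480p", 854), ("720p", 1280), ("1080p", 1920), ("4K", 3840)]:
    d = max_viewing_distance_ft(42, px)
    print(f'42" {label}: pixels resolvable within ~{d:.1f} ft')
```

By this model a 42" 1080p set resolves individual pixels out to only about five and a half feet, yet the showroom comparisons described above were won from well beyond that, which is exactly the point: per-pixel acuity isn't the whole story.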

Quote:

Originally Posted by Chris5028

Rogo, in your opinion, will there be a 4k LED TV for sub $2000 that has comparable PQ to a VT60 within 3 years?

Probably. The 50, 55 and 60-inch Vizio P series for 2014 already meets your pricing criteria and will -- in some ways -- eclipse the VT60. It won't in many ways, but at least 2 more versions of it will ship within 3 years. I expect Vizio will exert severe pricing pressure on absolutely everyone making TVs for the U.S. market. Locally dimmed, 4K sets for under $2000 in 2014 are a certainty. A tweaked version of them beating the VT60 by 2017? Seems almost as certain.

Quote:

Originally Posted by Chronoptimist

It is surprising how good 480p can look with a high quality source and good image scaling. The high compression and poor mastering techniques common with DVD (bad MPEG2 encoders, lots of sharpness and noise reduction added etc.) made it look a lot worse than it could.

...

I'm curious to know if you have an opinion about displaying 1080p content on a 4K display. Personally, I think it can look better than 1080p on a 1080p display - but only if manufacturers avoid using horrible upscaling techniques. (I'm looking at you, LG...)

So here's the reality. Using the bits for better quality rather than more pixels is a drum I've beaten here for a long time. In the brief discussion of 4K Blu-ray some time back, I was pretty adamant that if they try to jam it into a dual-layer disc, it will fail to be gigantically better than Blu-ray and will render the entire effort to introduce a new disc format moot. Seeing Netflix stream 4K at 15 megabits makes me pretty confident that for a 4K format to meaningfully separate itself from streaming, it will need to be more like 30 megabits.

But what if we took 15 megabits and streamed 2K? Well, even under H.264, we could make gigantic progress, bringing Vudu/iTunes very close to Blu-ray quality. If we switched 1080p encoding to H.265 and used multipass encoders and such, I wouldn't be at all surprised to learn that kind of bitrate could outpoint what you find on most Blu-rays... In fact, it might not even take 15 megabits to do that.
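To put rough numbers on the bits-versus-pixels tradeoff, here's a quick back-of-envelope calculation. Bits per pixel isn't a quality metric by itself (H.265 spends bits far more efficiently than H.264, and the 24 fps film assumption is mine), so treat this strictly as illustration:

```python
def bits_per_pixel(mbps, width, height, fps=24):
    """Average encoded bits available per pixel per frame at a given bitrate."""
    return mbps * 1_000_000 / (width * height * fps)

streams = [
    ("Netflix 4K @ 15 Mbps", 15, 3840, 2160),
    ("Same 15 Mbps spent on 1080p", 15, 1920, 1080),
    ("Typical Blu-ray @ 30 Mbps", 30, 1920, 1080),
]
for name, mbps, w, h in streams:
    print(f"{name}: {bits_per_pixel(mbps, w, h):.3f} bits/pixel")
```

At 15 megabits, 4K gets exactly one quarter of the bits per pixel that the same stream would give 1080p, which is why the extra pixels alone don't guarantee a better picture.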

But the problem with all this analysis is that unless Apple or Vudu or Netflix or Amazon decides to care, it will never happen. Apple could push H.265 with a new AppleTV and just decide to have great movie quality that was bit adaptive. It could go 10-20 megabits depending on connection quality. But will it find this important? Apple's TV efforts are frustrating as hell. They could announce, "We will be spending $1 billion over the next 24 months making AppleTV truly awesome" and it's pocket change. Instead, we get a few more channels -- something nice, but nothing important.

In short, I feel like this is tilting at windmills. People think pixels = better because an industry conditioned them to believe that. Now it sells them pixels.

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working. (Oh, and plasma didn't die because of logistics problems, nor does OLED ship in big boxes because it comes from Korea.)

Quote:

Originally Posted by rogo

I'm not defensive; I have no stake in this fight. I'm just not a big fan of dismissing things that are improvements just because better improvements exist elsewhere. It's very weird that at AVS we had to endure more than a year of people insisting upscaled DVDs were some sort of major picture quality advance and now we can't get consensus that actual 4K is better than 2K. And why? Because someone has produced a chart explaining that your visual acuity prevents you from recognizing this?

To be honest, I think the confusion is people thinking the discussion is about UHDTV not being better than 1080p (like you say above).

We're not saying the "technology itself is not better" or a higher resolution is not "better" as it technically should be from an evolution standpoint.

It's about what the human eye can resolve with minute details and if it's even worthwhile to get.

My prediction was about what will happen with OLED; the comments about me personally feeling UHDTV is a gimmick are my own view and were not meant to start a debate again about it; they were just part of my talking points. As suggested earlier, the discussion on resolution should be kept to the other thread.

I will have to politely disagree with you about a TV coming out in the next 3 years to trump the VT/ZT60. While local dimming can be quite effective in higher-end models, it is still getting mixed results, and in essence the technology itself does not allow it to reach the level of black or PQ that plasma offers. It also does not take away the response-time issues. While I'm excited about Panasonic's prototype, which is said to equal their plasmas, I'm very skeptical it will make it to market, and if it does, will it be as good? Why dump tons of money into LCD/LED to simulate the black levels of plasma, which did it naturally?

If anything, in 3 years we will have numerous UHDTV sets selling like hotcakes with "good/very good" black levels due to local dimming; the black levels sought after by enthusiasts or people demanding top PQ will be put on hold until OLED comes out. We hardly affect their bottom lines. The interest in achieving those blacks won't make much sense, as the average consumer doesn't care and the big boys know it.

Why try to make LCD/LED match a plasma in blacks if plasmas are gone? Why make a technology which is not emissive simulate it? Why even waste money "trying" to get to that benchmark when the superior tech (plasma) that offered it is dying? It just doesn't make sense, especially with them all working quietly on OLED, where they know black levels are close to perfect.

I feel OLED is technically the only way to get to where we want to be in trumping plasma for PQ, as OLED and plasma share many more similarities than OLED and LCD/LED do (both being emissive).

Then again in 8-10 years, we may have some other crazy tech, which is cheaper to produce and OLED never comes to fruition.

I had a Sharp Elite. It was quite disappointing in my setup with U-verse. The Sharp's problems with color manifested themselves as unnatural skin tones. When the source was good, the picture was good, but if the source was bad, faces were either washed out and a bit greenish or sunburn pink.

Black levels were good, but their local dimming algorithm would occasionally cause the backlight to flicker. You would see shadows move across the screen with nothing causing them. It lasted for only a moment and was gone, but once you noticed it there was no going back.

So if I had the choice, it would be a display with good black levels which require no special video processing to accomplish. I'm not excited by local dimming where the display has to determine how to set the backlight from the RGB data, and I'm not excited by technologies that use subpixels other than RGB. Cameras record RGB. Manufacturers seem to have great difficulty processing just that and adhering to the video standards. To me, adding extra processing requirements is just more opportunity to introduce inaccuracy and bugs.
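For anyone curious what "the display determining how to set the backlight from the RGB data" amounts to, here is a deliberately naive toy sketch, not any manufacturer's actual algorithm: each zone's LED is driven to the brightest pixel it covers, and the LCD pixel values are boosted to compensate.

```python
def local_dimming(frame, zones_x, zones_y):
    """Toy full-array local dimming pass: drive each backlight zone to the
    brightest pixel it covers, then boost the LCD pixel values to compensate.
    `frame` is a 2D list of luminance values in [0, 1]."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // zones_y, w // zones_x
    backlight = [[0.0] * zones_x for _ in range(zones_y)]
    for y in range(h):
        for x in range(w):
            zy = min(y // zh, zones_y - 1)
            zx = min(x // zw, zones_x - 1)
            backlight[zy][zx] = max(backlight[zy][zx], frame[y][x])
    # LCD compensation: pixel / zone backlight (clamped to avoid divide-by-zero)
    compensated = [
        [frame[y][x] / max(backlight[min(y // zh, zones_y - 1)][min(x // zw, zones_x - 1)], 1e-6)
         for x in range(w)]
        for y in range(h)
    ]
    return backlight, compensated
```

Even in this toy version the failure modes described above are visible: one bright pixel drags a whole dark zone up (haloing), and zone maxima changing from frame to frame is exactly the kind of thing that shows up as moving shadows or flicker.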

I heard about the color error that was sadly never resolved. I simply can't sacrifice black levels after being spoiled by 0.0011 ftL. That said, I'm not sure I could tolerate additional artifacts either, so I may have no choice but to compromise in the next 5 years. OT but does anyone know the definitive word on the # of zones incorporated on the new FALD sets from Sony and Toshiba (Vizio purportedly has 384) versus the Sharp Elite? I've seen a lot of discussion, but most of it seems speculative.

I know all of us are eager for OLED, as am I. For OLED to be successful, it needs some type of reasonable market penetration. I am the local know-it-all when it comes to questions of audio and video, and I use the term "know-it-all" as I consider myself simply someone who likes to stay informed with some reasonable level of knowledge about what I am talking about.

In the past, when someone asked a question concerning the purchase of a new display, I typically directed them to a plasma with my basic opinion of the three companies that were actively involved in the technology. I often received a counterpoint from my friend, co-worker or acquaintance that while shopping at some big box store, the well-informed salesperson had directed them to an "LED" television set, as plasma was outdated and would soon be discontinued (they were finally right). I gave up long ago on correcting anyone on the actual existence of "LED" television sets.

I believe that most of these people, and, my guess, most of the buying public, pay almost zero attention to the letter "O" in "OLED". I get the feeling that when most people see a television set at a big box store costing closer to $10,000 than $1,000, they are not making any connection at all that there is a different technology involved, and other than the curved screen, they are not paying a whole lot of attention to the differences in picture. I believe that Sharp suffered from this with their Elite series from three years ago. I know that my local BB store still has these units in its inventory and had them for lower prices than what I paid for my new ZT60 last summer.

Unless something happens to educate the consumer, who basically does not care to be educated, OLED will not receive any considerable market penetration until the price is within easy striking distance of the other "LED" televisions.

I know that these points have been discussed regularly by Rogo and others from this forum. If someone has commented about the lack of differentiation of LED and OLED to an average consumer, I apologize for restating this. It seems that some of the big manufacturers and big box stores may have shot themselves in the foot with their marketing of LCDs over the past four or five years since the introduction of LED backlighting.


1. Large-screen displays are going to be very common in the future, and we will need 4K resolution or higher to accommodate them; this is a normal part of display evolution.

2. Plasma displays in the past have offered better black levels than LED/LCD, true; however, once they introduced full arrays with local-dimming zones, you're talking about a completely different story.

3. Plasma displays have their own issues as well; remember the many reasons why plasma failed in the eyes of the consumer? Well, now, unfortunately, it's dead. Many people don't want to have to worry about screen burn, image retention, buzzing, humming if you're too far above sea level, higher energy consumption, heating up the room, etc. True, many of these issues were resolved later on down the road, but not in time in the eyes of the consumer anyway. Don't get me wrong: I love plasma displays, and I own a 55" Panasonic ST60 in my bedroom and just love it, especially the off-axis viewing compared to my Elite.

Anyway, back to the OLED topic: nothing would make me happier than to purchase a 100" OLED display down the road; I really want one. But it's looking rather grim right now; we seem to have entered some kind of dark ages for display technology.

Never had any of these issues with my 70" Elite, but I heard the later generations unfortunately had many problems, all over quality control; otherwise the model might still be available today.

Quote:

Originally Posted by vinnie97

I heard about the color error that was sadly never resolved. I simply can't sacrifice black levels after being spoiled by 0.0011 ftL. That said, I'm not sure I could tolerate additional artifacts either, so I may have no choice but to compromise in the next 5 years. OT but does anyone know the definitive word on the # of zones incorporated on the new FALD sets from Sony and Toshiba (Vizio purportedly has 384) versus the Sharp Elite? I've seen a lot of discussion, but most of it seems speculative.

The color issue was blown way out of proportion. Kevin Miller calibrated my display himself and told me the exact same thing; never have I looked at my picture and said to myself that the color looks wrong. In fact, just the opposite. It all depends upon the source material, and many people mistake poor source material for poor color accuracy when in fact it's not. Directors have artistic license as well, which allows them to play around a lot, confusing people further. If you want to know how accurate your display is, always ask a reputable professional calibrator; they will tell you the truth. That, or get a known accurate color display for a side-by-side comparison with the exact same source material and judge for yourself.

Yes, I've read that opinion, too, particularly from Ken Ross (that the color coding issue was largely blown out of proportion). Having never seen one, I can only rely upon the impressions of others vicariously. Thanks for weighing in. iMagic (Mark Henninger) claims the Elite has met its match in the new Vizio M series (with better viewing angles to boot), and Kevin Miller had a similar favorable impression. Bearing in mind this is still merely a CES demo, that's the only bright side to this OLED quagmire.

So long as plasma supporters pre-determine LCDs are inferior on very narrow ground, I'm quite sure they'll keep holding up OLED TVs they cannot buy against plasmas they've already bought, while dismissing LCDs....

It's so absurd to read criticisms like "cameras record RGB" to dismiss Sharp when none of the next generation LCDs we're discussing use Quattron. It's absurd to dismiss algorithms that set the backlight when every plasma relies on subfield modulation to create the illusion of grey levels at all... But, hey, PDP proponents have never been a rational lot.

The reality is the best plasmas you could buy had fantastic black levels and really good ANSI. They didn't offer much brightness or exceptional peak whites on mixed content, however. There is going to be what appears to be a battle for picture-quality supremacy on LCD coming. Maybe it produces a clear winner; more likely it produces several choices. The idea that LCD's state of the art hasn't moved is bizarre. The idea that it won't move because plasma is gone is just techno-depressiveness.

The unfortunate thing is that plasma is made out of glass. It seems to put out light at a wide angle efficiently, where LED just puts out a disgusting bright blue light through LC shutters. Since we are so impatient to get to the legendary 2160p/4K, plasma TVs are supposed to follow suit, and we are supposed to believe LCDs aren't cheap to make like OLED plastic, whatever that is. Omega LED, where are they? So as 1080p60 plasma TVs get better, brighter and more reliable, I am supposed to hope for a discount on plasma TVs.

I think it will be very interesting to see just how much further they can push or improve LCD technology. Could it ever get to a point where it's so close to OLED picture quality that you can't tell them apart at a glance? At that point, would there be any reason to produce one given the cost factor? Off-axis viewing would seem to be a major hurdle to overcome if it could ever reach that point. Sharp's IGZO display technology looks interesting as well down the road; great company for innovation, that's for sure.

Hmm, I thought you were a PDP proponent even as recently as 2012. The forward innovation hasn't been so striking since 2011 actually, and the shootout results bear this out (I know, I know, not scientifically sufficient to rule out variables like placebo effect and confirmation bias, but such a panel *did* actually win this competition many moons ago...and in 2014, if there is still no OLED competition on hand, it will win by default).

What would be interesting, if OLED eventually becomes feasible, is if they decide to call it something else. From people hearing "organic" and assuming it's a ploy, to the majority of consumers who automatically assume that because it's "LED" it's all the same.

In reality the average Joe doesn't care if Tim at Best Buy wants to "tell him" how the panel works; he just wants to be "wowed" into purchasing a TV he wants today.

Provide the next step, with a different name that differentiates it from the whole LCD/LED section... Could actually be a pretty substantial move.

I will use this for several of the Elite comments. Regarding zones, a Sharp engineer at the 2012 CES was quoted as saying the 60" had ~300 zones and the 70" ~600. A photo supposedly showing a backlight panel for a 70" for sale on eBay was counted as having 284 zones. The shape of the panel indicated 2 would be required for a complete backlight. As far as I know, no one has done an independent third-party count.

Regarding color accuracy, the cyan defect could only be seen in a side-by-side comparison to an accurate picture. I have a Lumagen Radiance video processor, and thanks to Lumagen's ongoing improvement efforts, their 125-point 3D LUT (free) fixed the cyan issue on my set. What had previously appeared as nice shades of blue are now cyan. After calibration with the 3D LUT, all dE's are less than 2 for both primaries and secondaries, measured at 5% increments from 25% luminance through 100%.
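For reference, the dE figures quoted here are, in their simplest (CIE76) form, just Euclidean distance in L*a*b* space; calibration software generally uses the more elaborate CIEDE2000 formula, but the idea is the same. A minimal sketch, with made-up patch values purely for illustration:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.dist(lab1, lab2)

# hypothetical measured vs. reference values for a cyan patch (illustration only)
reference = (62.0, -30.0, -42.0)
measured = (62.5, -29.2, -41.4)
print(f"dE76 = {delta_e_76(reference, measured):.2f}")
```

A dE under roughly 2 to 3 is generally taken as invisible in normal viewing, which is why "all dE's under 2" counts as a successful calibration.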

I will say something about the Pioneer Kuro Elite. It was a 60 inch diagonal, 900p60p PDP. It had artifacts and needed better hardware. Get a 60 inch Panasonic S60, if you don't believe me. It is 900p too.

Thanks for the correction on the Vizio notation, dsinger. I am more inclined to believe those like yourself who claim the Sharp cyan defect is not visible without an accurate picture to be honest. I'm excited to see how the new models measure up in spite of being surreptitiously labeled as irrational.

Upon what are you basing your prediction? From what I've read around here, probably from Rogo, there is little indication that large-screen displays are going to come anywhere close to "very common". Except for a relatively small subset of enthusiasts, most people aren't interested in screen sizes beyond 42" - 60".

HDTVs in people's living rooms are already significantly larger on average than the CRT sets that preceded them a decade ago.

As folks (the significant others with the decorating authority) become more comfortable with large displays, they'll be "allowed" in more and more living rooms. Plus, as image quality continues to improve and prices continue to fall, the obstacles outside of convincing the other half to agree will diminish as well.

Why wouldn't a sports enthusiast want an 80 inch display in his man-cave if it costs no more than his current 60"?

Large displays will become more and more common. 4K will help it along.

And none of this negates the benefits that OLED and other technologies will introduce once they become commercially viable (and they will, and they will also be 4K and up).

Making 4K into some sort of villain or marketing ploy with no benefit is as invalid as all the arguments we had to endure on this very forum telling us that 1080p was of no benefit and that the human eye couldn't see any detail beyond 720p. Most AVS members are a bit wiser now regarding the benefits of 1080p. Can't we just learn from that past silliness and accept that 4K is a good thing and be happy about it while we push the industry to develop OLED?

Two points. I don't know anybody who has a man cave. I'm sure some certainly do, but they are few and far between. Why is 4K such a good thing when nothing is broadcast in 4K? Actually, what is broadcast in 1080p?

The assumption is that broadcasters will convert to 4K, which I doubt will happen any time soon.

whether there is 4k content or whether the benefits of 4k can be discerned on a non-jumbo screen is really moot at this point; 4k is a fait accompli. look at CES. 4k is here and isn't going away any time soon

look at it this way, aside from a current oled or plasma, what 2k display would you feel compelled to buy this year?

if I'm getting a set this year or next it will be 4k with all the necessary 4k specs, not necessarily because I crave 4k, but if I'm dropping a grand or two, I want to avoid immediate obsolescence

I think it will be very interesting to see just how much further they can push or improve LCD technology. Could it ever get to a point where it's so close to OLED picture quality that you can't tell them apart at a glance? At that point, would there be any reason to produce OLED, given the cost factor? Off-axis viewing would seem to be a major hurdle to overcome if LCD is ever to reach that point. Sharp's IGZO display technology looks interesting as well down the road; great company for innovation, that's for sure.

So I've been questioning this since 2012 actually. When people were very excited about how good the Elite was (and how good LCD could be), it left me -- and others -- wondering about how much room there was "above" that to excite people.

Quote:

Originally Posted by vinnie97

Hmm, I thought you were a PDP proponent even as recently as 2012. The forward innovation hasn't been so striking since 2011 actually, and the shootout results bear this out (I know, I know, not scientifically sufficient to rule out variables like placebo effect and confirmation bias, but such a panel *did* actually win this competition many moons ago...and in 2014, if there is still no OLED competition on hand, it will win by default).

I'm still a PDP fan, I'm just not a technology nostalgist. I own a 2012 plasma. It will be my TV until (a) it breaks or (b) until 2016-17. The reason for that date is both arbitrary (5 years is my personal TV replacement cycle) and not (I believe there'll be something worth buying by then at a price that isn't unreasonable).

One thing I think we are disregarding is that true local dimming was dead and gone. Then (thanks largely to the advent of "direct LED") it came back out of nowhere. Now, it's headed for inexpensive product. That product will put pressure on an entire industry. It will lead to a local-dimming arms race over the coming 24-36 months. The death of a-Si backplanes for high-end product is also in the offing. Whether IGZO/oxide comes to dominate or whether LTPS actually becomes affordable/viable at large sizes is TBD, but either way LCD gets better. Whether quantum-dot films become more common or whether light purity of whites improves, LCD gets better. Do we get IPS panels with the contrast of VA panels? Hard to know, but it's not impossible.

Quote:

Originally Posted by DaViD Boulet

HDTVs in people's living rooms are already significantly larger on average than the CRT sets that preceded them a decade ago.

That's true. But big screens are not significantly larger than projection sets of more than a decade ago. And those used to sell into about 3% of households per year. They weren't especially uncommon.

Quote:

As folks (the significant others with the decorating authority) become more comfortable with large displays, they'll be "allowed" in more and more living rooms.

The evidence doesn't really support this. The increasing urbanization of the population and the increasing popularity of individual screens suggest macro trends that will act against it. That doesn't even account for the general dislike among most women of large screens. And a generation of younger women growing up without TV at all is unlikely to see things radically differently.

There is no difference in HDMI cables. If you can see the picture without visible dropouts or sparklies, the cable is working at 100%. No other cable will display a better version of that picture. You're simply wrong if you think there is a better digital cable than one that is already working. (Oh, and plasma didn't die because of logistics problems, nor does OLED ship in big boxes because it comes from Korea.)

Upon what are you basing your prediction? From what I've read around here, mostly from Rogo, there is little indication that large-screen displays are going to come anywhere close to "very common". Except for a relatively small subset of enthusiasts, most people aren't interested in screen sizes beyond 42" - 60".

I think the overall notion is that screens have been increasing as the cost per inch plummets. People have shifted their notions over time. As X inches becomes more common (no matter what the asymptote is for the rate of increase), X+5 inches always seems less weird.

Quote:

Originally Posted by andy sullivan

Two points. I don't know anybody who has a Man Cave. I'm sure some certainly do, but they are few and far between. And why is 4K such a good thing when nothing is broadcast in 4K? Actually, is anything even broadcast in 1080p?

There are always going to be co-dependent technologies. It never pays for one technology to wait completely for another to catch up when that second technology only needs to exist because of the first. If 4K held off until content was available, it would never arrive at all, because the content only needs to exist to satisfy the needs of 4K users.

Quote:

Originally Posted by rogo

That doesn't even account for the general dislike among most women of large screens.

Among the most amusing things I've ever seen in all of AVS was this concept of being "wifed". This verb is absolutely hysterical in that nearly everyone can relate to it. A guy bought a large TV, set it up completely only to be "wifed" later and forced to return it for a smaller size.

1. Women have something largely referred to as the nesting instinct. They typically are comfortable managing from the walls inward. (Men have a provider instinct, from the walls outward.) An interesting aside to this is that women are largely comfortable managing the cooking.....right up until it becomes a barbecue outside. Then somehow, men feel an evolutionary need to take over. This absolutely applies to the existence of gigantic rectangles within the walls of the home that most of the time are off and look garish. Sorry, but gigantic chunks of electronics are just not pretty.

2. Women have become accustomed to "reining in their men". LOL.... And it's a complete riot that most of us accept as more than half sensible. Because, quite frankly, we'd be spending most of our time blowing @#$% up for fun if we could. I have two young boys, and my wife routinely comments that she's exhausted being "the sole voice of reason". LOL....
