Ultra HD and OLED TVs will blow your mind, but are they worth it?

As predicted, OLED and Ultra HD televisions dominated headlines at CES 2013. Most major manufacturers – and even a few newcomers – had at least one large-screen Ultra HDTV to show off, with the promise of several more size options coming soon. And OLED – sweet, beautiful OLED – made its strongest showing at CES yet, with LG promising US shipments of its 55-inch model in March, and both Samsung and LG surprising show-goers with curved OLED displays. Not to be outdone, Sony and Panasonic went so far as to blend both technologies into amazing hybrids, each teasing its own 56-inch Ultra HD OLED prototype.

Beholding the spectacle that is the future of televisions was quite breathtaking. And we will admit that, while on the ground in Las Vegas, we enjoyed indulging in our fair share of the buzz surrounding all the eye candy. But now that our torrid, week-long technology lovefest is over, and we are once again confronted with this funny thing called “real life,” we find ourselves taking a more logical and practical view of Ultra HD and OLED. Where should ordinary consumers stand? Our reasoned conclusion: Don’t bother buying either one yet, because they just aren’t ready for prime time. Here’s why.

Ultra HD

Ultra HD (or, if you’re stubborn like Sony: 4K) refers to a 3840 x 2160 resolution – four times the pixels of 1080p HDTV. Though there is some debate over how valuable the heightened pixel density is in screen sizes under, say, 60 inches, the issue we have with Ultra HD doesn’t have anything to do with that. Our problem is with a conspicuous lack of native Ultra HD content.
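For the arithmetically curious, the “four times” figure is easy to verify yourself – Ultra HD doubles the pixel count in each dimension, which quadruples the total:

```python
# Pixel-count comparison: 1080p vs. Ultra HD ("4K").
full_hd = 1920 * 1080   # 2,073,600 pixels
ultra_hd = 3840 * 2160  # 8,294,400 pixels

print(ultra_hd // full_hd)  # → 4
```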

When high-definition televisions first became available in 1998, we had the same problem: no HD content. At the time, the first Blu-ray players were eight years away, which meant that HD content would need to come from broadcast TV, or cable and satellite providers. Though there was a smattering of localized, over-the-air HD broadcasts between 1998 and 2000, it wasn’t until January 30, 2000 that we saw the first major sporting event (Super Bowl XXXIV) televised nationwide in HD. Furthermore, it was not until 2002 that satellite providers Dish Network and DirecTV began carrying HD programming. Finally, cable companies caught up and started carrying HD content in 2003.

Looking back, it took about four years just for HD programming to get a foothold, and even longer before we were able to get true HDTV programming in any kind of quantity. Can we expect content providers to adopt Ultra HD any faster this time around? We think not. There are quite a few new challenges to meet, and it’s going to take some time to deal with them.

We will concede that upconverted 1080p content looks better on Ultra HD TVs, especially on screens larger than 70 inches. But, as we see it, the difference is not compelling enough to justify the lofty prices these sets are commanding. Speaking of lofty prices, we should note that Sony is the only manufacturer with a Band-Aid solution for delivering native 4K content, but it will cost you about $25,000 to get the TV, and even more to keep the content flowing.

There is some encouraging news to consider: Eutelsat Communications in Europe launched a dedicated Ultra HD station, which began broadcasting on January 8. And Sony says it is leveraging its expertise in 4K to begin offering content via a media server this summer.

The bottom line is this: If there is no Ultra HD content to look at on a TV that costs upwards of $8,000, then the TV is not worth $8,000 – not yet, anyway. Let’s just hold off for a couple of years and see where we are then. In the meantime, we’ll let the industry figure out what it needs to do to get us Ultra HD in our homes while we enjoy our perfectly beautiful 1080p TVs, Blu-ray discs, and Netflix SuperHD.

OLED

While Ultra HD TVs are awesome to behold from a resolution standpoint, OLED tech takes the prize when it comes to picture quality. Never mind that OLED TVs can be constructed to be thinner than a pencil (though there’s plenty of cool factor there, too); what excites us are the most amazing black levels, contrast, color, and brightness we’ve ever seen from anything other than real life.

The good news: You can buy one as early as March 2013 (assuming LG comes through as promised). The bad news: You probably shouldn’t. What do we know about OLED that you don’t? Nothing. It’s what we don’t know that has us concerned. Specifically, we don’t know how long OLED TVs will last.

Sony introduced the world’s first OLED TV – the XEL-1 – in October 2007. But that was over five years ago. Where has OLED been all this time? In development.

Scaling OLED display sizes up to a level that would resonate well with early adopters (read: big enough to keep people from laughing at the stratospheric price) has apparently been a challenge. Clearly, manufacturers have figured out a way to do that, as we’re seeing 55-inch models going into production. But there are other challenges that exist with OLED that we can’t confirm have been successfully dealt with.

As the name implies, OLED (organic light emitting diode) technology uses organic material to create light. Different organic materials are used to create red, green, and blue pixels. The problem is with those pesky blue pixels. Reportedly, the material used to create blue OLED pixels has had a short operational lifespan. Recent studies have shown blue OLED efficiency ratings at around 4-6 percent, while red and green OLEDs come in around 19-20 percent. Other research has shown older blue OLEDs down in brightness by 12 percent after 1,000 hours of testing, while red and green fared better, fading only 7 and 8 percent, respectively.

That may not seem so bad at first, but consider this: It is estimated that the average household TV in the US is on for about 6 hours and 47 minutes a day. At that rate, a TV racks up 1,000 hours of use in under five months. So even if blue OLED brightness were to fade by just 6 percent per 1,000 hours, it would be down by 50 percent in less than 4 years.
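If you want to check that projection yourself, the back-of-the-envelope math looks like this (assuming, as a simple worst case, that the fade is linear at 6 percent of brightness per 1,000 hours):

```python
# Back-of-the-envelope check, assuming linear fade:
# 6% of brightness lost per 1,000 hours of viewing.
hours_per_day = 6 + 47 / 60        # 6 hours 47 minutes ≈ 6.78 hours
fade_per_1000h = 0.06              # hypothetical 6% per 1,000 hours

hours_to_half = 1000 * (0.50 / fade_per_1000h)        # ≈ 8,333 hours
years_to_half = hours_to_half / (hours_per_day * 365)

print(round(years_to_half, 1))  # → 3.4
```

About 3.4 years at average viewing habits – which is where the “less than 4 years” figure comes from.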

The difference in degradation spells bad news for OLED picture quality in the long term. Since red and green pixels have historically degraded much more slowly than blue, serious color-balance issues are inevitable. If a third of the color palette fades considerably faster than the rest, it won’t take long for colors to start looking funny. Sure, you can make adjustments to try to keep things dialed in, but no consumer is going to want to re-calibrate their TV every few months, let alone pay someone to do it.

We can hope that OLED manufacturers have managed to address the blue OLED issue, but we have not been able to confirm with any of our sources that they have been successful. Until independent, long-term testing of the latest OLED TVs can be performed, we can’t be certain.

It rarely pays to be an early adopter, but in the case of Ultra HD and OLED, the cost is simply too high. We want to see proof that OLED can stand the test of time, and we want to see Ultra HD content rolling out at a decent clip before we can be swayed to jump on board with either technology.