Why Ultra HD won’t be taking the world by storm

Higher definition means higher prices, but there's more to the story.

The four-day orgy of technology-enabled consumerism that is the Consumer Electronics Show has come to a close. One of the wow-inducing technologies on display was Ultra HD—televisions that offer stunning 2160p (3840×2160) resolutions as well as 4320p, which puts us in IMAX territory. All of the major television manufacturers were showing off Ultra HD sets in hopes that consumers will ready their credit cards for an upgrade.

I think those hopes are in vain. And it's not just because Ultra HD sets are insanely expensive, although that's a significant reason. The prices on some of the TVs that were shown off were just as jaw-dropping as the specs. Assuming you had space for an 84" Ultra HD TV, you'd need to cough up $20,000 for the privilege of owning it.

There will be "value priced" Ultra HD sets, of course. Westinghouse will have a lineup of eight sets out in 2013 starting with a 50" model priced at $2,500 and a 55" model for $3,000. That's comparable to where 1080p prices were in 2008, and they'll no doubt drop. But that low-end Ultra HD set still costs five times what a budget 50" 1080p model runs at an outlet like Amazon.

Whether you're going big or small, you'll pay a hefty premium for those extra pixels. The cost will mean the only ones going for Ultra HD sets will be early adopters with deep pockets, at least for the first couple of years.

Even when prices drop, however, most consumers are likely to stand on the sidelines. How many of you have purchased HDTVs in the past four or five years? I've bought three: a 32" 720p set for the bedroom, a 40" LCD in the family room, and a 58" plasma set for the mancave. How anxious are you to upgrade? I'm not. Those televisions represent a significant outlay of cash for me, especially given that my 40" 1080p set was purchased in the fall of 2007. I don't think most people are going to be inclined to walk away from their investments in HDTV just because there's something better-looking in the pipeline, at least as long as their HDTVs are working.

There are also legitimate doubts that many people will even be able to appreciate the full beauty of an Ultra HD image. If you're standing 10 feet away from an 84" Ultra HD TV at CES, you'll no doubt see the difference in image quality. But think about your living room, or whatever place in your house you do your TV watching. Will you actually be seated close enough to the TV to truly appreciate the difference?

A home theater enthusiast worked up some charts showing the distances at which viewers need to sit in order to get the full benefit of the higher resolution. They're really concerned with the difference between 720p and 1080p, but 2160p is included as well. Looking at the chart, you can see that if you're rocking a 50" Ultra HD set, you'll need to be sitting pretty close in order to get the full benefit of the increased resolution. And if you're sitting much more than seven feet away, you're going to have a tough time seeing the difference between Ultra HD and 1080p.
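
Back-of-the-envelope, the math behind those charts is simple. Here's a minimal sketch (in Python) assuming the conventional one-arcminute figure for 20/20 visual acuity that such charts are typically built on; the function and numbers are illustrative, not taken from the enthusiast's chart itself:

```python
import math

def max_benefit_distance(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    """Farthest viewing distance (in inches) at which a viewer with ~20/20
    vision (about one arcminute of resolving power) can still distinguish
    adjacent pixels. Beyond this distance, extra resolution is wasted."""
    w, h = aspect
    screen_width = diagonal_in * w / math.hypot(w, h)  # width of a 16:9 panel
    pixel_pitch = screen_width / horizontal_pixels     # size of one pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute)

for label, pixels in [("1080p", 1920), ("2160p Ultra HD", 3840)]:
    d = max_benefit_distance(50, pixels)
    print(f'50" {label}: individual pixels blend away beyond ~{d / 12:.1f} ft')
# ~6.5 ft for 1080p and ~3.2 ft for Ultra HD, consistent with the
# "much more than seven feet away" figure for a 50" set.
```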

Given TV sizes and room dimensions, 1080p HD content is going to be good enough for most people. There will always be people who want the best possible picture, but for a large segment of the HDTV-owning public, 1080p (or even 1080i) is HD enough—especially when faced with the cost of buying a new set. I've got the Lord of the Rings extended version on 480p DVD and it looks really fantastic on my 58" plasma display. The current generation of game consoles can't handle resolutions higher than 1080p, and upscaled games aren't going to look too hot on a 2160p set.

There's also a scarcity of Ultra HD content to watch, and that's unlikely to change anytime soon. Yes, there are a couple of Ultra HD players available, like the one from RED we reported on. Sony also has a 4K download service in the works. There will no doubt be more coming down the pike, as well. But optical media is on the downswing, so unlike 1080p, which had Blu-ray and HD DVD launching alongside it, there's no corresponding format push from consumer electronics manufacturers.

Consumers who do opt for Ultra HD and buy a REDRAY player or subscribe to Sony's service are going to be faced with another uncomfortable reality: bandwidth caps. As noted in our piece on RED, those Ultra HD movies are huge—a single 100-minute flick will eat up about 15GB of your ISP's monthly data allotment. That's several times the size of a 1080p download from Netflix or the iTunes Store.

Ultra HD content is going to be difficult to find outside of download services, as well. Cable and satellite TV providers currently have to pick and choose which networks they offer in HD due to bandwidth constraints. If you take the 20Mbps data rate for RED's Ultra HD files as a guideline, there's just not enough infrastructure for TV providers to deliver Ultra HD. Getting the data rate down lower will require additional compression, and when you start compressing content, you lose one of the key selling points of the higher-definition format. The pipes just aren't phat enough for Ultra HD programming right now, and that looks unlikely to change in the near term.
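
As a quick sanity check on those numbers, a constant stream at RED's quoted rate over a feature-length film lands right around the 15GB figure cited above (this assumes a constant bitrate, which is a simplification):

```python
# Rough file-size check: bitrate x runtime, converted from megabits to gigabytes.
bitrate_mbps = 20     # RED's quoted Ultra HD data rate
runtime_min = 100     # the 100-minute film used as an example above

size_gb = bitrate_mbps * runtime_min * 60 / 8 / 1000
print(f"~{size_gb:.0f} GB per film")   # ~15 GB
```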

A couple of years ago, 3D was all the rage at CES. There's some content available—3D movies, 3D games, and even a smattering of 3D broadcasts. But people haven't been lining up to buy TVs just for the 3D experience. Last April, Samsung said that sales of 3D TVs were just 30 percent of what the industry had expected. Sales of TVs with some 3D capability are on the rise, but 3D isn't what's driving those purchases—it's often just another item on the feature list.

That doesn't bode well for Ultra HD adoption over the next few years. The new format will be heavily hyped, but a buying public that has purchased HDTVs in the past few years will agree that yes, it's an utterly fantastic picture, and then go back to happily watching movies, sports, and reality TV in 1080p.

Promoted Comments

We've been through this once before in the audio world: having lined their pockets quite nicely by convincing people to re-buy their entire (LP/cassette) music collections on Compact Disc, the labels and the hardware manufacturers immediately tried to start the process all over again by introducing a murderer's row of "high-definition" audio products: HDCD, DVD-A, SACD, blah blah blah. Every last one of them sank without a trace: CD audio was more than good enough, and nobody was stupid enough to fall for the same trick twice.

Ultra-HD is looking like a strong candidate for the same treatment in the video world.

I'm an image quality snob. Given my profession this is hardly surprising I guess. I buy Blu-Rays for the films I care about because I want the higher quality compression they give over downloads (and frankly I like things on discs too). I sit pretty close (6' or so) to a 50" 1080p set, and have no trouble differentiating between 720p and 1080p content.

So all that said, I just can't imagine wanting to go Ultra HD unless I had a dedicated home theater room, with a UHD projector, and a ridiculously large screen to throw the image on. For *TV* watching it just doesn't make any sense to me.

Another thing is that the RED One compression rate for 4K is very suspect.

That is four times the pixel data of 1080p, in half the size of a Blu-ray.

That is an 8-fold increase in compression over Blu-ray, which is near state of the art (it is also a 10:1 increase in compression over the lossy compression RED does in its cameras). This is clearly nowhere near transparent.

15-20GB for 4K is more like the iTunes equivalent of 1080p, not the Blu-ray equivalent.
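
For what it's worth, that 8-fold figure falls straight out of the commenter's own assumptions. A quick sketch, taking ~30GB as a ballpark for a 1080p Blu-ray encode of the same runtime (an assumed figure, not a measured one):

```python
# Relative bits-per-pixel: 4x the pixels stored in half the bytes
# means roughly 8x fewer bits spent per pixel than the Blu-ray encode.
pixels_1080p = 1920 * 1080
pixels_uhd   = 3840 * 2160              # 4x the pixels
size_bluray_gb, size_red_gb = 30, 15    # assumed sizes for the same runtime

compression_increase = (pixels_uhd / pixels_1080p) * (size_bluray_gb / size_red_gb)
print(f"~{compression_increase:.0f}x fewer bits per pixel than Blu-ray")   # ~8x
```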

Seems to me that the real benefit of this tech will be to have multiple HD streams showing at the same time, as well as other windows displayed by your computer, all on the same screen. They are better thought of as big hi-res computer monitors.

There is one important factor left out of the article: how many of those first-gen HDTVs are even capable of connecting to a Blu-ray player (or almost anything else that requires HDMI cables)? Some will partially, some won't at all.

I can't wait to get an 85" Ultra HD TV! I will certainly be close enough to see the difference in my small apartment. I want to plug my HTPC into it to play Battlefield using a monster graphics card. As soon as it hits $5k and has HDMI 2.0, I'm in.

I'm with you. A more logical increment would be going to 1440p. I actually bought one of those cheap monitors off eBay, a 27" Yamakasi Catleap, back before the huge spike in demand, so I got it for around $300 with shipping. I prefer it so much over a similar-sized 1080p monitor.

The average consumer doesn't care about 3D TV, 4K, or OLED. What we really want is content that doesn't suck. I don't even watch TV anymore because it's all crap. I get all of my entertainment from Netflix. How about bringing back Firefly? I'd much rather have more Firefly than a TV that supports 3D, 4K, and OLED combined. Even if such a TV was only $1,000 for a 46" to 52" model.

The physics behind this argument are sound and the points are well written. But I diverge from your opinion once the comparison between 3D and UHD is made. 3D was largely a huge pain in the ass: you have to wear glasses, it has to be viewed at particular angles to get the best effect, and it led to some funky artifacting for some subset of the population trying to watch it. These hurdles are nonexistent for UHD, as it is built into the panel and does not require extra easy-to-lose gadgets that have to be awkwardly worn over prescription glasses.

3D didn't stand a chance from the get-go without built-in parallax, which never worked out due to costs and viewing angles. UHD, on the other hand, is simply where the industry is heading. Sure, the basic supporting technologies have to catch up, but that is why these TVs are cutting edge, in the same way that software and hardware are still catching up to "retina display" mobile technologies.

The chart provided is about whether a resolution change is noticeable, but probably under similar conditions, such as the sharpness being set the same. There are two things that a UHD set does to improve HD playback.

One, one of the "lies" of having a 1080p set that matches 1080p content is that almost all video is overscanned 3-5% to eliminate annoying production mistakes such as a boom mic dipping into the picture, offsets in editing, etc. These small scaling ratios can cause noticeable artifacts and result in either a softer picture or edges with visible overshoot (ringing). Picture quality will actually improve if the scaling ratio is higher.

Two, getting back to the resolution chart: it's not as if our eyes can see one spatial frequency and then, at a slightly higher frequency, see nothing at all. The roll-off is gradual. So if you have a set where you can't see the pixels, you can boost the very high frequencies, and even though up close you see ringing and artifacts, the eye blurs them out at typical viewing distances; you end up with a picture that is significantly better. The issue is how to create those very high frequencies. Almost all of the sets I saw at CES, at least from Korean and Japanese companies, were using some sort of single-frame super-resolution technology as part of the solution. Think of it as a simplified version of fractal scaling. It is very effective at creating the extra detail required for the pre-equalization your eye needs.
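
As a very loose illustration of the "create high frequencies and let the eye blur out the side effects" idea, here is a toy sketch using Pillow. Real single-frame super-resolution in TV hardware is far more sophisticated; the file names are placeholders and the filter settings are arbitrary:

```python
# Toy stand-in for what a UHD set's scaler does: upscale a 1080p frame to
# 2160p, then boost high frequencies with an unsharp mask. Up close the
# ringing is visible; at normal viewing distances the eye blurs it out.
from PIL import Image, ImageFilter

frame = Image.open("frame_1080p.png")          # placeholder 1920x1080 source
upscaled = frame.resize((3840, 2160))          # bicubic resampling by default
sharpened = upscaled.filter(ImageFilter.UnsharpMask(radius=2, percent=120))
sharpened.save("frame_2160p_boosted.png")
```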

If you really want a big screen, the Sony UHD front projector is very impressive!

I bought a 42" plasma in 2005 knowing full well that after the newness wore off, content was going to carry the day. If I enjoy the program / event that I'm watching, I'm more focused on that aspect of the experience rather than what it looks like.

Back in 2006 and 2007, especially in the Blu-Ray vs. HD-DVD vs. DVD and PS3 vs. 360 threads all over the Internet, some were arguing that 1080p displays would never achieve significant market penetration, because they were overkill, because the ATSC standard didn't support 1080p broadcasts, because high-def discs were too expensive and downloads were the future, etc.

1080p sets did command a premium back then but they quickly dropped in price.

I believe DirecTV has announced they will support UHDTV, and I'm sure the cable companies eventually will as well. Studios will milk DVD and Blu-ray sales for as long as they can, and when sales start to tail off, they will support whatever 4K standard comes out.

3D is a gimmick but if UHDTV demos well at showrooms, people will eventually upgrade.

I used a similar viewing chart to determine where to put the seats in my basement home theater, which is why I sit 9' away from my 65" 1080p TV. To get the benefit of a UHD tv, I'd have to sit 3' away from it. (Maybe that's what we'll all be doing in the future, but right now, the idea seems ludicrous...).

I like the 4K idea. I don't have interest in it currently as I sit 12 feet away, but it is certainly more interesting to me than 3D was. I don't know about other Ars readers out there, but my cable company still has issues pushing 1080i without artifacts. Asking them to output 2160p (8x the data) is just so far out there right now with the current infrastructure/bandwidth that it just isn't worth thinking about for the next 5+ years.
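
The 8x figure does check out on raw pixel throughput, at least; whether a compressed 2160p stream actually needs eight times the bandwidth of 1080i is another matter. A quick sketch, assuming 60 fields or frames per second:

```python
# 1080i delivers half its lines per field, so per second Ultra HD pushes
# 8x the raw pixels (compression efficiency aside).
px_per_sec_1080i = 1920 * 540 * 60      # 60 interlaced fields per second
px_per_sec_2160p = 3840 * 2160 * 60     # 60 progressive frames per second
print(px_per_sec_2160p / px_per_sec_1080i)   # 8.0
```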

This article really feels like yet another boring entry in the standard filler genre, stretching back for decades, of "stating the blindingly obvious."

Quote:

Why Ultra HD won't be taking the world by storm

When was the last technology that "took the world by storm" by your definition, Eric? My recollection is that "ultra crazy expensive for the first few years, and eventually becomes the norm many years on" is the rule, not the exception. I can remember when CD equipment was $1000+ (unadjusted for inflation), DVDs were ultra limited, etc. The first public HDTV broadcast in the US was in 1996. The ATSC HDTV launch was 1998, and early sets started to become available around then. HDTV didn't actually catch on and become the standard for a long time after that. The nature of semiconductor and mass manufacturing practically makes this pattern of development an economic law. It always, always takes time to ramp up, to figure out optimizations, increase volume, decrease costs, etc. Every single time, every single standard, be it new display tech or new consoles or new codecs. At the same time, the trends tend to be pretty clear. How many people reading this are now using SSDs, or considering one? How about in 2008, nearly 5 years ago, when Intel introduced the 80GB X25-M with an MSRP of $600? How about another 7 months earlier, when the 64GB Samsung MCCOE64GEMPP was a $1000 upgrade?

So yeah, at $20k or even $5k adoption will be low. It doesn't provide the leap that 480 to 720 or 1080 was. What a scoop.

Quote:

A home theater enthusiast worked up some charts showing the distances at which viewers need to sit in order to get the full benefit of the higher resolution. They're really concerned with the difference between 720p and 1080p, but 2160p is included as well. Looking at the chart, you can see that if you're rocking a 50" Ultra HD set, you'll need to be sitting pretty close in order to get the full benefit of the increased resolution. And if you're sitting much more than seven feet away, you're going to have a tough time seeing the difference between Ultra HD and 1080p.

While the overall point doesn't change dramatically when it comes to a lot of typical television content, charts like that tend to get overused and applied everywhere as the last word, so it's worth pointing out that human vision is a lot more complicated than simply saying "one arc minute is it." Even ignoring people with better than 20/20 vision at distance, vision is also a product of the human brain, which does serious pattern recognition and interpretation that can in turn affect results. One good illustration of this comes courtesy of TUAW. This tends to apply more to computers than to video, but even on televisions it's not as if there's no content that's affected. Assuming prices drop as they tend to do, this is still useful technology, and it'll be helpful in driving down costs and creating content for HiDPI systems elsewhere too.

I think tech pundits in general should be careful about too smugly pooh-poohing new technology just for being new. We can acknowledge that it'll be many years before it enters the general world, and that there can be hiccups along the way. But that doesn't mean it needs to be stepped on or cynicism has to be turned up to 11, particularly when we're talking about a standard that can in turn apply to many different technology paths (the best kind of standard). Just as 480 content made before the age of the transistor can still be watched and appreciated on a plasma screen, Ultra HD content will be there for retinal imaging displays that don't exist yet. There are something on the order of 7 million cones and 120 million rods in the human eye. As we approach one element to receptor yeah, we'll have run into another fundamental limit and can probably chalk that one off as "done" (at least for regular humans), but 8-bit 1080p is quite a ways off from that.

Much of the article sounds similar to the whining we had about 1080p a couple of years ago. Yes, it will take a long time to become the standard.

Surprise.

By now all TVs are 1080p, Blu-rays have a good chunk of the market, and non-HD television channels feel weird when I watch them. And in the meantime, Sony et al. will take a couple of grand from people who have too much money.

1080p was chosen precisely because higher resolutions aren't noticeable when you are watching it as a "TV." However, higher resolutions are useful when you are watching it as IMAX (where the screen bounds go to or past your sight bounds) or when the TV is used as a presentation display.

UHD will be fantastic for display boards for walk around presentations.

And 3D is different. It by and large sucks. You need glasses, and if it weren't for the gimmicky factor of seeing a face pop out of the TV once in a while, I would say that non-3D actually looks much better and more immersive because it's so much sharper. The only movie where 3D really was amazing was Avatar; not sure what the good Titanic director did there, but after seeing it I briefly thought 3D was the future. And then I saw a lot of movies where 3D sucked, in the cinema and even more so at home.

Quote:

Another thing is that the RED One compression rate for 4K is very suspect.

That is four times the pixel data of 1080p, in half the size of a Blu-ray.

That is an 8-fold increase in compression over Blu-ray, which is near state of the art (it is also a 10:1 increase in compression over the lossy compression RED does in its cameras). This is clearly nowhere near transparent.

15-20GB for 4K is more like the iTunes equivalent of 1080p, not the Blu-ray equivalent.

Kind of irritating that Ars talked so much about RED (that other article/press release was just embarrassingly terrible) rather than HEVC, but double the data rate for similar quality seems about right, and as with all lossy compression there is no one right answer; quality-to-data-rate tradeoffs can be made to whatever the situation requires. With good H.264 encoding, 8-12 Mbps is very high quality for most content, and naturally streaming stuff tends to be much lower. HEVC has demoed 50% gains, though it's still way too early in the encoder dev process to say for sure how it will all shake out in the end. Taking that as a baseline, though, and being conservative with 4K being a 4x increase in data rate (it shouldn't actually scale that badly in most circumstances), 16-22 Mbps should be doable for high-quality 4K content. 15-20GB doesn't seem outside the realm of the reasonable.
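
Re-running that estimate under the same assumptions (good 1080p H.264 at 8-12Mbps, a conservative 4x scaling to 4K, HEVC halving the bitrate) lands in the same ballpark; the numbers below are illustrative rather than measured:

```python
# 4K bitrate and file-size estimate from a 1080p H.264 baseline.
for h264_1080p_mbps in (8, 12):
    hevc_4k_mbps = h264_1080p_mbps * 4 * 0.5        # 4x pixels, 50% HEVC gain
    size_gb = hevc_4k_mbps * 100 * 60 / 8 / 1000    # 100-minute feature
    print(f"{hevc_4k_mbps:.0f} Mbps -> ~{size_gb:.0f} GB per film")
# 16 Mbps -> ~12 GB and 24 Mbps -> ~18 GB, right around the 15-20GB above.
```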

I'll note too that Blu-ray is absolutely not state of the art, far from it sadly. It is highly common to see really terrible Blu-ray encodes. No one uses Hi10p, etc etc.

A lot of these same arguments were made for 1080p back when it was new (lack of content, can't tell the difference vs. 720p, etc.), but give the market 5-10 (or possibly more) years, just like HD took to catch on. The first HD broadcast in the US was in 1996. It took several years after that for HD to become commonplace in people's homes. We haven't even seen that same milestone for UHD yet, so why bash it?

40" TV - 31" (this is just about my normal desktop monitor viewing distance).55" TV - 43" (Under 4feet seems just a tad close to sit to a 55" screen)85" TV - 66" (Again 5.5 feet seems rather close to an 85")

I don't see anything but a small fringe of people wanting to sit that close to screens of this size.

How close we sit to our screens is seldom driven by resolution anymore, but by room and ergonomic factors, which really preclude any real necessity for much higher screen resolution.

The relationship continues to apply to almost any size of screen: as screens get bigger, we just keep moving further away (50-foot movie screens are mostly using 2K/HD projectors).

There will be some small advantage for people who sit very close, but this is deep into diminishing returns. Most digital movies have also been shot with 2K cameras; 4K cameras are very recent. It is not the kind of thing you would really notice beyond one minute into the movie.

The real main advantage will be showing up your friends by having more pixels than them.

Quote:

A lot of these same arguments were made for 1080p back when it was new (lack of content, can't tell the difference vs. 720p, etc.), but give the market 5-10 (or possibly more) years, just like HD took to catch on. The first HD broadcast in the US was in 1996. It took several years after that for HD to become commonplace in people's homes. We haven't even seen that same milestone for UHD yet, so why bash it?

Actually no, the arguments were not the same.

Last time, the difference was between HDTV 1920x1080 vs SDTV ~704x480.

There was nearly zero argument that you couldn't actually see the difference. Just that some people claimed they really didn't care.

This time there is a fairly strong argument that in a normal living room the extra resolution is irrelevant.

This is closer to the argument that happened when Super Audio CD (SACD) came out, which was that CD was essentially already perfect for home use.

Why won't TVs just become bigger, then? TV sizes are limited by space and by cost, but also by the quality available. No one would buy a 150" 720p or 480p TV because it would have looked really terrible to watch, so they were rarely made. Once you get over 80-90", there's a half-decent argument over whether 2K is still a reasonable standard to watch, and I imagine as technology costs come down, that's where we'll go. I also can't stand the look of 1080p 27" monitors; going with a higher resolution, and actually having content that works on one, seems to make sense to me.

And consumers are still idiots; there's very little reason to buy a 42" 1080p screen, but they certainly still exist, because they look incredible when you're looking at them on the shelf from three feet away.

From a home theatre perspective, I don't think I'll be upgrading to 2K anytime soon. As someone who last upgraded 3 years ago with this thing called "a budget" in mind, I looked at the comparable 1080p and 720p projectors and went with 720p; I could get better colours and blacks in a 720p projector than in a comparable 1080p projector that cost twice as much. All this is likely to do for me is drive down the price of 1080p projectors and phase out 720p ones.

I think the more important part of this is that 4K TVs mean that 4K computer monitors will hopefully become more common and mass-produced. A 3840×2160 24" monitor would be a joy to behold. It's high time desktop monitors moved past 90-ish PPI.

On a related note, here is an interesting article about 4K gaming performance/quality on a GTX 680.

15 years ago, a 42" plasma screen would have cost nearly $20,000 too. Once manufacturing scales, 4K technology will slowly become the mainstream too, and at comparable prices to today's screens at the same size.

UltraHD won't get me to upgrade from my 5 year-old 50" 1080p plasma any more than 3D did. OLED will, when it becomes more affordable. The OLED TV will also have 3D, so I'll get that by default, but it's unlikely to be UltraHD.