Sony’s 4K UltraHD TVs plummet in price, but content still scarce

Fancy LCDs drop in price equivalence from "nice new car" to "crappy new car."

We saw plenty of UltraHD "4K" TVs at CES—that is, TVs with a vertical resolution of 2160 pixels and a horizontal resolution that varies a bit depending on manufacturer or format—but they all had one thing in common: a price tag roughly equivalent to that of a mid-size car. This isn't totally without precedent; when plasma TVs were first entering the consumer space 15-ish years ago, they too were priced ludicrously high. However, according to CNET, this morning Sony announced much lower pricing on two LED-powered Ultra HD TVs: $4,999 for its 55-inch XBR-55X900A model, and $6,999 for its 65-inch XBR-65X900A model.

The TVs will be available to order starting on April 21, but they are actually on display now at Sony stores in cities across the country, including New York, Las Vegas, Houston, and several places in California. According to Sony Electronics Home Division Vice President Mike Lucas, the TVs will be delivered to customers "just before the summer."

The general consensus among the Ars staff at CES was that while Ultra HD TVs certainly looked nice—especially the ones shown off in closed booths with carefully controlled lighting and viewing distances and content—they weren't jaw-droppingly overwhelming. The jump from standard-definition to HD content was very noticeable, but the jump from 1080p to 2160p doesn't appear to carry nearly the same visual wallop.

That leads to one of UltraHD's current biggest failings. Once you've got this great high-resolution TV—what are you supposed to watch on it? "Standard" over-the-air HD or HD satellite and cable will look fine, of course, as will Blu-rays (or upscaled DVDs), but they're only going to be delivering a quarter of the pixels you could be seeing in native 2160p resolution. Where's the 4K content?

There are a number of methods to get 4K video playing on your new 4K TV, but Sony is hoping that you'll spend $699 on its FMP-X1 4K media player, which provides content specifically mastered for UltraHD's 2160p resolution. Currently the player comes preloaded with a collection of films that could best be called eclectic: Bad Teacher, Battle: Los Angeles, The Bridge on the River Kwai, The Karate Kid (the crappy new one, not the good one I grew up with), Salt, Taxi Driver, That's My Boy, The Amazing Spider-Man, The Other Guys, and Total Recall (again, the crappy new one and not the hilariously bad 1990s one). Out of those, I'd certainly love to see The Bridge on the River Kwai in UltraHD—and without looking it up I'd bet it's one of the few films on the list shot on actual for-real film—and most of the others promise to be at least, well, loud.

Additional content will be available in fall 2013, via "a fee-based video distribution service offering a library of 4K titles from Sony Pictures Entertainment and other notable production houses." Note that this is going to be Sony's fee-based video distribution service; at least for now, Netflix and other streaming services appear to be shut out of the 4K game.

There are some competitors, like high-end digital cinema company RED, which has for years planned to sell a competing UltraHD movie player under the name REDRay. REDRay sounds impressive; it would use an advanced compression scheme to cram a full-length UltraHD movie down to a 15GB file with a data rate of about 20Mbps, which would make it streamable over many consumers' high-speed Internet connections. The only problem with REDRay right now is that although it's been "coming soon" since at least 2008, the $1,450 player still isn't available.
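The file size and bitrate above are RED's claims, but the implied runtime is easy to sanity-check; a quick sketch (decimal gigabytes assumed, as marketers use):

```python
def playback_minutes(file_gb: float, mbps: float) -> float:
    """Minutes of video a file holds at a given average bitrate."""
    megabits = file_gb * 1000 * 8   # GB -> megabits (decimal units)
    return megabits / mbps / 60     # megabits / Mbps = seconds; -> minutes

# RED's claimed numbers: 15 GB at ~20 Mbps
print(playback_minutes(15, 20))  # -> 100.0 minutes, a full-length feature
```

So the two claims are at least self-consistent: 15GB at 20Mbps works out to about 100 minutes, right around feature length.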

The REDRay home media player promises to deliver 2160p content to your UltraHD TV... when you can eventually buy it.

RED

So be careful if you ride that early adopter wave and bring 2160p into your living room. As with each new TV and movie technology that's been introduced over the past couple of decades, you won't have much to watch on it other than fancy demo content for the foreseeable future.

Lee Hutchinson
Lee is the Senior Technology Editor at Ars and oversees gadget, automotive, IT, and culture content. He also knows stuff about enterprise storage, security, and manned space flight. Lee is based in Houston, TX. Email: lee.hutchinson@arstechnica.com // Twitter: @Lee_Ars

For those looking to jump to 4K, several sets have gotten affordable lately. Toshiba has one for ~$4650 USD and I found a Sharp monitor for ~$4480.

I'm hoping that prices drop further; I hope to replace a triple 1920 x 1080 Eyefinity monitor setup with a single display. The 3240 x 1920 virtual resolution I get from those three monitors is nice, but there are the bezels. I'm eager to ditch the bezels, but dropping nearly $5k USD to get rid of them is a bit much. In comparison, those three monitors cost me less than $1000 several years ago, and they were nice IPS panels.

Honestly, I think TVs are seeing the same problem that's facing the PC industry: what we have is good enough.

Seriously. I built my own computer last year and the only reason I spent money on it was because I actually like the project. I'll never buy another desktop again. The laptops I have? Good enough to last for a long time. The only thing that could convince me to spend money is something that is intrinsically better. A combination laptop/tablet would be something I would buy...if it's actually better than the product it's replacing (in this case, a 2011 MacBook Air).

3D isn't inherently "better." 4K could be, but at range, the visual clarity above 1080p isn't really there. You have to be close to see all that detail... and people don't sit that close to their TVs. So there isn't much "better" to sell people on, other than stuff like the ridiculous race over hertz (look, a 120 Hz TV! No wait! Ours is 240! Check out our 480 Hz TV! Oh boy!), and that gets stagnant quickly.

But the truth is that most people don't need it. Just like a new laptop. A lot of companies are looking around, blaming everyone else, and trying to prognosticate why, oh why, sales are dropping. Simple: because we've advanced so quickly that there are few great "leaps" left to make.

Now, of course, we'll get to the next level and end up with great stuff. But this is the lull, and honestly? The new stuff just isn't so much better that it requires buying every time.

One of the huge problems with this technology is that you need to get a relatively huge screen, and then sit rather close to it, to see the benefit. So you have to find the combination of people with enormous amounts of disposable income and small living rooms. Maybe city dwellers?

I hope it doesn't die out entirely though. I want panel manufacturers to take those displays, shrink them down a bit, and stick them in computer monitors. The initial switch to flatscreens caused a really noticeable stagnation in pixel densities in the latter half of the 2000s, and we're only now starting to see higher resolutions commonly again.

Resolution is kind of hitting the point of diminishing returns in a big way... at least for the sizes of TV most people will have. I'll just keep holding my breath that OLED will mature enough to replace the current LED backlight at a more affordable price than it's been. Better (non-dynamic) contrast and more even blacks will win me over...

55" screen at normal living room distance? Can't imagine there is a lot of difference between HD and 4k. I would think that 4k would be noticeable on a 15' projected screen at living room distance. Seems like 4k would be better in projectors?

You and me both! I mean, $350-650 isn't terrible, but it's a pretty big jump from nice 1080 prices.

Of course native 4K content will be slow in coming, but with upscaling baked into the TVs (at least the Sony ones have it built in), a Blu-ray collection will see a little bump. Also, while you'd have to be pretty close to an 84" 4K TV to appreciate the full benefit of 4K vs. 1080, you do notice the improvement from reasonable distances.

For those who say 4K is a waste, or that it's ridiculous to spend so much: increased pixel density doesn't hurt you at all, and you can feel good about saving so much money on a 1080 set. But just because you aren't the target market for a 4K TV doesn't mean there isn't a market for them.

For those looking to jump to 4K, several sets have gotten affordable lately. Toshiba has one for ~$4650 USD and I found a Sharp monitor for ~$4480.

I'm hoping that prices drop further; I hope to replace a triple 1920 x 1080 Eyefinity monitor setup with a single display. The 3240 x 1920 virtual resolution I get from those three monitors is nice, but there are the bezels. I'm eager to ditch the bezels, but dropping nearly $5k USD to get rid of them is a bit much. In comparison, those three monitors cost me less than $1000 several years ago, and they were nice IPS panels.

Getting closer, and the Sharp is the first one I've seen in a high-end computer monitor resolution, but still a bit more than I'm willing to spend. The first 30-35" 4K panel to hit the $2k mark will probably consume my tech-toy budget for the year it comes out.

These TVs need to come down in size next. No way in hell I'm putting a 55" TV in my living room. Also, Comcast doesn't even offer 1080P HD service yet, and the manufacturers are already pushing new standards? Let's meet my current TV's (already antiquated) standard, first. Then we can talk about upgrades.

55" screen at normal living room distance? Can't imagine there is a lot of difference between HD and 4k. I would think that 4k would be noticeable on a 15' projected screen at living room distance. Seems like 4k would be better in projectors?

It's for normal visual acuity (20/20), so it's fairly accurate. For a 55" set, you'd start to tell the difference between 1080 and 4K around 7.5 feet away. For Sony's 84" set, you'd start to tell the difference around 11 feet.

Many people are content with TVs far from the ideal viewing distance, but home theater enthusiasts often use the THX standard for viewing distance (screen diagonal divided by 0.84), which puts you well within the range of appreciating the difference between 1080 and 4K. (I.e., 84" / 0.84 = 100" = 8.3 feet.)
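The THX arithmetic above generalizes to any screen size; a quick sketch of the recommended distance, in feet:

```python
# THX-recommended viewing distance: screen diagonal divided by 0.84,
# converted from inches to feet.
def thx_distance_feet(diagonal_inches: float) -> float:
    return diagonal_inches / 0.84 / 12  # inches -> feet

print(round(thx_distance_feet(84), 1))  # -> 8.3 ft for an 84" set
print(round(thx_distance_feet(55), 1))  # -> 5.5 ft for a 55" set
```

At those distances, both sets sit inside the range where 4K resolves more detail than 1080p for a viewer with 20/20 vision.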

Like the article says, the difference isn't as significant as SD to HD was, but it's not negligible. I'd imagine a much slower adoption than 1080, but I'm glad 4k is out there already.

"and Total Recall (again, the crappy new one and not the hilariously bad 1990s one)"

hilariously bad? I do agree that the new one was really really bad... But the 90s one?

Dude, the 1990s Total Recall was both hilarious and awful. It is an awful, awful movie. It's an awesome bad movie, definitely, and one that's in my top 5 Schwarzenegger films (a list that would also include Commando, Predator, T2, and Conan the Barbarian), but there is no conceivable universe where it's anything approaching a good movie.

Compare the new TR, which is just plain bad, without the hilarious honesty in the first movie.

4K is okay, but everything I'm hearing is the real jaw-dropping transition happens when you start looking at 8K. My gut feeling is we don't need 4K, but need to jump ahead right to 8K.

I think it really comes down to screen size and use. 4K is mostly going to be wasted on TVs. For 20"-class monitors, 4K gives >200 DPI, which will minimize visible pixels and greatly reduce the need for AA in gaming, while offering an integer scaling mode to make legacy software mostly work out of the box. For the 30" class, though, you really can't do the same without at least a "5K" (5120x2880) resolution; I wouldn't be surprised if that resolution gets jumped over in order to go straight to 8K, though.
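The density figures above are easy to reproduce; a small sketch, assuming 16:9 panels:

```python
import math

# Pixels per inch for a panel: pixel diagonal divided by physical diagonal.
def ppi(h_pixels: int, v_pixels: int, diagonal_inches: float) -> float:
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

print(round(ppi(3840, 2160, 20)))  # -> 220: a 20" 4K panel clears 200 PPI
print(round(ppi(3840, 2160, 30)))  # -> 147: why 30" needs more than 4K
print(round(ppi(5120, 2880, 30)))  # -> 196: "5K" gets a 30" panel close
```

The same function shows why the comment singles out 5120x2880 for 30-inch panels: it's roughly what it takes to get that size back near the 200 PPI mark.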

Honestly, I think TVs are seeing the same problem that's facing the PC industry: what we have is good enough.

That's pretty much it.

480i dominated for six decades, and only got supplanted because TVs got much, much bigger. As impressive as 4K is, I don't see anyone, outside of technophiles, itching to get it. Most people would rather give "HD" a few more decades.

The main thing that needs improvement is content delivery. Lots of cable channels are still in SD, and the ones that are in HD suffer from horrible compression at times. Internet streaming needs to be more reliable, too. Having high-quality 1080p content everywhere would mean way more to me than 4K.

Does anybody know how 1080p content looks on one of these things? Are we going to see the same artifacting we saw with non-upscaled DVDs on HDTVs?

Probably not, or at least much less severely. DVD is typically 480p, which doesn't scale linearly to HD resolution, so some degree of artifacting is inescapable; 1080p to 4K, on the other hand, scales as a simple pixel doubling in each direction. The simple and artifact-free option is to map each input pixel to a quad of output pixels, at the cost of losing any higher-DPI benefit. Interpolating the intermediate pixels instead would exploit the full 4K grid, still with much less artifacting than upscaled DVD content.
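The "quad" mapping described above is just nearest-neighbor doubling; a minimal sketch, treating a frame as a list of rows of pixel values:

```python
# Each input pixel becomes a 2x2 block of identical output pixels.
# No new pixel values are invented, so no scaling artifacts appear.
def double_pixels(frame):
    """frame: list of rows, each row a list of pixel values."""
    out = []
    for row in frame:
        doubled = [px for px in row for _ in (0, 1)]  # repeat each column
        out.append(doubled)
        out.append(list(doubled))                     # repeat each row
    return out

small = [[1, 2],
         [3, 4]]
print(double_pixels(small))
# -> [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

A 1920x1080 frame run through this becomes exactly 3840x2160, which is why 1080p looks clean on a 4K panel in a way 480p never could on a 1080p one.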

I think it really comes down to screen size and use. 4K is mostly going to be wasted on TVs. For 20"-class monitors, 4K gives >200 DPI, which will minimize visible pixels and greatly reduce the need for AA in gaming, while offering an integer scaling mode to make legacy software mostly work out of the box. For the 30" class, though, you really can't do the same without at least a "5K" (5120x2880) resolution; I wouldn't be surprised if that resolution gets jumped over in order to go straight to 8K, though.

I love technology, so I can't knock Sony for bringing this tech to market. However, with all the money they are losing each year, maybe it would be more beneficial to concentrate on bringing the prices down on their existing televisions and making them more competitive with Samsung or LG.

As the BBC demonstrated during the Olympics (and as Engadget reported at the time), ignoring all the technical hurdles, 8K scales to cinema-sized screens without effort. It is utterly, completely, incomprehensibly pointless in a living room TV.

Now, in a business setting, combined with a touchscreen for presentations, realtime network analysis, or code design? Give it to me. Give it to me now.

"and Total Recall (again, the crappy new one and not the hilariously bad 1990s one)"

hilariously bad? I do agree that the new one was really really bad... But the 90s one?

Dude, the 1990s Total Recall was both hilarious and awful. It is an awful, awful movie. It's an awesome bad movie, definitely, and one that's in my top 5 Schwarzenegger films (a list that would also include Commando, Predator, T2, and Conan the Barbarian), but there is no conceivable universe where it's anything approaching a good movie.

Compare the new TR, which is just plain bad, without the hilarious honesty in the first movie.

Wait, wait wait. Why is 'The Last Action Hero' not on your list of top 5 Ah-nold movies? Say what you will, but it also had an awesome soundtrack...

On topic: 4K adoption is going to be much like Blu-ray adoption. It's a classic example of "good enough." VHS was awful. DVD was far superior. Blu-ray was overkill, and that's why its adoption was slow to take hold, especially with player prices in the high three figures.

4K TV is in the same boat. Standard def was not good. 1080p is vastly superior. For most people, 4K is overkill, and the only time it really becomes useful (at this exact moment in time) would be in a huge physical format (i.e., a 60"+ TV, or maybe the insane 110" size...) where you just flat out have a huge area to cover.

It certainly seems that our hardware development cycle has vastly outpaced the content and software ones. Without game developers and movie and television studios keeping pace, consumer demand for novelty may be the only force funding consumer technological advance.

Just as there are only a few game titles that can fully utilize the hardware of my 2-year-old Nvidia GPUs, there are only a few movies I own that might be improved by 2160p and most of those are nature documentaries.

There are so many terrible skeuomorphic graphics cluttering news and sports programming already at 1080p. Why should I upgrade for those programs when what I really want is conservation of the pixels I already see from flashy watermarks and corporate logos?

There is also the issue of "upscaling" and "remastering" the old movies that I love to watch. In many cases, beautifully shot films from the 80s and 90s look terrible in 1080p. A wide-angled sweeping shot of the highlands of Scotland looks flat and blue in the latest release of Braveheart, instead of vibrantly green and brown. It seems strange that adding more pixels to the scene somehow removed depth, but I really see the difference. In other movies the texture and substance of objects is so clear and sharp that I can instantly recognize them as props or painted foam. I believe that a lot of CGI effects, both in current films and from the previous decade, seem more believable at smaller resolutions. Without improvement of these graphics, particularly when they render objects in motion, 4K may not improve existing films at all.

Although it would be interesting to have my base overrun by fist-sized zerglings, or to walk through the streets of Whiterun with 2048x2048 textures just a foot or two from my nose, my interest isn't enough to justify the expense for such novelty.

Today, almost all affordable computer screens are 1080p, a consequence of television and computer panel production converging on the same resolution.

I want 2160p (4k, whatever) to become ubiquitous so that high-res computer monitors become the norm. My computer screens fill a much larger solid angle of my field of vision than my television ever does (both desktop and laptop). At least on a computer I have a chance of resolving 2160p from 1080p.