LAS VEGAS, NEVADA—Here at CES, every TV maker is showing off massive 4K "Ultra HD" TV sets. The high prices for this first wave of 4K TVs—Sony's 84-inch set is a cool $25,000, for instance—are going to make them strictly an early-adopter luxury for now. But digital cinema camera maker RED has the first consumer-ready solution for native 4K playback, perhaps solving the chicken-and-egg problem that threatened to plague adoption of 4K as a home entertainment standard.

Seeing is believing

The quality of image on a 4K Ultra HD TV is quite astounding in person. If you have ever noted the difference between a Retina and non-Retina display on an iPhone, iPad, or MacBook Pro, you can think of 4K as "Retina for TV." Images show a surprising level of detail—you can see every hair on the back of someone's neck, or every slat of the blinds in apartment building windows. Many of the demos on the CES show floor are jaw-dropping.

I can't help but want one.

But the massive Ultra HD resolution—which nearly matches the 4K digital cinema you might see at a newer, well-equipped movie theater—is a waste without native content. TV makers are hawking the quality of upscaling circuitry for existing 1080p content like Blu-ray. It's impressive enough, but certainly not what will drive adoption of the new standard.

Instead, owners are going to want full 3840×2160 resolution content to really make an Ultra HD TV shine. And RED says it can deliver that with its REDRAY 4K Cinema Player. The device was first announced in 2008, and we discussed how it could eventually bring 4K digital cinema technology to the home theater in 2009. In the meantime, RED's engineers have continually improved the product, building it around a 1TB hard drive instead of using discs.
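For a sense of scale, the arithmetic is simple (a quick sketch; the 3840×2160 Ultra HD figure is from the article, and 4096×2160 is the standard DCI 4K cinema resolution):

```python
# Pixel counts: 1080p vs. Ultra HD (3840x2160) vs. DCI 4K cinema (4096x2160)
full_hd = 1920 * 1080   # 2,073,600 pixels
uhd = 3840 * 2160       # 8,294,400 pixels
dci_4k = 4096 * 2160    # 8,847,360 pixels

print(uhd // full_hd)           # 4 -- one Ultra HD frame is exactly four 1080p frames
print(round(uhd / dci_4k, 3))   # 0.938 -- "nearly matches" cinema 4K, as noted above
```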

"All those companies that make their bones making shiny plastic discs have realized it's the end of the road for shiny plastic discs," RED's top marketing honcho Ted Schilowitz told Ars.

Toshiba partnered with RED to use a REDRAY player with its Ultra HD TVs in its booth at CES, so we got to see a working one under glass. The $1,450 device is available for preorder now and will ship within a couple of months, according to Schilowitz.

"All these companies with 4K panels are really interested in this, because most of them are driving their demos with these exotic server-like boxes," Schilowitz said. "There's no real consumer device to deliver native 4K content. But the REDRAY player is a very modern machine; it just connects using a single HDMI 1.4 cable and delivers full 4K resolution at data rates lower than Blu-ray."

Building a 4K world

Some observers have been skeptical that 4K technology would ever come to the home. "People keep writing that story over and over, how we didn't think 4K was coming. But at RED, we saw this coming seven years ago," Schilowitz said.

Part of the strategy involved building one of the first digital cinema cameras on the market—the RED One—and convincing filmmakers and TV producers to use them.

"We're not really scared of anything," Schilowitz said. "We built these cameras because we thought it was the right way to move to digital capture. We said HD wasn't good enough for capture, and it wasn't good enough for distribution. And a lot of big filmmakers weren't scared either, like Steven Soderbergh, like David Fincher, like Peter Jackson."

Now many major motion pictures are shot on 4K equipment. And many TV shows have moved to digital capture as well. "The producers of ER moved to RED at the end of the last season," Schilowitz said. "Now they produce Southland exclusively with RED cameras." Criminal Minds and Justified are other examples that Schilowitz cited.

Even those movies and TV shows still shot on film are typically scanned in, with all post-production happening in the digital realm, and mastered and delivered in a digital format. So loads of native 4K content already exists, though some work may be required to remaster a final cut from the original post-production work files.

Consumer ready

Soon enough, consumers won't have to go to a theater to see 4K content in all its glory. "Every manufacturer that makes panels have 4K TVs here at CES," Schilowitz said. "And these aren't just prototypes—these are devices that are planned for sale soon, and some are already on sale now."

It's not an exaggeration—Ultra HD TVs were everywhere on the show floor. Panasonic, Sharp, LG, Samsung, Sony, and others all had large, ready-to-launch Ultra HD models on display. Even HiSense, a Chinese brand perhaps best known for making the cheap HDTVs that line the shelves at your local Walmart, had a dozen Ultra HD TV models on display.

China-based HiSense, known largely in the US as a cheap "off brand," had a dozen Ultra HD TVs on display. Photo: Chris Foresman

The only problem now is getting all the 4K content being produced to those Ultra HD TVs that are already trickling out to consumers. And REDRAY is currently the only viable solution. On the hardware side is the REDRAY player. On the software side, RED offers the RRENCODE plug-in, which uses wavelet-based compression to funnel massive 4K video files down to a relatively tiny .red file.

.red files have a data rate of 20Mbps, which is "totally doable for home Internet connections," according to Schilowitz, and the large 1TB drive in the REDRAY player can buffer and cache movies for later playback. The bigger concern might be overall file size; an average 100-minute movie will clock in at about 15GB. Watch a dozen or so 4K movies over your home broadband connection and you might bump up against your ISP's bandwidth limits. That's why the REDRAY player can play back 4K .red files from an SD card or attached USB drive as well.
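Those two figures are consistent, as a back-of-the-envelope check shows (a sketch; the 20Mbps rate and 100-minute runtime come from the text, and decimal units are assumed, i.e. 1GB = 10^9 bytes):

```python
# Check the quoted numbers: a 20Mbps stream over a 100-minute movie,
# and how many such files fit on the player's 1TB drive.
bitrate_bps = 20e6
runtime_s = 100 * 60

size_gb = bitrate_bps * runtime_s / 8 / 1e9   # bits -> bytes -> gigabytes
movies_per_tb = int(1000 // size_gb)          # 1TB = 1000GB, decimal

print(size_gb)         # 15.0 -- matches the ~15GB per movie quoted here
print(movies_per_tb)   # 66 -- so the drive holds roughly 60-odd movies before overhead
```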

RED is partnering with Odemax to create a Netflix-like 4K distribution channel, though details on the service are scarce. We don't know exactly how much 4K movies will cost to rent or buy, and neither does RED. However, the service is supposed to be up and running by the time the REDRAY player ships around March or April.

Still, Ultra HD TVs are new, and it will be some time before they start filling living rooms in great numbers. But Schilowitz thinks adoption will happen faster than the transition from standard-definition to high-definition TV. "4K is great for people that can afford it now, but you're going to see price reductions every 6-9 months," he said. "The road [to 4K] might be a bumpy one, but come along on this ride with us and you'll end up with this amazing payoff."

Promoted Comments

I'm more interested in the RED codec: how it compares with H.264 or its successor H.265 (with what we can know at this time), or how complicated it is to decode/encode...

RED was not at all forthcoming about their compression technology, except to say that they have about a dozen of the smartest minds in this field that have been working on REDRAY for the last 5 years.

The proof of the pudding is in the eating, as they say, and I watched the demo reel of footage shot with RED cameras and compressed using RRENCODE playing on Toshiba's TVs directly from the REDRAY player. Multiple times. It works as advertised—one scene was a slow motion capture of some girls looking in a store window, and you could see every hair on their heads flowing smoothly in the wind. It appears infinitely more detailed than even the best 1080p Blu-ray transfer. I can only think of terms like "astonishing," "amazing," "impressive," "phenomenal," and "breathtaking" to describe the viewing experience.

There may not be much 4K content available for purchase to consumers, but that doesn't mean there haven't been plenty of movies and TV shows filmed in 4K that were just never released because of the lack of 4K TVs and devices capable of playing them. If you don't build hardware capable of supporting 4K, then you will never have any media for it because no one can play it back. This push for higher-resolution TVs solves the chicken-or-egg conundrum.

True as far as it goes. There is indeed a bunch of stuff that was shot at 4k, or shot on film and can be scanned at 4k. Red recently did a whole film festival of just 4k films. Red cameras are quite common, and they aren't the only way to shoot in 4k. Even the new GoPros will shoot 4k at a reduced frame rate, and those are just a few hundred dollars.

But the reason I know about that Red film festival is that somebody at work had gotten in and was trying to figure out how to deliver 4k. All the editing and finishing had been done in HD. Actually getting this project, which had been shot in 4k, delivered to a 4k screening proved to be an amazing hassle. As long as these projects are being finished in HD, going back to the raw material (hopefully it got archived, which is not guaranteed on many projects) and refinishing can be a very expensive thing to do. If a filmmaker did the coloring himself, maybe it's just a lot of work. But if color was handled by a separate company (very common), they won't deliver a new version for free. For an independent filmmaker, that cost may be impossible to come up with, just for some extra pixels. Ideally, it would just be a question of typing in a new render size, but in practice, something you used in making the film was working in absolute pixel sizes, so titles get shoved to a corner, or power windows in the grade show up on the wrong things, etc. So you really have to pay people to go through the whole process of redelivering a film.

In the immediate term, for a consumer who isn't going to be showing off their own content, the fact that a bunch of 4k content technically exists somewhere in the world, in storage vaults, behind closed doors, doesn't yet make purchasing these TVs a rational decision. If you can't actually watch any of the content, it's kind of a moot point. OTOH, next year it'll be $2,000 instead of $20,000. Two years from now, it'll be $200 at Walmart. Once the installed base of early adopters reaches some sort of critical mass and somebody starts broadcasting in 4k (where Netflix potentially counts as "broadcasting"), people will be much more aggressive about finishing in 4k instead of just shooting, and then the tipping point will move rather quickly. But it takes time: a few people watching their own content, a few people finishing for distribution for those few people, an installed base slowly building up. It'll take most of five years before 4k is a no-brainer the way 1080p is today.

139 Reader Comments


If the data rates and sizes are smaller than a Blu-ray, then (just an insane thought) wouldn't it be possible to enhance the Blu-ray encoding to use similar algorithms? They changed it before, after all, and the PS3 at least should be powerful enough to play back and decompress pretty much anything you throw at it.

So Sony go make something useful instead of implementing bullshit like Cinavia.

In spite of what Red is saying, there is NOT a lot of content that has been captured in 4k. Much of the film content that has been or could be scanned in 4k may not become available for some time due to the costs involved. Finally, of the content that is available or could become available in 4k, how much of it has Odemax licensed? They have made no mention of that...

The story also ignores another entrant in the market, Sony. In fact, Sony is the only company that has released premium 4k content to date [1] and they have announced a 4k download service that is due in the summer.

Streaming 15GB movies to your $1,450 player connected to your $20,000 TV when there is no content shot in the 4K format. God, I'd love to sell some stuff to the people who buy into this at this stage.

Huh? The article clearly states that there *is* a lot of 4K content. See the quote below, from the article:

Quote:

Now many major motion pictures are shot on 4K equipment. And many TV shows have moved to digital capture as well. "The producers of ER moved to RED at the end of the last season," Shilowitz said. "Now they produce Southland exclusively with RED cameras." Criminal Minds and Justified are other examples that Shilowitz cited.

Even those movies and TV shows still shot on film are typically scanned in, with all post-production happening in the digital realm, and mastered and delivered in a digital format. So loads of native 4K content already exists, though some work may be required to remaster a final cut from the original post-production work files.

As a consumer, I am very interested in this. I don't have the $26k required to get into this, but it means that I can put off upgrading my current HD TV for a while longer, and I can (maybe?) put off upgrading from DVD to Blu-ray if 4K is going to be a streaming service. If/when this drops to more reasonable levels ($1-2k for a 60" 4K TV?), I'll take a closer look.

I hope that we are allowed to do more in terms of local storage though - 1TB is ok, but that's only 50 movies. Adding a 3-4TB drive via USB will help, but I would strongly prefer to add a proper NAS to that, and have my own (legal) repository.

But in the scheme of things that is NOT a lot of content. Not when you consider the availability of content on DVD and even Blu-Ray. Granted, going forward I'd guess we'd see more content developed if this tech really takes hold.

But the question is will it take hold in any meaningful way? Given that broadcasters can barely manage to get 1080i out the door in many markets and our current bandwidths are already fairly well saturated I have to wonder if it will. My guess is that within the next 5 years this won't really become a major breakthrough technology. It'll be limited to niche markets and home theater setups barring some major breakthrough in transmission technologies.

As it sits, streaming isn't even really a viable option. My current internet is $50 a month at 12Mbps. I could opt for 20Mbps for another $20 a month. And with so many ISPs pushing for data caps on their service, I think streaming a 15GB movie isn't going to play well.

It's cool stuff but without major improvements to our data infrastructure I don't see that we have the bandwidth to support it on a mass consumer scale.

If the bandwidth and size requirements are less than Blu-ray stuff, why not re-use Blu-ray hardware that supports HDMI 1.4? It just needs a new decoder, right? I know most BR players won't be able to be updated for it or might not have the special hardware for it, but something like a PS3 could probably pull it off just fine. That leaves the current disc-based distribution network in place for people without internet connections.

While the focus is on digital files right now, disc-based media must not disappear. Why? Because two factors can always negatively influence streaming: multiple gateways to watch content and bandwidth caps from the ISP. Under my old ISP I tried and sampled Netflix (CA) for a few months. Loved it, but I quickly ran afoul of the paltry 60-gig-per-month limit set by my ISP, by another 60 gigs. And that's while keeping Netflix quality at the bare minimum to be watchable. Unlimited plans here in Canada are nonexistent from the major ISPs (serving more than 80% of the population, because they also serve TV and phone), and even among the lesser ISPs, unlimited plans are usually more than $60, a luxury for most families.

Unless we get cheap unlimited Internet, I really do not see streaming (or downloading) as a future, let alone in 4K.

The rise of 4k TVs is closer to the initial introduction of (720) HD TVs than it is to the introduction of Blu-Ray. It will be years before there is a reliable/reasonably priced distribution network for the content, but it will come.

I like the push to higher and higher resolutions because it makes "normal" gear cheaper and more accessible. What I'm not sure about, having watched a film recently in a cinema with a 4K projector, is that it is really worth the bother: I found the level of detail distracting, especially with make-up continuity on actors. I even saw a brief, small reflection with what looked like some of the camera crew in it that had escaped post-production. This was not good as far as the plot was concerned, as I found myself unconsciously picking faults from there on - that wall was cardboard, I saw the tiny wobble; the date I can just see on that magazine is from the future, etc. Once you start like that it's hard to stop. Same with really high-quality sound, you can hear cars driving past the set above the noise floor even though it's meant to be in outer space.

Would "Metropolis", "The Third Man" or even "Goodfellas" be a better experience at ultra-resolution? I'm not sure. "Transformers 4: Everything Explodes!!" might be, but even then there's got to be a limit...

I think the next level of reality is not going to be 8K or 16K, it'll be resolution independence, where the scene is decomposed into a mathematical (vector) description of the objects and lighting, then re-rendered on demand, like a sort of moving pdf. For an individual viewer, the area of greatest detail could be cued to where they're looking. And that's another issue: all that bandwidth used for stuffing 8M+ pixels 24 (or 48) times a second when our eyes can only take in that level of detail in a small part of the display seems a bit of a waste.

The real problem with current HD video, whether internet streaming or cable/satellite, isn't the resolution. It's the compression artifacts. I have to imagine at 20Mbps the compression will be fairly noticeable.

Yeah I was surprised when they mentioned the 1TB hard drive. Are you going to get 3-4 4k movies on that? Then they mentioned the compression to 15GB. Seriously? Spend that much money and then have to watch it compressed? I am sure the red file format has some fancy new compression algorithm, but I cannot imagine you would not be able to notice something on the order of an 80-90% compression rate.

Have you ever watched a DVD? Can you hazard a guess as to the compression rate it uses?
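The commenter's point holds up: heavy compression is the norm for consumer video. As a rough sketch (assuming 24-bit color and typical frame rates; these are illustrative figures, not RED's published specs):

```python
# Rough compression ratios relative to uncompressed video.
def raw_bps(width, height, fps, bits_per_pixel=24):
    """Bitrate of uncompressed video at the given resolution and frame rate."""
    return width * height * bits_per_pixel * fps

ratio_red = raw_bps(3840, 2160, 24) / 20e6   # .red stream at 20Mbps
ratio_dvd = raw_bps(720, 480, 30) / 5e6      # DVD at a typical ~5Mbps average

print(round(ratio_red))   # ~239:1 for a 20Mbps 4K stream
print(round(ratio_dvd))   # ~50:1 for DVD -- viewers have tolerated this for years
```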

.red files have a data rate of 20Mbps, which is "totally doable for home Internet connections," according to Schilowitz,

Not for most people in the US. I have a 20Mbps connection, and I may even get that speed on a good day, but I'm one of the lucky ones. Of course, I probably won't be in the market for a $20,000 TV or a $1,450 player any time soon, either.

I am really hoping that the next TV I buy has modern amenities. I do not understand why these companies focus on 3D and crap like that when TVs should come standard with voice input (via button initiation), wireless and wired Ethernet, Miracast, remote control via any device on the network with software that interfaces with it (i.e., Roku remote control via Android), standard streaming services built in (Netflix, Hulu, Amazon, YouTube, etc.), and recording capabilities, even if I have to attach my own external hard drive (or stream to my local NAS). A cheap ARM SoC with a GPU should be capable of handling the processing (perhaps offloading voice input to a web service).

I do not mind buying a new expensive TV, but we are past the point where we should be relying on external boxes to provide the content, and voice input should be standard at this point.

So long as someone, somewhere, releases a TV that is just a large panel with a tonne of inputs and outputs. I don't want a TV with "Android Inside", with Netflix, Hulu, etc, or even Ethernet. Just a screen, some speakers, and HDMI/DVI/Component connectors.

Of course it is, the kind of people buying $20k 4k TVs aren't going to buy a cheap player.

I've never had anything close to a $20K TV. But the player price seems perfectly reasonable to me. Adjusted for inflation, it's on par with what I spent on my first DVD player. (One of the Pioneer DVD/LD combo players. Hell, there were only 11 movies on DVD at the time and I had no idea if the format would take off or not.)

The obvious comparison to Blu-ray is even more bizarre considering it doesn't even use a disc, and thus features no "rays" at all. But I guess I lack the marketing genius that results in a company choosing the name "RED" (in all caps no less).

It took me until the end of the article to figure out what this "player" even does. My best understanding is that it is a box that connects to your TV on one end (over a standard cable) and the internet on the other and decompresses video files. As another poster said, why not just use a computer? Or, as someone else said, even a smartphone or tablet.

The only thing I can think of is that they are using a proprietary codec to decode a proprietary file type that only works on their proprietary hardware. I guess we all know how well that works out.

I for one am incredibly impressed by the massive amounts of information that can be stored on a "plastic shiny disc" at extremely low cost. At this point though I guess plastic discs just aren't high-end enough for a company like RED. I would have been really impressed if they had created a new disc format using something like monocrystalline sapphire or even diamond for the substrate... DIAMOND-RAY!

I hope that we are allowed to do more in terms of local storage though - 1TB is ok, but that's only 50 movies. Adding a 3-4TB drive via USB will help, but I would strongly prefer to add a proper NAS to that, and have my own (legal) repository.

I completely agree with you here... the trouble is that we're in the minority. Your average consumer wants something to "just work" and the simple act of adding settings for external storage can throw them for a loop. So even if it is not technically much harder to offer this, there is incentive for device manufacturers to do the exact opposite.


Also, don't forget the standard practice of using built-in storage space to differentiate the product, especially when it's not upgradable. I'm sure they'll be happy to sell you a 3TB version for $400 extra.

Yeah, I think it is still going to take a physically delivered medium to make it really mainstream. There are TONS of people in the US and around the world who absolutely do NOT have a 20Mbps connection.

From a personal perspective, I wouldn't trust it unless I could build a library that I am storing. Sure, I have Netflix and it is all nice and good, but I also have a few dozen Blu-ray titles (not including my .mp4/.mkv library). And I am not about to pay for a movie that I can't transcode. Yeah, lock me up, but screw DMCA and DRM when it comes to things I am spending my money on.

It allows me to pop a Blu-ray title into my Blu-ray player and watch in full high-bitrate 1080p glory, when I can put up with possibly forced previews, menus, etc. (by the way, screw you, movie studios, for forcing me to watch previews, sometimes without even being able to fast-forward them, on something I paid you $13-30 for. Screw you very much). Or I can pop it in my computer and transcode it to a high-bitrate 1080p h.264 file to quickly pull up on my media player/computer later (or 720p if I care less about ultimate quality). Or I can transcode the disc or a master mp4/m4v/mkv file down to 720p or 480p to throw it on my phone or tablet, or cram a bunch on a USB thumb drive for heading to the in-laws to bring some movies along for the kids to watch.

I am blessed with 75/35Mbps FiOS internet (if a little expensive). The best my in-laws have is 10/2 cable. At the vacation house I was staying at with the extended family this past Thanksgiving, it was 3Mbps/256Kbps DSL. Yeah, try even streaming Netflix on that DSL connection and you'll fail at pretty much anything other than SD, let alone 4k.

How long is it going to take to buffer that 15GB file on a 10Mbps connection? What, maybe an hour or two before you can start watching, and maybe 4-5hrs sucking your entire bandwidth to download it and store it for later? Even 20Mbps means you are going to have to dedicate your entire connection to streaming the movie, and I bet a lot of people don't get 20Mbps of USABLE bandwidth on their cable connections.
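Those estimates are in the right ballpark. Under ideal conditions (sustained full throughput, no protocol overhead, so real-world times will be longer), a 15GB file downloads in:

```python
# Ideal-case download time for a 15GB movie at the connection speeds
# mentioned in this thread (no overhead; real times will be longer).
size_bits = 15e9 * 8

for mbps in (3, 10, 20, 75):
    hours = size_bits / (mbps * 1e6) / 3600
    print(f"{mbps:>2} Mbps: {hours:.1f} h")
```

At 10Mbps the full file takes about 3.3 hours, and at the 20Mbps stream rate playback consumes the entire connection, just as the commenter suggests.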

I don't want to have to select my movie and twiddle my thumbs a couple of hours, or go do something else for awhile, just waiting for my movie to buffer or download in its entirety before I can watch it. Especially since 1TB doesn't exactly give you much storage space for a downloaded library (maybe 60 odd movies).

I think the REDRAY is an interesting device and it is certainly not a terrible idea, but I still don't think it fixes the delivery issue of 4k content. If BR is not up to the task, at least as physical media, then someone needs to come up with some other physical delivery medium that is. It sounds like if 20Mbps is the bitrate and you have 15GB files, Blu-ray discs should be able to handle it with aplomb, so long as the player firmware/hardware supported the file format.

Of course there is still the question of lossiness. Sure, it might look grand, but will a 50Mbps 4k file look better than a 20Mbps 4k file? Or a 100Mbps 4k file? My guess is, yes, you could probably notice at least a small improvement. Which again, if you are paying the huge bucks for 4k, you should have a storage method to deliver the highest (reasonable) quality video possible. REDRAY sounds excellent as a streaming source, at least for those with moderately good to very good internet connections. It fails for those with modest or no high-speed internet connections and isn't necessarily good for those with moderate internet connections.

We still need a physical media spec and player for 4k before I think 4k has the chance of high adoption rates (well, and of course lower costs for the TVs, but even if the prices were significantly lower, I still don't think you'd see huge adoption until you solve the physical media delivery issue).

The obvious comparison to Blu-ray is even more bizarre considering it doesn't even use a disc, and thus features no "rays" at all. But I guess I lack the marketing genius that results in a company choosing the name "RED" (in all caps no less).

I believe the original specs used a disc, but between then and now it went to a hard drive.

It took me until the end of the article to figure out what this "player" even does. My best understanding is that it is a box that connects to your TV on one end (over a standard cable) and the internet on the other and decompresses video files. As another poster said, why not just use a computer? Or, as someone else said, even a smartphone or tablet.

To smoothly decode a wavelet-encoded video stream at 20Mbps at 4k resolution, you are going to need a hell of a lot more processing power than your average computer has. I suspect it uses custom hardware rather than powering through with generic CPUs. It's also meant to be installed in rich people's home theater setups and just work, not be tinkered with like an HTPC.

A lot of you are apparently very unfamiliar with the high-end home theater market. This stuff is not like what you can pick up at Best Buy, and money is largely not an issue. The first wave of new home theater tech is for rich people. The second is for the enthusiasts and tinkerers. Then it trickles down to the people who think the most expensive TV at Best Buy is the best, then to the people who get electronics on sale at Walmart. We're still at the first step.

The proof of the pudding is in the eating, as they say, and I watched the demo reel of footage shot with RED cameras and compressed using RRENCODE playing on Toshiba's TVs directly from the REDRAY player.

I think the proof should be based on science, not on a demo.

I recently did some research on loudspeakers, and it turns out people can be deceived quite easily. Even trained listeners thought a certain €10,000 speaker sounded better than a €600 speaker when the test wasn't blind. Later, in a double-blind experiment, they thought the €600 one sounded best. Even trained listeners can be deceived by their own brains.

If RED only showed new material you never saw before on a screen you never saw before, can you really make a good judgement? Was there a direct comparison with the same material on another player? On another screen? What distance were you sitting? Especially with more distance, your brain will happily fill in some extra details if you're looking hard for them.

I can totally see the benefit of 4k in the theater, but from this article last year I was under the impression you can't even see the difference between 4k and 1080p in a normal viewing situation (40-50" TV @ ~7'). Aside from crazy home cinema setups, what's the benefit?

At this point, the axis of growth isn't "how close can you sit to your TV?" but "how big can your TV be?" Everyone's sci-fi future is wall-sized TVs with photo-realistic graphics that can replace windows. You can't stretch a 1080p picture over a 12 ft. screen because the pixels would be the size of toasters. But with 4K, 80-100" will probably become the norm.

Syon wrote:

I'm also not clear on why the REDRAY player doesn't use Blu-ray discs. The video size and bandwidth listed in the article are less than what Blu-ray is capable of, but the way the article reads, it seems (to me at least) to suggest that Blu-ray discs wouldn't work for whatever reason. Am I missing something here?

Licensed vs. proprietary technology. They don't want to pay Sony, they want to corner a new market.

I recently did some research on loudspeakers, and it turns out people can be deceived quite easily. When the test wasn't blind, even trained listeners thought a certain €10,000 speaker sounded better than a €600 one. Later, in a double-blind experiment, they thought the €600 speaker sounded best. Even trained listeners can be deceived by their own brains.

Tests are always good, but human visual acuity is far better than our aural acuity. Given a large enough TV, well-shot 4K footage is blindingly obvious.

Actually, many desktop computers today likely have displays attached that are already rather close to 4K resolution.

Doubtful; display resolutions have largely stagnated around 1080p for the last few years.

CRT monitors were able to display at least that. I have dual 30" flat-panel monitors, but they don't hold a candle to my old CRT: it had better resolution, color reproduction, and so on. The only reason I made the move to flat panels was that when my CRT finally died, I couldn't find another at a reasonable price. I'd like to know why most computer flat-panel monitors max out at 1080.

I recently did some research on loudspeakers, and it turns out people can be deceived quite easily. When the test wasn't blind, even trained listeners thought a certain €10,000 speaker sounded better than a €600 one. Later, in a double-blind experiment, they thought the €600 speaker sounded best. Even trained listeners can be deceived by their own brains.

Tests are always good, but human visual acuity is far better than our aural acuity. Given a large enough TV, well-shot 4K footage is blindingly obvious.

I have to agree. When people were fighting over Blu-ray vs. DVD, I never saw the "huge" difference people claimed there was. I saw a slight difference, but not enough to warrant buying an expensive Blu-ray player (at the time). Most seemed to be seeing what they were told they should see. When it came to computer animation, yeah, I saw a big difference, but with live action, not so much.

At this point, the axis of growth isn't "how close can you sit to your TV?" but "how big can your TV be?" Everyone's sci-fi future is wall-sized TVs with photo-realistic graphics that can replace windows. You can't stretch a 1080p picture over a 12 ft. screen because the pixels would be the size of toasters. But with 4K, 80-100" will probably become the norm.

This doesn't hold water. Your local movie theater has no trouble at all "stretching" 1080p to a 50+ foot screen.

The majority of digital cinema screens use 2K projectors (essentially the same resolution as HDTV).

To smoothly decode a wavelet-encoded video stream at 20 Mbps at 4K resolution, you are going to need a hell of a lot more processing power than your average computer has. I suspect it uses custom hardware rather than powering through with generic CPUs. It's also meant to be installed in rich people's home theater setups and just work, not be tinkered with like an HTPC.

A lot of you are apparently very unfamiliar with the high-end home theater market. This stuff is not what you can pick up at Best Buy, and money is largely not an issue. The first wave of new home theater tech is for the rich. The second is for the enthusiasts and tinkerers. Then it trickles down to the people who think the most expensive TV at Best Buy is the best, and then to the people who get electronics on sale at Walmart. We're still at the first step.

To be fair, $1,450 will also buy a hell of a lot more processing power than your average computer has.

If you are correct that the hardware in this box can do something a standard PC can't (decode a wavelet-encoded video stream at 20 Mbps at 4K resolution), then I suppose that is at least something. In fact, it seems like the interesting technical advance here is the wavelet decoding. An article that goes into the details of that would be great to read, though it sounds like the company hasn't released much information. As it reads now, it just sounds like marketing.

As far as the high-end home theater market is concerned, I admit that I am happily unfamiliar with it. I'd say that, in general, Ars rarely covers that market anyway, except to highlight interesting technology (and, of course, it's CES). Unfortunately, this story doesn't do much to make the REDRAY technology sound noteworthy, except for the price tag. That the price is mentioned alongside a $25,000 TV (when cheaper 4K TVs were also announced) seems to confirm that focus.

Considering how quickly processor speed and storage space grow, I guess I'm just surprised that it is such a big deal to display 4x more pixels than we've had for many years now. Isn't 1080p 3D already halfway there (assuming an actual 120 fps frame rate)? It almost sounds like it would be cheaper to literally buy four 1080p TVs, four Blu-ray players, and a movie split across four discs. But again, I admit I'm not the market.

At this point, the axis of growth isn't "how close can you sit to your TV?" but "how big can your TV be?" Everyone's sci-fi future is wall-sized TVs with photo-realistic graphics that can replace windows. You can't stretch a 1080p picture over a 12 ft. screen because the pixels would be the size of toasters. But with 4K, 80-100" will probably become the norm.

This doesn't hold water. Your local movie theater has no trouble at all "stretching" 1080p to a 50+ foot screen.

The majority of digital cinema screens use 2K projectors (essentially the same resolution as HDTV).

Exactly how many homes have you been to where the TV was as far away from the couch as the seats are from a movie screen?

Your flawed logic can't refute mathematics. Hyperbole aside, a 1920×1080 picture on a screen 10 ft wide would have pixels about 0.06 inches across. For a display, that's still Real Damn Big.
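For anyone who wants to check the arithmetic, pixel pitch is just physical screen width divided by horizontal pixel count:

```python
# Pixel pitch: physical screen width (converted to inches) divided by
# the number of horizontal pixels.

def pixel_pitch_inches(screen_width_feet, horizontal_pixels):
    return screen_width_feet * 12 / horizontal_pixels

print(pixel_pitch_inches(10, 1920))  # 0.0625 inches for 1080p on a 10 ft wide screen
print(pixel_pitch_inches(10, 3840))  # 0.03125 inches at 4K, half the pitch
```

A typical living-room TV has a pitch closer to 0.02 inches, so 0.06-inch pixels would be coarse by comparison even if they aren't toaster-sized.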

If the data rates and sizes are smaller than Blu-ray's, then (just an insane thought) wouldn't it be possible to enhance the Blu-ray encoding to use similar algorithms? They changed it before, after all, and the PS3 at least should be powerful enough to play back and decompress pretty much anything you throw at it.

So Sony, go make something useful instead of implementing bullshit like Cinavia.

Like their 4K-upscaling Blu-ray player that's been out since last summer? I've heard talk of 4K Blu-ray discs launching this year, but we'll see. I don't think this RED player will take off when the much bigger players have 4K plans of their own.

I'd like to know why most computer flat-panel monitors max out at 1080.

Because the manufacturers can sell 1080p panels for a lot of uses, while higher res panels are a niche market at this point.

Quote:

I have dual 30" flat-panel monitors, but they don't hold a candle to my old CRT.

High quality LCDs are just as good now, but you pay a premium over the cheap ones most people buy.

Quote:

A single 2560×1440 monitor is not uncommon, and three 1080p monitors is a pretty common setup too (5760×1080).

Common where? Maybe among graphics pros and enthusiasts, but you are going to have trouble finding them at Best Buy or in the workplace. You have to really want a higher quality monitor to get one, because there is a large price gulf.

Quote:

This doesn't hold water. You local Movie Theater has no trouble at all "stretching" 1080p to a 50+ Foot screen.

And they suffer for it. It only seems better in most theaters because the film projectors it replaced were old and out of focus.

Quote:

An article that goes into the details of that would be great to read, though it sounds like the company hasn't released much information. As it reads now, it just sounds like marketing.

It's not as much a theoretical advance as a practical one (wavelet codecs have been around for years). Video decoding has been limited by a lack of processing power since digital video began. The more processing you can throw at it, the less bandwidth you need, and we aren't up against any insurmountable walls yet. Blu-rays, at least the earlier ones, are very inefficiently encoded so that set-top boxes could decode them without big processors and fans. Now we have specialized chips that can decode H.264 in phones, so it's easy. In a couple more years they will be able to handle 4K too.

At this point, the axis of growth isn't "how close can you sit to your TV?" but "how big can your TV be?" Everyone's sci-fi future is wall-sized TVs with photo-realistic graphics that can replace windows. You can't stretch a 1080p picture over a 12 ft. screen because the pixels would be the size of toasters. But with 4K, 80-100" will probably become the norm.

This doesn't hold water. Your local movie theater has no trouble at all "stretching" 1080p to a 50+ foot screen.

The majority of digital cinema screens use 2K projectors (essentially the same resolution as HDTV).

Exactly how many homes have you been to where the TV was as far away from the couch as the seats are from a movie screen?

Your flawed logic can't refute mathematics. Hyperbole aside, a 1920×1080 picture on a screen 10 ft wide would have pixels about 0.06 inches across. For a display, that's still Real Damn Big.

I've posted this in prior discussions, but here is a chart that shows the screen size to viewing distance ratio required for someone with average visual acuity to notice the difference between the various resolutions.

When you look at the ranges shown here, you realize that the majority of people are viewing from much farther than 'ideal'. If you're already sitting too far back, then obviously you won't be able to recognize the difference that 4K provides. Those who use the THX-suggested viewing distance will notice the difference between 1080p and 4K.
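Charts like that follow from one number: a viewer with roughly 20/20 vision resolves detail down to about one arcminute, so a pixel that subtends less than that at your seat buys you nothing. A rough sketch of the calculation; the one-arcminute figure and the 16:9 geometry are the only assumptions:

```python
import math

ONE_ARCMINUTE = math.radians(1 / 60)  # approximate 20/20 acuity limit

def max_useful_distance_ft(diagonal_inches, horizontal_pixels):
    """Distance beyond which one pixel subtends less than one arcminute."""
    width_in = diagonal_inches * 16 / math.hypot(16, 9)  # width of a 16:9 screen
    pitch_in = width_in / horizontal_pixels               # physical pixel size
    return pitch_in / math.tan(ONE_ARCMINUTE) / 12        # inches -> feet

# On a 50" set, 1080p pixels blur together past roughly 6.5 ft,
# so 4K only pays off if you sit closer than that.
print(round(max_useful_distance_ft(50, 1920), 1))  # 6.5
print(round(max_useful_distance_ft(50, 3840), 1))  # 3.3
```

That squares with the "40-50" TV at ~7'" figure quoted earlier in the thread as the point where 1080p and 4K become indistinguishable.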

A single 2560×1440 monitor is not uncommon, and three 1080p monitors is a pretty common setup too (5760×1080).

Common where? Maybe among graphics pros and enthusiasts, but you are going to have trouble finding them at Best Buy or in the workplace. You have to really want a higher quality monitor to get one, because there is a large price gulf.

2560×1440 monitors are typically in the $350-1k range; they aren't ubiquitous by any stretch, but they aren't unicorns either. Yes, they're more popular among graphics pros and enthusiasts. You can pick up three 1080p monitors for under $400. I don't know how common that is in the workplace (my evidence is purely anecdotal), but my peers mostly use three monitors at work.

At this point, the axis of growth isn't "how close can you sit to your TV?" but "how big can your TV be?" Everyone's sci-fi future is wall-sized TVs with photo-realistic graphics that can replace windows. You can't stretch a 1080p picture over a 12 ft. screen because the pixels would be the size of toasters. But with 4K, 80-100" will probably become the norm.

This doesn't hold water. Your local movie theater has no trouble at all "stretching" 1080p to a 50+ foot screen.

The majority of digital cinema screens use 2K projectors (essentially the same resolution as HDTV).

Exactly how many homes have you been to where the TV was as far away from the couch as the seats are from a movie screen?

Your flawed logic can't refute mathematics.

What flawed logic? You are the one who just made unbounded claims about 1080p not working on 12-foot screens without factoring in viewing distance. That is your failure, not mine.

Then there is the factor that most people simply aren't comfortable sitting so close to a screen that watching a movie becomes an exercise in moving your head back and forth.

That tends to hold the viewing angle constant regardless of screen size, thus making relative pixel size also constant, thus making 1080p viable at any screen size for most people.

Thinking people are going to want 12 ft screens in their living rooms, and then sit close to them, just doesn't seem realistic to me. It seems like half the people I know have projectors these days, and no one sets them up close enough to warrant more-than-2K projection. It just isn't comfortable for most people to sit at such extreme viewing angles.

1080p is viable at all screen sizes because the comfortable viewing angle tends to push the majority of people out to the distance where visual acuity makes higher resolution pointless. It's a constant relationship.
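That constant relationship is easy to verify numerically: if viewers keep a fixed viewing angle, seating distance grows in proportion to screen width, so the angle each pixel subtends never changes. A sketch, assuming a 40° horizontal viewing angle purely for illustration:

```python
import math

def pixel_arcminutes(screen_width_ft, horizontal_pixels, viewing_angle_deg=40):
    """Angular size of one pixel when the screen fills a fixed viewing angle."""
    # Seat far enough back that the screen spans viewing_angle_deg horizontally.
    distance_ft = (screen_width_ft / 2) / math.tan(math.radians(viewing_angle_deg) / 2)
    pixel_ft = screen_width_ft / horizontal_pixels
    return math.degrees(math.atan(pixel_ft / distance_ft)) * 60

# Same angular pixel size whether the 1080p screen is 4 ft or 12 ft wide:
small = pixel_arcminutes(4, 1920)
large = pixel_arcminutes(12, 1920)
assert math.isclose(small, large)
```

At a 40° angle, 1080p pixels come out to about 1.3 arcminutes, slightly above the one-arcminute acuity limit, so whether 4K matters hinges entirely on how wide a viewing angle people actually choose to sit at.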

It sounds like anyone can distribute movies using this service, which is intriguing.

This part jumped out at me: "The .RED content will only play on REDRAY players; if encrypted you can specify precisely which machines it will play on from all players down to a specific one. Content can be limited to only playing once or several times and can be limited to a time of day. If you charge money for people to see your content Odemax will handle the billing and take 30 percent, leaving the content producer with 70 percent."

So, to answer my own question, it looks like the reason to buy a REDRAY player is to have access to Odemax content. I suspect the actual decoding hardware and technical details are secondary to that.

Interestingly, the REDRAY player can apparently output over four HDMI cables to four 1080p televisions, so technically you don't even need a 4K TV.

Tests are always good, but human visual acuity is far better than our aural acuity. Given a large enough TV, well-shot 4K footage is blindingly obvious.

Blindingly obvious? We have blind tests for that! :-)

Seriously, though: I do understand that large screens need more pixels. The question is: how large, and at what viewing distance? That's probably not easy to answer, since eyes aren't straightforward pixel matrices, so one-to-one pixel comparisons can't be made, I guess.