When 4k Is Not 4k

For the second year in a row, 4k Ultra High Definition was all over the Consumer Electronics Show in Las Vegas. This year, the manufacturers promise not only more 4k TVs, but (with the arrival of Ultra HD Blu-ray) some actual 4k content to watch on them. There’s just one catch: Most of the movies you’ll watch in “4k” aren’t 4k at all.

Here’s the dirty secret about the industry’s move to 4k or higher displays: The majority of modern movies are either photographed digitally at 2k resolution or have a 2k Digital Intermediate. While it’s true that some movies are indeed starting to be photographed with 4k cameras (and movies shot on film may get scanned at 4k resolution), most of them still get downgraded to 2k for the post-production workflow. The higher pixel resolution of 4k requires a big increase in bandwidth resources that most post houses can’t handle. And, ultimately, most viewers can’t tell the difference between 2k and 4k anyway.
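To put rough numbers on that bandwidth claim, here's a back-of-envelope sketch (the 10-bit 4:4:4, 24fps working format is my illustrative assumption, not something from the article):

```python
# Back-of-envelope: uncompressed data rate of a 2k vs. 4k DI stream.
# Assumes 10-bit 4:4:4 RGB at 24 fps (an illustrative working format).
def uncompressed_mb_per_sec(width, height, bits_per_channel=10,
                            channels=3, fps=24):
    bytes_per_frame = width * height * channels * bits_per_channel / 8
    return bytes_per_frame * fps / 1e6

dci_2k = uncompressed_mb_per_sec(2048, 1080)   # ~199 MB/s
dci_4k = uncompressed_mb_per_sec(4096, 2160)   # ~796 MB/s
print(dci_4k / dci_2k)  # 4.0 -- quadruple the storage and bandwidth
```

Quadrupling every stage of the pipeline (storage, network, render time) is the cost most post houses balk at.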

Think I’m exaggerating? Let’s look at some of the launch titles that have been announced for early release on the Ultra HD Blu-ray format this spring.

Here are the titles that Warner Home Video has announced:

‘The Lego Movie’ – Animated on a 2k DI
‘Mad Max: Fury Road’ – Shot in 2k, with a 2k DI
‘Man of Steel’ – Shot on 35mm, with a 2k DI
‘Pacific Rim’ – Shot in 5k, but only a 2k DI
‘Pan’ – Shot in 3k, with a 2k DI
‘San Andreas’ – Shot in 3k, DI is not listed but probably 2k

Yes, every single film that Warner plans to release on the 4k Ultra HD format is a 2k movie.

The 20th Century Fox release titles are only marginally better:

‘Exodus: Gods and Kings’ – Shot in 5k, with a 2k DI
‘Fantastic Four’ – Shot in 2k, with a 2k DI
‘Kingsman: The Secret Service’ – Shot mostly in 2k, with a 2k DI
‘Life of Pi’ – Shot in 2k, with a 2k DI
‘The Martian’ – Shot in 5k, with a 2k DI
‘The Maze Runner’ – Shot mostly in 2k mixed with some 5k, with a 4k DI
‘Wild’ – Shot in 2k, with a 2k DI
‘X-Men: Days of Future Past’ – Shot in 2k, with a 2k DI

That’s 13 launch titles from two major studios, and only a single movie was actually produced at 4k resolution (‘The Maze Runner’) – and even that one was mostly photographed in 2k. These aren’t just old movies made before 4k was possible; even major big-budget tentpole blockbusters from the past year were made in 2k, and many more will be this year and beyond.

Only Sony appears to have a genuine commitment to making movies in 4k. Here are that studio’s Ultra HD Blu-ray launch titles:

‘The Amazing Spider-Man 2’ – Shot on 35mm, with a 4k DI
‘Chappie’ – Shot in 5k, with a 4k DI
‘Hancock’ – Shot on 35mm, with a 4k DI
‘Pineapple Express’ – Shot on 35mm, with a 2k DI
‘Salt’ – Shot on 35mm, with a 4k DI
‘The Smurfs 2’ – Shot in 4k, with a 4k DI

Forget About 3D with Ultra HD

In all the hype about Ultra HD, the manufacturers and home video studios have also been careful to downplay another issue that some viewers will find disappointing. If you happen to be a fan of 3D (and it seems that fewer and fewer people are these days), you’re completely out of luck. The Ultra HD format does not support 3D. I say again for emphasis: The Ultra HD format does not support 3D. At all. Period. End of discussion. It’s not in the spec. Nobody has any interest in adding it to the spec anytime soon. As far as Ultra HD is concerned, 3D is dead.

How can this be? Why would the new, super-advanced format drop a feature that’s already available on regular Blu-ray?

The first thing you need to understand is that there is no such thing as a 4k 3D movie at the present time. Not in theaters, not anywhere. All 3D movies are 2k. Yes, this includes that special overpriced screening of ‘Star Wars: The Force Awakens’ you just saw in super deluxe IMAX 3D Laser Projection from dual 4k projectors. Even that was upconverted from 2k. Nobody in Hollywood is making 3D movies at 4k. The resource requirements are too huge. Given that the public’s interest in 3D is waning, there’s been no big push in the industry to invest in 4k 3D. That being the case, the Ultra HD Alliance decided to dump it altogether.

If you enjoy 3D and want to continue watching movies in that format, you’re stuck with standard Blu-ray.

Ultra HD Is About More Than 4k

If most of the films getting released on 4k Ultra HD Blu-ray are really 2k movies, what’s the point of Ultra HD at all? Honestly, the increase in pixel resolution from 1920×1080 to 3840×2160 is the least interesting thing about Ultra HD. At the screen sizes available in almost all home theaters, 1080p already hits a sweet spot for delivering richly detailed images with no visible pixel structure. Our human eyes are not capable of resolving much of the additional detail 4k may offer, except on perhaps the largest of projection screens. That extra resolution is more beneficial on a huge 50-foot cinema screen, but for the needs of home theater, it’s basically irrelevant.
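As a sketch of that claim, here's the arithmetic using the common 1-arcminute visual-acuity rule of thumb (my assumption, not anything in the UHD spec):

```python
# Farthest distance at which adjacent pixels are still individually
# resolvable, assuming 1 arcminute of visual acuity on a 16:9 screen.
import math

def max_resolvable_distance_ft(diagonal_inches, horizontal_pixels):
    width = diagonal_inches * 16 / math.sqrt(16**2 + 9**2)  # screen width
    pixel_pitch = width / horizontal_pixels
    return pixel_pitch / math.tan(math.radians(1 / 60)) / 12

# On a 65" set you'd have to sit within roughly 4 feet to resolve 4k
# pixel structure, versus about 8.5 feet for 1080p:
print(max_resolvable_distance_ft(65, 3840))
print(max_resolvable_distance_ft(65, 1920))
```

Sit farther back than that, and the extra pixels blur together anyway.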

Fortunately, Ultra HD brings other new improvements over regular High Definition. The most notable of these are enhanced colors and High Dynamic Range.

You may have read about how Ultra HD will offer millions of new colors that HDTVs of the past were never capable of reproducing. While technically accurate, those claims are largely overblown. The 10-bit color depth and expanded color gamut will be subtle improvements. Ask yourself: when was the last time you watched a Blu-ray and thought it wasn’t colorful enough? (Please spare me the inevitable snark about watching black-and-white movies.) Many of the new colors in the expanded gamut are beyond the range of human vision – and of those that are visible, most of today’s two-tone, digitally graded, teal-and-orange movies will never use them. However, 10-bit color depth means the elimination of banding artifacts in color gradients, a genuine limitation of the 8-bit color that standard Blu-rays are encoded with. Artifacts like that are already pretty rare, but Ultra HD shouldn’t suffer them at all, which is a good thing.
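The banding point is easy to quantify. A quick sketch (the sky gradient spanning 10% of the signal range is an illustrative assumption):

```python
# Distinct code values available per color channel at each bit depth.
levels_8bit = 2 ** 8     # 256
levels_10bit = 2 ** 10   # 1024, i.e. 4x finer gradations

# A slow sky gradient spanning 10% of the signal range:
span = 0.10
steps_8 = span * levels_8bit    # ~26 discrete bands -> visible banding
steps_10 = span * levels_10bit  # ~102 steps -> smooth to the eye
print(steps_8, steps_10)
```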

High Dynamic Range is by far the most interesting development of Ultra HD. HDR movies have much darker darks and much brighter brights than those of the past, yielding a richer, more vibrant and lifelike image. HDR projection started rolling out to theaters over the past year, and the response from viewers has been overwhelmingly positive. Now that experience is coming to the home as well.

With that said, be aware that not every movie is HDR. A movie has to be specifically graded for the extended dynamic range in post-production. So far, only a handful of movies have undergone that treatment. The very first HDR movie was Disney’s ‘Tomorrowland’, which was released theatrically on May 22nd of last year. Other notable HDR titles include ‘Inside Out’, ‘Pixels’, ‘Mission: Impossible – Rogue Nation’, ‘The Martian’ and ‘Star Wars: The Force Awakens’.

Not every movie that gets released on Ultra HD Blu-ray will be encoded in a High Dynamic Range format. (The UHD spec contains three competing HDR standards.) However, it is possible to re-grade older movies into HDR, and of the supporting studios, Warner Bros. has announced that it plans to do so for all of its Ultra HD Blu-ray releases. I’m not entirely sure how I feel about this. Re-grading a movie for HDR is a form of revisionism that the filmmakers did not intend when they originally made the movies. If those filmmakers are still alive and approve the decision, I might be interested to see the results, but I have no more interest in ever watching ‘Lawrence of Arabia’ in HDR than I’d want to watch ‘Casablanca’ colorized.

That 4k TV You Just Bought Is Already Obsolete

Sadly, the Ultra HD rollout has been a confusing mess. The UHD Alliance only just recently settled on some of these critical features, and 4k TVs purchased in the past (even many still available in stores today) may not be compatible with either the enhanced colors or High Dynamic Range. To truly take advantage of everything that Ultra HD Blu-ray offers, you need to have a display labeled with the new “Ultra HD Premium” branding.

Even then, with three competing optional HDR standards, there’s no guarantee that the HDR decoder built into any given Ultra HD Premium set will be able to decode the HDR format on a specific Ultra HD Blu-ray disc. What a disaster!

About Josh Zyber

Josh Zyber is a veteran movie and video disc reviewer from Laserdisc to DVD and beyond. He's previously written for DVDFile.com, DVDTalk.com and Home Theater magazine. These days, he wastes most of his free time managing this blog and writing the occasional Blu-ray review for High-Def Digest.

164 comments

Chris B

So if you DO own a 4K TV that was built in late 2014/early 2015, it WILL still display the new UHD format, but the only added feature will be the increased pixel count? Which you said will be subtle at best? Correct?

You just caused me to check my specs, as I JUST purchased a Sharp 4K two weeks ago. 4K wasn’t a priority, but the price was too good to pass up. Thankfully, it checks out as compatible. Thanks for the informative read.

James

Yashar mehrfar

Yes, your Samsung JS8500 is OK. It has a 10-bit panel, with inputs able to handle HDMI 2.0a and HDCP 2.2 compliance. I would highly recommend getting your TV calibrated to give you the most accurate colors. Pre-calibration, Samsung TVs are OK; post-calibration is when it will really stand out.

Zed

I know nothing about 4k, just bought my TV yesterday and now read your forum…

Is my model a good one, or should I take it back? Should I get the supreme model that was $700 more? I’m feeling a little bit screwed here… Should I wait 5 more years to buy an 8k TV? I don’t know what to do. It sounds like it’s going to be messy as far as the eye can see.

Jared Chamberlain

Josh, correct me if I’m wrong, but most Samsung, LG, and Sony sets that came out in 2014 and 2015 are fully capable of receiving a UHD signal with full HDCP 2.2 support. Also, I believe that most 2014 and newer Samsung UHD TVs, and 2015 Sonys, are able to support HDR10 with a firmware update, effectively updating the HDMI to 2.0a. From what I’ve read, Dolby Vision actually sits on top of HDR10, just like Atmos is encoded with a core TrueHD track, so most HDR10 sets should display that core HDR metadata if not the full Dolby standard. I may be misinterpreting what I’ve seen, but that’s what I’m getting from the info I’ve seen…

I haven’t followed what specific manufacturers have offered that closely. As I said, owners will have to check their models’ specs.

As for Dolby Vision being transcodable to HDR10, I hope that’s true, but even if so, it sounds like you’ll still miss out on some Dolby Vision features. Why the UHD Alliance decided to approve three competing standards is perplexing to me.

Jared Chamberlain

It’s definitely a racket of sorts. What’s hilarious to me is that, in theory, older movies shot entirely on film with practical effects will look better with a rescan than modern ones shot digitally and upconverted… Can you imagine how terrible those crocodiles will look in ‘Eraser’ when it gets a UHD release?

I made the mistake of buying the new ‘Ghostbusters’ on 4K for $10 (what a waste), but it did prove itself somewhat valuable. I purchased the new 4K masters of the original two films, and last night I was watching ‘Ghostbusters II’, which has a beautiful transfer, and compared it to the new one. The original films both look far better in 4K, and even the old effects are somewhat well delivered in the new standard.

The new film just looked exactly like its Blu-ray does… and still sucks.

Richard

I believe that only 2015 Samsung UHD models (and older models that have an upgraded SEK3500 OneConnect box) have received the HDMI 2.0a upgrade. However, they can still play HDR10 through streaming and USB. They will still be able to play the new UHD Blu-rays (as long as they have at least 1 HDMI 2.0/HDCP 2.2 port) – they just won’t be able to decode the HDR metadata.

As to Dolby Vision, no currently released TV can play it yet. Even the new Samsung UHD Blu-ray player coming out next month doesn’t support it. But here’s how it works for UHD Blu-ray discs:

The Blu-ray Disc Association (BDA) has mandated that any Ultra HD Blu-ray disc always start from a generic (SMPTE ST 2084 standard) HDR10 base layer (which requires an HDMI 2.0a input) and, if the content provider so desires, a proprietary Dolby Vision layer can then ride on top of that.

Stated differently, all Ultra HD Blu-rays will include basic HDR10 metadata while some will add Dolby Vision HDR metadata that will ride on top of the basic layer.
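That mandatory-base-layer rule can be sketched as a toy decision function (the function and names are mine, not from any spec):

```python
# Toy model: every Ultra HD Blu-ray carries an HDR10 base layer;
# Dolby Vision, when present, is an optional enhancement on top.
def pick_hdr_stream(disc_has_dolby_vision, player_supports_dolby_vision):
    if disc_has_dolby_vision and player_supports_dolby_vision:
        return "HDR10 base + Dolby Vision enhancement layer"
    return "HDR10 base layer only"

print(pick_hdr_stream(True, False))   # falls back to the base layer
print(pick_hdr_stream(True, True))    # full enhancement chain
```

The upshot: a player that only understands HDR10 still gets a valid HDR picture from every disc.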

Richard

Well, for most current TVs, I’d say “not too much” – not that we would be able to notice anyway.

What I mean by that is that the “extra features” that Dolby Vision bring to the table need much more “capable” TVs than are currently available to the consumer market (such as TVs with much higher brightness/nits levels and so on).

Dolby Vision uses 12-bit metadata (sent through a 10-bit pipe) and offers support for a peak brightness up to 10,000 nits, as well as support for legacy hardware (backward compatibility).

It also uses what it calls “dynamic metadata”.

Dynamic metadata, which Dolby developed, sits on top of the static metadata, and is a scene-based set of metadata that is regenerated for every single scene. Every time the scene cuts and there is an edit, there is a new set that informs the display management engine inside the television how to best map the color volume of the source content, where the color volume is the combination of the dynamic range and the color gamut.
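As a toy illustration of why per-scene metadata matters (the numbers and the naive linear mapping are my assumptions, not how Dolby Vision actually tone-maps):

```python
# A dark scene tone-mapped with movie-wide (static) vs. per-scene
# (dynamic) peak luminance, using a deliberately naive linear map.
def tone_map(pixel_nits, mastering_peak, display_peak=600):
    return min(pixel_nits, mastering_peak) * display_peak / mastering_peak

movie_peak = 4000   # one explosion somewhere hits 4000 nits
scene_peak = 120    # but this particular scene never exceeds 120 nits

print(tone_map(120, movie_peak))   # 18.0 -- crushed by the static peak
print(tone_map(120, scene_peak))   # 600.0 -- uses the display's full range
```

With one static value for the whole film, every dark scene pays for the brightest frame in the movie; per-scene metadata lets each cut use the display optimally.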

C.C.

By moving forward we are going backward. It will be hilarious when the 8k ‘Guardians’ looks like a freakin’ soap opera on steroids – and they will scramble to apply the film-look process to undo all of the “video look” that they paid millions for.
When you are pushing up a hill with technology, you have to realize when you have hit the top, because any extra is just going back down the mountain on the other side.

njscorpio

I feel that the future for 3D entertainment is not with 4k, but with Oculus Rift (and the similar products being released). I’d rather invest in a Rift, and a PC to run it, than in a 4k display & player, but I know we are talking apples and oranges now.

Deaditelord

I am a fan of good 3D too, NJ, but I’m disappointed by this news. There are two aspects of 4K that interest me: the noticeable improvement of passive 3D, and HDR. With 3D pretty much dead in the home market now (sigh… here’s hoping glasses-free takes off in the future), all that’s left is HDR, and even that feels like a crap shoot with there being three competing standards. I hadn’t considered the Oculus Rift as a 3D substitute, but I’m somewhat doubtful that the images produced inside the Rift will be comparable to what can be displayed on a high-end HDTV. It certainly won’t sound as good as a home theater system with Dolby Atmos. Besides, I’m not convinced VR is going to take off. The minimum PC requirements for VR demand a very expensive computer, and computers that barely reach the minimum spec typically don’t run all that great. Factor in having to drop another $600 for the Rift itself, and that seems like a recipe for failure.

Todd A.

I don’t know where people are getting the idea that home 3D is dead. So far, all the premium 4K OLED sets worth owning (if you’re going to want 4K and HDR, OLED is the only way to go) support 3D. They aren’t touted as “3D TVs” in their names anymore, but they do support 3D. They make existing 3D Blu-rays look even better than current 1080p sets, since you don’t have to halve the resolution on passive 3D displays. 3D is so cheap and easy to support in any modern TV that they don’t advertise it as a premium feature any more than they’d tout “now with color,” so that might be why it appears to have less support.

The other thing is that Blu-ray isn’t the only source for 4K movies. There’s no reason a VUDU or other service can’t add 4K HDR 3D movies for digital download at trivial cost. UHD Blu-ray is already a collection of too many standards for hardware manufacturers to implement in a cheap SoC, so they probably just said, “Let’s stop now or we’ll never get this thing finalized.”

This is why physical media is on its last legs. “But streaming looks like crap!” I hear you say. That was true at first, but bandwidth and codecs improve constantly while Blu-ray hardware is pretty much set in stone. Many fiber customers already have better throughput than the maximum 80 Mb/s that UHD Blu-ray allows, and that number of people gets larger each day. More efficient codecs can be deployed at will for digital downloads, since anyone downloading a movie can also download a software update, while adding anything new to Blu-ray is a years-long battle and a potential customer-support minefield with standalone players. The 100GB discs it took them a decade to standardize are too little, too late when I can buy a 128GB flash drive for $30 and use it over and over to transfer a movie from a Redbox kiosk. And if they didn’t fix the slow load times of Java menus on existing Blu-rays, I want no part of UHD Blu-ray.

eric

Paulb

It is dead in the sense that it isn’t included in the UHD Blu-ray format and the market for 3D isn’t growing (and it never had a chance, given that no one rented 3D discs and the people selling 3D TVs never made it clear to consumers that 3D Blu-ray was the only real way to get it). VUDU’s 3D isn’t promoted, and it is only VUDU. You need Apple, Netflix and Amazon to support it if it is going to have a chance. What you are talking about is simply reasons why it doesn’t need to die, but without studios, hardware makers, and streaming providers promoting it (which they aren’t), it isn’t going to take off.
And while a 4k 3D TV delivering full 1080p per eye with passive glasses is exciting, the problem is that you just bought a fancy UHD TV and now have to make a choice on buying discs. The UHD titles are shipping with 2D Blu-ray. You either have to pay $50+ per movie for both discs or you have to choose to drop one.
The only hope 3D movies have is with the future of VR goggles, and that will take a few years to settle and become mainstream. The display tech to give similar resolution (you’d need pixel density more like 8k per eye at that distance to match a 4k TV), plus HDR, color, etc., is potentially further off.

Deaditelord

Three HDR standards?! I was not aware that there were competing HDR formats. What a complete cluster****. Unless each HDTV can decode all 3 HDR formats (unlikely), we’re forced to settle on the one HDR format our 4K HDTV supports and just hope that the studio decides to encode the UHD with it. If it doesn’t, we’re SOL and that 4K Blu-ray is essentially no better than the current Blu-ray on the market. I’ve always been one to embrace technological advances, but UHD feels more and more like a step sideways (or even backwards if you count the loss of 3D playback), a format whose existence is entirely based on bilking consumers out of their money under the pretense that UHD movies will look better than regular Blu-ray. That’s clearly not going to be the case in many situations. Here’s hoping that UHD fails and fails hard.

Pedram

It’s really sad that most of the UHD BDs will basically contain upscaled content. It’s reminiscent of the low-quality Blu-rays where they just took the DVD masters, upscaled them to 1080p and called it a day. How are they getting away with this? I know it’s about more than just resolution, but when you’re selling 4k mostly on resolution (to the average person, at least), it’s very deceptive to not really give them proper 4k (yes, technically “UHD”) content.

Also, it seems kind of silly to make a 4k DI of Maze Runner when it was shot mostly in 2k and it would only benefit a few scenes they shot in 5k. Does it make any difference to upscale when making the DI rather than upscale the DI when making the disc?

And here’s to holding out for a possible 3D addon to UHD way down the line. I’d even take an upscaled 2k 3D DI if it meant that they could also add more colours, HFR, HDR and newer sound formats (e.g. Atmos) to the disc.

Timcharger

Elizabeth

If it makes you feel any better, during that brief period when they tried to release high resolution audio on disc, I bought a DVD player capable of playing both DVD Audio and SACD. Then both formats failed.

Timcharger

In keeping with the spirit of helping others,
by steering you away from my choices. I should inform you all
that I just upgraded my receiver for Atmos capability. Now, I
tried to game against the my fate by selecting a receiver that
will have DTS:X capability by software update later this year.
So I hedged by owning both formats.

If the Fates are merciful, Atmos and/or DTS:X will survive. But
there is a part of me that thinks my kiss of death will mean
that both will now fail. You’ve been warned.

C.C.

Bill

Uh, Tim. You are aware, I hope, that it looks like Dolby Atmos going forward will only be on Ultra HD BDs and rarely on 1080p BDs? I’m not sure if that also applies to DTS’ new flavour, by the way.

I agree with Josh. 3D was badly botched by jumping the gun on hardware/software. You’d think the manufacturers, suppliers etc. would have learned from that fiasco and not introduced anything until the details had been settled.

Deaditelord

I’m glad HFR never caught on. I had a chance to watch ‘The Battle of the Five Armies’ in HFR and it looked really strange, like someone was fast-forwarding the movie while the audio stayed normal. Even after two hours of watching, the HFR seemed unnatural.

Shannon Nutt

I’m heavily invested in 3D and my 3D Blu-ray collection (unlike many, I LOVE the look of 3D on home video), so I’m holding out as long as possible for any 4K upgrades. I suspect that the format will flop…I think 4k TVs will catch on, but most will use them for 4k streaming, not buying physical product.

Csm101

I’m pretty balls deep in 3D myself, and I’m OK if they don’t want to make 4k 3D at this point, but I need them to still make 3D-capable TVs in case mine craps out. That has me a bit worried for the future. It would’ve been nice to be able to watch some of my longer 3D movies on just one disc, if there were a way to utilize the extra disc space to fit at least 1080p 3D on a 100-gig disc.

Deaditelord

That’s where I’m coming from too. I want to buy another 3D TV once HDR HDTVs become available at a more reasonable price, but I’m concerned that there won’t be any available, since companies are starting to move on from 3D (Vizio has dropped 3D support).

Bill Tullis

WAY TO GO, JOSH, FOR EXPOSING THE TRUTH!!! The industry, to its shame, relies on consumer ignorance of advanced technology to promote its wares. 4K is the new slogan to suck us in while most will never realize that there must be 4K production throughout the entire chain in order to see a true 4K image on one’s display. Thank you, Josh, for your explanation of this emerging technology. For those interested in similar mythbusting in the audio realm, read the posts by Dr. Mark Waldrep of AIX Records (Dr. AIX) at http://www.realhd-audio.com/ to understand how the music industry repackages its catalog as “high resolution” products with little to no discernible improvements in sound quality.

itjustWoRX

I have absolutely no desire to adapt to 4k. This isn’t anything close to the switch from DVD to HDDVD/Blu. If they start pulling crap with “4K exclusive yadda yadda,” it will just push me further towards switching to a streaming catalog versus a physical collection.

I believe the idea is that when people change or upgrade their older TV, 4k will be their next step, as the TVs are so cheap. The KU6500 curved Samsung, which is the base level of the series and still beautiful with a ton of features, retails for $697. Walmart even had a “fake 4K” Philips 55″ on Black Friday for $299 (it streamed 4K but could not receive 4K content – found out the hard way).

There is no huge jump like there was when we first embraced HDTV and then Full HD at significant cost. I believe I paid $3000 for my first HDTV and another $2600 a couple years later for 1080p when it became available (might have been more), whereas now I can buy a very high-end 65″ Samsung with all the bells and whistles for roughly $1500, and an excellent 4k in the same size range for around $500-700.

The Xbox One S is a wonderful 4K Blu-ray player and retails at $250 with all the additional functionality. So when I felt that my Sony Bravia, which has served me well for many years, was starting to get a little long in the tooth and its apps were lacking by today’s standards, the jump was seamless for me. Pay a little more than half what I did for the Sony, get an amazing picture and sound and tons of apps. So yeah, I would say if you have a nice 1080p setup and it isn’t any issue at this time, there’s no reason to change, but I couldn’t find a reason not to step the tech up when it was time to buy a new one, as 90% of the 1080p sets I found in the price range had fewer features and none looked as good as the SUHD with Quantum Dot technology.

Charles M

Pedram

No, not exactly. That’s what I initially thought, and wondered why we need new TVs when regular monitors can display HDR photos just fine. Basically, from what I understand, regular TVs can’t show good detail/colour in bright and dark spots at the same time, but HDR TVs, given the right content, will be able to.

Just as an added note, photo HDR doesn’t have to be so controversial. It can be done right, to show more detail and fewer blown-out areas (closer to what you’d see with your own eyes), or it can be overdone to produce hyper-realistic images (which you’d never see with your own eyes, unless you were tripping out). The latter is where it gets more controversial, I’m guessing.

WebDev511

I just replaced the DMD chip in my 2007 Samsung LED-driven DLP set, so this just reinforces what I was going to do anyway, which is hope I can get another 8 years out of my DLP. I figure by that time 4K OLED will have dropped in price, and movies will have gotten much cheaper too. I’ll let someone else pay to beta test 4k.

The 2D version of the movie had a 4k DI. That DI had to be downsampled to 2k for the 3D conversion. There is no production workflow for 4k 3D in all of Hollywood. That requires way too much processing power and bandwidth.

Sadly, only the JS9500 line will receive and display at HDR10. The other sets are compatible with a firmware update, but they won’t necessarily display at that luminosity and are not FALD from what I understand.

cardpetree

I know. The whole thing, especially with three competing HDR standards, is pretty damn confusing, but essentially it’s about color depth – about delivering a wider array of colors, deeper blacks and cleaner whites.

It requires a combination of peak brightness and black level, either:
More than 1000 nits peak brightness and less than 0.05 nits black level or
More than 540 nits peak brightness and less than 0.0005 nits black level

The first part is for LED LCD TVs, the second is for OLED TVs.
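Dividing peak brightness by black level shows just how different those two certification paths are:

```python
# Contrast ratios implied by the two Ultra HD Premium paths
# (peak brightness in nits divided by black level in nits).
lcd_contrast = 1000 / 0.05     # 20,000:1 for bright LED LCDs
oled_contrast = 540 / 0.0005   # 1,080,000:1 for OLEDs
print(f"{lcd_contrast:,.0f}:1 vs {oled_contrast:,.0f}:1")
```

LCDs qualify on sheer brightness; OLEDs qualify on near-perfect blacks.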

In fact, even the JS9500 doesn’t quite make it to 1000 nits (but it gets very close).

The JS8500 maxes out somewhere between 500 and 600 nits.

Now this does not mean that the JS8500 you own is obsolete, just that it won’t deliver as good an HDR experience (the specular highlights won’t be as bright) as the approved “Premium” certified sets.

The UHDA minimum specs for HDR relate to TVs being “eligible to license a ‘Ultra HD Premium’ logo from the UHDA for promotional and marketing purposes”. (See quote below.)

“Products and services which meet the performance metrics will be eligible to license a ‘Ultra HD Premium’ logo from the UHDA for promotional and marketing purposes.”

It is not about “controlling”/”regulating” the production, distribution, and viewing of HDR content.

All HDR ready/capable TVs that do not have the “Ultra HD Premium” designation will still be able to play HDR content regardless of whether or not they meet the minimum UHDA HDR specs for the “Premium” certification. Including the JS8500.

Richard

David Staschke

Honestly, this is getting ridiculous. I just watched a DVD on my 65″ HDTV and it looked fine. The movie was ‘Starred Up’, and it’s not on Blu-ray, so I rented the DVD and was perfectly happy with the way it looked upconverted on my PS3. Blu-ray was a much-needed upgrade in A/V quality, but 4K is just a marketing gimmick to lure suckers into wasting money. Any older movie shot on film and scanned in 2K or 4K for Blu-ray looks great. As long as the video is mastered correctly, I’ve never watched a Blu-ray of a modern movie and thought, “This needs improvement.”

Justin

I didn’t notice if this question has been asked yet; if it has, I’m sorry.
We get that 4K/UHD will not support 3D. But is it safe to assume that the 4K/UHD machines will still be able to play our regular 1080p 3D Blu-ray Discs?

C.C.

I just recently upgraded to a 4K Samsung 65″, replaced my receiver with an Onkyo capable of receiving Atmos and TrueHD, and upgraded my speaker arrangement to encompass Atmos, including Pioneer Elite Atmos fronts and a Definitive Technology Mythos Seven center channel. It seems that my setup is currently compatible with any of the changes, though I assume it will be obsolete sooner or later. Everything is also 3D compatible, though glasses must be worn.

All I know is that the combined picture and sound, though naturally smaller, will rival anything I’ve seen and heard in the theaters. The 3D is awe-inspiringly clear, and the sound envelops you front, back, left, right, and now up and down. An action movie can be so engrossing that there’s an actual feeling of exhaustion at the end.

Like any other technology, if we wait for the latest updates/upgrades, we’ll never jump in. It’s sort of like computers… grab what’s best now and don’t worry about tomorrow. Like everybody else – including the experts – I’m confused by the choices, yet a little bit of research and smart shopping has led me to a system that is absolutely beautiful and that integrates through ACC in miraculous ways. Instead of grousing… I love the changes, confusing as they are.

Richard

I suspect that someone will take up a class action lawsuit over this topic before too long.

As a video professional, I have kept clear of 4k as much as possible. I haven’t even upgraded my home system yet. The main reason: the broadcast standard for UHD/4K was only very recently approved (and they are still arguing about it). Therefore, very likely none of the current UHD/4K TVs will receive or decode over-the-air UHD/4K broadcast signals without an additional box or some kind of upgrade (does this sound familiar?). TV stations – many of which just upgraded to HD equipment – are not going to want 4k for years. Upgrading cameras, studios and transmitters isn’t cheap – and Netflix, Amazon and others are taking their audience, so they are struggling to get advert dollars. For my business (local weddings, sports, corporate video), NOBODY is asking for UHD/4K content. Just a few have asked for HD .MP4 files. Nobody asks for Blu-ray.

Here’s the thing: good 4k requires obscenely expensive cameras; editing computers require lots of expensive horsepower, lots of storage (more money), and backup storage (more money). I absolutely love what I have seen of (pseudo) 4K, and Dolby Atmos is amazing… but until the content creators and broadcasters catch up (with sports broadcasts especially), UHD/4K will continue to spin its wheels in the sand. Then, of course, there will be 8K…

Ross

This just solidifies my reasons not to upgrade. I’ll never say never, but the odds are I won’t. I’ve been into home theater since I was a kid buying VHS tapes, then of course DVD… blah blah blah, I’ll stop. But seriously, I have close to 1000 BDs, and studios can’t get transfers right on standard BD. Do we expect that they will get it right with this format? How many years did it take to get a good transfer of ‘Goodfellas’? We finally got one last year. HDR was the only feature I was really interested in, and now, reading this article, I’m concerned. I think the possibilities for new films are great, but the thought of the studios altering classics is very concerning. I was looking at upgrading my 1080p projector and was holding off to see pricing for 4K. After reading this, I think I will go ahead and upgrade to a new 1080p projector. It will take many years for this mess to get sorted out. For now, my Oppo 103 and Epson projector are perfect for my needs. After all, I spent a lot of money and many hours calibrating my projector to maximize its accuracy.

Netflix claims that certain newer series are streamed in 4K. Having just binge-watched House of Cards, it was as clear as if it were a Blu-ray, discernibly clearer than titles not being sent in 4K. Is it “truly” 4k? I have no idea, but the picture is an absolute “wow” and the sound is crystalline. Amazon does not offer this yet.

Elizabeth

Actually, Amazon offers 4K with HDR, which is a step above what Netflix is offering (no HDR). I’ve watched episodes of The Man in the High Castle and Mozart in the Jungle on my LG OLED in 4K with HDR. I believe other series are in 4K as well, though I don’t think all of them offer HDR. But I would like to know whether they were filmed and completed in true 4K. They’ve looked pretty fantastic, but for what I paid for that TV, a test pattern should be beautiful enough to induce orgasm.

This article is the whole truth. We don’t need 4k. We just need higher bandwidth and more colors. All movies in cinemas are 2k, and we never complain about resolution there. But a cinema movie runs at 250 Mbit/s, 12 bits per color channel, in the XYZ colorspace.
Blu-rays are 8 bits per channel in limited range, that is, luminance values 16-235. We need the Rec.2020 colorspace to get closer to cinema color. We don’t need an upsampled 4k image. We want movies shot at high resolution but finished right at 2k with maximal quality. It’s the same as digital music: DSD and 96 kHz/24-bit are good for studios, but 44.1 kHz/16-bit is enough for playback.
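To put rough numbers on that bit-depth point, here is a quick sketch (Python) of how many usable luma code values each format allows. The 16-235 range is standard 8-bit limited-range video; 64-940 is the corresponding 10-bit video range.

```python
# Usable per-channel luma code values in limited-range video.

# 8-bit "video range": luma is coded 16-235 (values outside are reserved).
steps_8bit = 235 - 16 + 1          # 220 usable code values

# 10-bit video range: luma is coded 64-940.
steps_10bit = 940 - 64 + 1         # 877 usable code values

print(steps_8bit)                          # 220
print(steps_10bit)                         # 877
print(round(steps_10bit / steps_8bit, 2))  # 3.99 -> roughly 4x finer gradation
```

Roughly four times the tonal gradation per channel, which is exactly the kind of improvement (fewer banding artifacts, more colors) that matters more to the eye than extra pixels.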

Before this new push for something to market after the great 3D flop, we always referred to resolution in terms of the number of vertical lines displayed. Most of us should remember this. 480p actually included more vertical and horizontal resolution, I think 520 x 854, but because those extra lines were generally not displayed on the then-new 480p TVs, they were not counted.

1080p High Definition sounds much better than “1K,” but that’s what it actually is. 2K is simply a doubling of 1K resolution, and 4K a doubling of 2K. UHD is 2K. When the idea of going to the next level beyond 1080p was first bandied about, for a few months people referred to it as 2K. Then, because that didn’t sound sellable to consumers (who were understandably sick of buying and rebuying all their movies and the equipment to play them on), the term was changed to the more sellable, non-quantitative “Ultra High Definition.” But after having heard about 480p (!) and 1080p (!!!) for so long, people wanted a quantification, so the little white lie of calling 2160 x 3840 “4K” was invented. The justification that seems to have kept the manufacturers out of court is that you are getting double the resolution on both axes, horizontal and vertical, thus “4”K. Which is a flat-out misleading label invented for sales.

So I find it rather humorous that we are talking about things being mastered in 2K, and not up to snuff with our “4K” TVs and projectors, which would benefit most from “true” 4K masters, when in fact our sets are all 2K displays that could never render a true 4K image without scaling it down by 50%.

The bigger concern I have is that UHD programming will be trimmed by approximately 10% in order to fit bandwidth limitations, necessitating a loss of image information, and the same mathematical rounding and lack of pixel for pixel accuracy that drove us from 480p to 1080p in the first place.

Wouldn’t it be nice if manufacturers would present their products honestly, and move us forward without creating a loss of quality we will have to fix somewhere down the road – no doubt creating another sales market. “Now watch FULL 4K! Get all the pixels you’ve been missing, WITHOUT the image softening and motion blur!!” Just dip into that wallet… again…

Omar

Elizabeth

So if all these movies are 2K and UHD is in truth 2K despite being called 4K, doesn’t that mean that most of the article gets tossed out and we will be getting an actual 4K (in TV marketing lingo) and not a 2K image upscaled to 4K?

Or for simplicity: 2K DI = “4K” UHD Blu-ray
(both would have the same pixel dimensions ignoring black bars needed for aspect ratio).

You’re working under a few misconceptions and inaccuracies here. The actual relevant numbers are:

Digital Cinema “2k”: 2048×1080
High Definition Home Video: 1920×1080

Digital Cinema “4k”: 4096×2160
Ultra HD Home Video: 3840×2160

The differences between the cinema and home video numbers come down to the fact that the aspect ratio standard for digital cinema is 1.89:1 while the aspect ratio standard for home video is 1.78:1 (a.k.a. 16:9). Although the home video numbers are slightly smaller than the cinema numbers, the difference is not really significant.
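The aspect-ratio arithmetic behind those four pixel grids is easy to verify; here is a minimal sketch in Python:

```python
# Aspect ratios implied by the cinema and home video pixel grids.
formats = [
    ("DCI 2k", 2048, 1080),   # Digital Cinema "2k" container
    ("HD",     1920, 1080),   # High Definition home video
    ("DCI 4k", 4096, 2160),   # Digital Cinema "4k" container
    ("UHD",    3840, 2160),   # Ultra HD home video
]

for name, w, h in formats:
    print(name, round(w / h, 3))
# DCI 2k 1.896, HD 1.778, DCI 4k 1.896, UHD 1.778

# Going from the DCI container width to the home video grid
# only drops a small slice of the frame:
print(round((1 - 3840 / 4096) * 100, 2))  # 6.25 (percent of width)
```

In other words, the cinema and home grids share the same height per tier and differ only by a 6.25% crop or scale in width, which is why the distinction rarely matters in practice.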

Yes, it’s true that the cinema master has to be scaled down slightly for home video, but the concerns about scaling and a lack of 1:1 pixel mapping are not particularly relevant in the modern age. The fact is that the image is already scaled by the time you see it in a cinema anyway. The digital cameras used to shoot theatrical movies are higher resolution than the cinema projection standards. For example, the Arri Alexa camera (one of the most popular in the industry) has a 2.8k sensor, but is used to create a 2k Digital Intermediate. Some movies are shot with 4k or 5k cameras and yet still scaled down for a 2k DI. The extra resolution during image capture allows the filmmakers a little latitude to adjust their framing and composition in post-production by sliding the 2k window up, down or side to side if they don’t like the way the framing came out during photography.

Any movie you watch today has already been scaled at some part of the production chain. Both the HD and UHD formats are high enough resolution, and modern scaling algorithms are advanced enough, that image softening isn’t often a serious concern anymore.

OmarF

I agree with you that there is little practical difference between 2048×1080 and 1920×1080. But since when did either become 2K? The relevant number is still the vertical resolution, which is 1080 in both cases, or 1K. To be clear, “K” is the metric abbreviation for “1000”, which for all intents and purposes is what 1080 is: 1000 lines of vertical resolution. Whether we tack another 100 or so lines of horizontal resolution on or not is irrelevant.

Something very manipulative has happened in the way the industry is currently defining resolution, and it’s due to marketing and wanting to give people the perception they’re getting 4 times what they had before, when in fact, they are not. They are getting double.

We need first to agree on the fundamental point that it is VERTICAL lines of resolution that are used to define the resolution of an image, and have been for decades.

We called 480p, “480”, because of its 480 vertical lines of resolution.
Same goes for 720p—720 lines of vertical resolution.
Same even goes for 240, which was the VHS and previous television standard.
We called 1080p, “1080”, because we were counting vertical lines of resolution.

The same thing applies to what you wrote here:

Digital Cinema “4k”: 4096×2160
Ultra HD Home Video: 3840×2160

Flipping the numbers around and placing Vertical resolution at the end (2160), and Horizontal resolution at the beginning (3840 or 4096), does not suddenly, magically, make your resolution 4K. Both of these are STILL 2K, because, again, video resolution has always been defined by lines of Vertical resolution, which in both cases is, 2160. 2K. 2000.

I’m sorry Josh, but you’re the one working under a misconception, here, and I get the feeling there are a lot of other movie fans who’ve been bamboozled, as well. It appears that the consumer market has been brainwashed and reprogrammed with new marketing nomenclature, and manipulated into believing a total lie as the new standard.

Now suddenly, just because we flip V x H, to H x V, we have double the former resolution? Obviously, in widescreen format, the horizontal resolution is almost double that of the vertical. So if you flip the expression 1080 x 1920, around to 1920 x 1080, you can give the illusion of having “2K” resolution, but that is a cheap cheat, probably snuck in by marketing strategists. I call foul on this, based on decades of precedent. 1080 was always 1k, and doubling it to 2160 will only give you 2k, which is what UHD actually is.

Just because frame *aspect ratio* has commonly been referred to as Horizontal x Vertical, 16×9, does not mean that is how we count *screen resolution*. They are two different topics. Again, screen/pixel resolution has always been determined by counting VERTICAL lines of pixels.

UHD is 2K. There is no true 4K for the consumer market, outside the imaginations of the consumers, and the marketing schemes of manufacturers desperate for something HUGELY newer and bigger to sell.

Even in your statements above, look how you’re talking about a “2k Digital Intermediate”. By your own definition (2048 x 1080), it’s nothing but a 1080p image with 128 extra lines of horizontal resolution tacked on, and not 2K at all.

Can you find any examples of any reputable person in the industry talking about 1080p as “2K” at the time of its inception? I followed the arrival of 1080p very closely from around 2000 until its eventual descent into the mass market, and I don’t recall ever ONCE hearing it referred to as 2k. This is something new.

Regarding 1:1 pixel mapping.
I was not referring to the downscaling of the original cinema master for UHD mastering, as you seemed to think I was. I was referring to significant pixel loss during actual playback between a UHD source device and a UHD television. Joe Kane talked about this at length in an AVS interview a few years ago. Because of the bandwidth limitations of HDMI, and fear that consumers would balk heavily at AGAIN having to repurchase all their equipment to get the video bandwidth needed to transmit the full video signal of UHD along with full audio, the decision was made to trim the source material by 10 or 12% to allow for passage through the insufficiently narrow HDMI 2 bandwidth. The content on UHD discs will actually be trimmed before passing to your television. Then either your player or TV has to rescale (read STREEETCH) the downgraded image to fit its full array of pixels. There will certainly be a loss of clarity due to this process, to say nothing of the border information in your picture.

This kind of crap is exactly why videophiles jubilantly switched from DVD to Blu-ray. No more 3:2 pulldown, no more motion blur, jaggies and scaling artifacts. A true 1:1 pixel presentation! Now shot right back to hell by the new format.

For my eyes, the best use of my Samsung 9500 is as a perfect line doubler for my Blu-rays. Fed by an Oppo 105D, with a little extra depth and sharpness added by Darbee, I get a rich, dense image every bit as nice as the 4k material I’ve sampled on Netflix and Amazon Prime.

Richard

Digital Cinema 2K (and now 4K and 8K) has ALWAYS been based on the HORIZONTAL resolution, unlike TV resolutions, which count the vertical. And NO, the numbers have NOT been reversed or “flipped” either.

As to your comment about the 2160 resolution being merely double the 1080 resolution:

1920 x 1080 = 2,073,600 pixels

3840 x 2160 = 8,294,400 pixels

8,294,400 divided by 2,073,600 = 4

A UHD 2160p TV has exactly 4 times the amount of pixels as an HD 1080p TV.
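That arithmetic is easy to verify in a couple of lines of Python, which also makes the two different "doubling" claims explicit:

```python
# Total pixel counts for HD vs. UHD.
hd_pixels  = 1920 * 1080   # 2,073,600
uhd_pixels = 3840 * 2160   # 8,294,400

print(uhd_pixels // hd_pixels)      # 4 -> four times the TOTAL pixels
print(3840 // 1920, 2160 // 1080)   # 2 2 -> but only double on each axis
```

So both sides of the argument are describing the same grid: double the resolution per axis, four times the pixel count overall.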


Welcome to The Bonus View presented by High-Def Digest!

This blog serves as a catch-all for topics, beyond Blu-ray, that interest us as home theater junkies including movies in theaters, high-definition gear, television shows, video games and 3D programming.