Joint Resolution: A Superior TV Viewing Experience with UHD and HDR

With all the hoopla surrounding “4K” UHD TVs over the past year or so, another related term has received increased attention: High Dynamic Range (HDR). The casual consumer might think that UHD TVs and HDR TVs are one and the same, but they’re not. Note that HDR for TVs is quite different from HDR for photos. As CNET’s Geoffrey Morrison notes, “TV HDR: Expanding the TV’s contrast ratio and color palette to offer a more realistic, natural image than what’s possible with today’s HDTVs. Photo HDR: Combining multiple images with different exposures to create a single image that mimics a greater dynamic range.”

4K TVs have twice as many pixels horizontally and twice as many vertically as full HD (2K) TVs—hence four times the spatial resolution. Each pixel, however, is only capable of 256 brightness levels (8 bits, for the technically inclined) in each of its color channels. HDR TVs are capable of rendering 1,024 or 4,096 different brightness levels (10 or 12 bits) in each pixel. Today’s HDR TVs are largely also 4K TVs, but there is no particular reason why they couldn’t be 2K instead.
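The resolution and bit-depth arithmetic above is easy to verify. A minimal Python sketch (the helper names are mine, for illustration only):

```python
# Resolution and per-channel brightness arithmetic for 2K, 4K, and HDR.

def pixel_count(width, height):
    return width * height

def brightness_levels(bits):
    # An n-bit channel can represent 2**n distinct brightness levels.
    return 2 ** bits

full_hd = pixel_count(1920, 1080)   # "2K" full HD
uhd_4k = pixel_count(3840, 2160)    # "4K" UHD: doubled in each dimension

print(uhd_4k // full_hd)        # 4 -> four times the spatial resolution
print(brightness_levels(8))     # 256 levels per channel (standard HDTV)
print(brightness_levels(10))    # 1024 levels (10-bit HDR)
print(brightness_levels(12))    # 4096 levels (12-bit HDR)
```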

In a review.com video, LG Electronics Senior Vice President Nandhu Nandhakumar explains how HDR gives a bigger “bang for the buck” for TV viewers.

Brighter Brights and Darker Darks

Videophiles prioritize two considerations in a TV: contrast ratio (the vividness of darks and brights on the screen) and color accuracy (how realistic colors on the screen appear). Most observers agree that a TV that excels at both beats one that merely offers greater resolution, i.e., more pixels; the former just looks more realistic and natural. So even an HD TV with better contrast and color tops a UHD TV with only average contrast and color. HDR TVs provide more detail in both dark and light areas of the screen—thus shadows really look like shadows instead of black holes.
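Contrast ratio is simply peak brightness divided by black level. The sketch below uses illustrative figures, not measurements of any particular set (`contrast_ratio` is a made-up helper):

```python
def contrast_ratio(peak_nits, black_nits):
    # Ratio of the brightest white to the darkest black a display can
    # produce, both measured in nits (candelas per square meter).
    return peak_nits / black_nits

# Illustrative numbers only, not measurements of any specific TV:
standard_hdtv = contrast_ratio(peak_nits=100.0, black_nits=0.1)
hdr_tv = contrast_ratio(peak_nits=1000.0, black_nits=0.05)

print(f"standard: {standard_hdtv:.0f}:1")  # 1000:1
print(f"HDR:      {hdr_tv:.0f}:1")         # 20000:1
```

Note that deepening the blacks raises the ratio just as effectively as brightening the whites, which is why shadow detail matters as much as peak output.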

One part of HDR is something called Wide Color Gamut (WCG). This breakthrough heightens the realism of colors on a TV screen. Historically, TVs were constrained by the limitations of cathode ray tube (CRT) technology, characterized by a picture with “softer” edges and inferior color and focus compared with LCD TVs. (To be fair, it’s acknowledged that CRTs are “sharper than LCDs at other than native resolutions.”) CRTs also cause electrical and moiré-pattern interference and are susceptible to “geometric distortion and screen regulation problems.”
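To put a number on “wider gamut”: the triangle spanned by the red, green, and blue primaries of the UHD/WCG color space (Rec. 2020) covers roughly twice the area on the CIE 1931 chromaticity diagram as the HDTV space (Rec. 709). A Python sketch using the published primary coordinates:

```python
def triangle_area(p1, p2, p3):
    # Shoelace formula for the area of the triangle spanned by
    # three (x, y) chromaticity coordinates.
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# CIE 1931 (x, y) primaries for each standard:
rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # HDTV
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # UHD / WCG

ratio = triangle_area(*rec2020) / triangle_area(*rec709)
print(f"Rec. 2020 gamut is about {ratio:.1f}x the area of Rec. 709")
```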

CRTs are also not as bright as LCDs; you may remember squinting at a CRT TV on the patio or by the pool in broad daylight. And HDR TVs are even brighter than standard LCD HDTVs. To see how much brighter, compare their backlight systems: HDR TVs output 1,000 nits or more, while standard HDTVs and Blu-rays only offer about 100 nits. (A nit is a unit of visible-light intensity, equal to one candela per square meter.) Just as flat screens replaced the “tube,” so will HDR and WCG soon supersede the faded colors and blurred images associated with legacy TVs.

Another part of HDR is what is known as “bit depth.” Although current HDTV technology is capable of ~16.7 million colors (256³ combinations of red, green, and blue), the phenomenon of “banding” still occurs: “bands” are visible stripes of color where a smooth gradient should be. Even though 16.7 million colors is a substantial rainbow, the viewer may still miss subtle shades like the hues of an eggplant or an orchid, making the picture on the screen less than lifelike. HDTV is based on an 8-bit system; UHD and HDR TVs, on the other hand, utilize a 10-bit system. Doing the math with HDR bit depth (1,024³) means we can now see images with over a billion colors.
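Banding can be demonstrated by quantizing a smooth brightness ramp at each bit depth, and the total-color arithmetic falls out of the same powers of two. A short Python sketch:

```python
def quantize(value, bits):
    # Snap a 0.0-1.0 brightness value to the nearest of 2**bits levels.
    levels = 2 ** bits - 1
    return round(value * levels) / levels

ramp = [i / 10000 for i in range(10001)]  # a smooth 0 -> 1 gradient

shades_8bit = len({quantize(v, 8) for v in ramp})
shades_10bit = len({quantize(v, 10) for v in ramp})
print(shades_8bit, shades_10bit)  # 256 vs 1024 distinct steps; coarser
                                  # steps show up on screen as "bands"

print(256 ** 3)    # 16,777,216 colors (~16.7 million, 8 bits x 3 channels)
print(1024 ** 3)   # 1,073,741,824 colors (over a billion, 10 bits x 3)
```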

Contented Yet?

For HDR TVs to perform at their full potential, they require HDR content. As late as last year, HDR content was as scarce as hen’s teeth. Until recently, only Amazon offered a handful of TV shows and Sony-released movies in HDR. Now Netflix is on board, but other streaming services such as Hulu, Showtime, and Sling TV have yet to join the HDR bandwagon. Roku has a search service that looks for HDR movies; Vudu and YouTube have growing catalogs of HDR content. Also, the Ultra HD Blu-ray specification provides for HDR versions of theatrical releases, with the Dolby Vision standard optional. (See Coda below.)

Concerning hardware, your existing high-speed HDMI cable can accommodate HDR. But your media player, e.g., a 4K Blu-ray player, must support HDMI 2.0a to send the required HDR metadata to your TV; the same goes for your digital receiver. Some devices, such as the PS4, have received an HDMI 2.0a firmware update; others will soon follow suit if they haven’t already. If the PS4 serves as a typical example, hardware incompatibility between HDMI 1.4 and 2.0 will not be an issue.

Coda

As with everything tech-related, HDR is subject to standards overseen by alliances and study groups. In fact, HDR comes in two competing formats: HDR10 and Dolby Vision. These rivals are slugging it out much as Betamax and VHS did back in the late 1970s and early 1980s. Manufacturing behemoths Sony and Samsung have cast their lot with HDR10, while Dolby, an established brand dating back decades, is already deeply entrenched in existing home audio and in Hollywood studios’ home video releases.

Luckily, TV manufacturers Vizio and LG accommodate both standards, as do many content providers such as Amazon and Netflix. For those hoping to add Dolby Vision to their smart TV via a simple software update, think again: it is available only as a firmware addition (with related certification and licensing fees). HDR10, on the other hand, is an open standard. Videophiles, take note: according to CNET’s David Katzmaier, “Only Dolby Vision’s is dynamic, capable of changing on a scene-by-scene or even frame-by-frame basis. The HDR10 metadata, on the other hand, is static, so it only applies once to each piece of content.”
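Katzmaier’s static-versus-dynamic distinction can be sketched in a few lines. This is an illustrative model only, not either format’s real data layout (the class and field names are invented):

```python
from dataclasses import dataclass

@dataclass
class HdrMetadata:
    # Invented fields for illustration; real HDR metadata carries more.
    max_luminance_nits: float
    min_luminance_nits: float

# HDR10-style static metadata: one record applied to the whole title.
static = HdrMetadata(max_luminance_nits=1000.0, min_luminance_nits=0.005)

# Dolby Vision-style dynamic metadata: a record per scene (or even frame),
# letting the TV re-tune its tone mapping as the picture changes.
dynamic = [
    HdrMetadata(max_luminance_nits=200.0, min_luminance_nits=0.01),   # dim interior
    HdrMetadata(max_luminance_nits=1000.0, min_luminance_nits=0.05),  # sunlit exterior
]

print(f"1 static record vs {len(dynamic)} scene-level records")
```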