Television resolutions have become a moving target in the last ten years: every time a consumer decides to jump in with both feet and buy what appears to be the latest model, better screens seem to appear on the shelves within weeks. TVs took decades to go from standard to high-definition resolution. Only a few years have gone by, and now “Ultra HD” is the new gold standard.

At CES 2012, a few companies showed “4K” displays with four times the pixel count of a full HD display: 2160 lines of vertical resolution (usually 3840x2160) versus full HD’s 1080 lines (usually 1920x1080).
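The arithmetic behind the “four times the pixels” claim checks out; here’s a quick back-of-the-envelope sanity check (a throwaway Python snippet, with the common consumer resolutions plugged in as assumptions):

full_hd = 1920 * 1080        # 2,073,600 pixels
ultra_hd_4k = 3840 * 2160    # 8,294,400 pixels
print(ultra_hd_4k / full_hd) # exactly 4.0, i.e. four times the pixels of full HD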

This year, all companies seem to have transitioned to showing “Ultra HD” displays instead of “4K” ones. Where did 4K go? Why are we back to describing displays in terms of HD?

Ultra HD, like vanilla HD, is a term defined by the International Telecommunication Union (ITU), a body that has existed since 1865 and, as an agency for the United Nations, acts as the allocator for global radio spectrum. One of the ITU’s sectors sets standards in areas like networking, signaling protocols, and telecommunications (which includes television resolutions).


The terms “Ultra HD” and 4K have co-existed for some time. The first Ultra HD prototype was developed by NHK Science and Technical Research Laboratories in Japan (the same lab that developed HD) back in 2003, for which they had to create a special camera to make sufficiently detailed footage. But just as the term “HD” before it technically covers both 720p and 1080p-resolution screens, “Ultra HD” describes two resolutions: 4K, or 2160p, as well as 8K, or 4320p, which is visually detailed enough to compare to IMAX.
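To line up the four tiers those ITU terms cover, here is a rough side-by-side comparison (again a Python sketch, using the standard consumer pixel dimensions for each format):

# Total pixels for the common HD and Ultra HD formats
formats = {
    "720p (HD)": 1280 * 720,             #    921,600
    "1080p (full HD)": 1920 * 1080,      #  2,073,600
    "2160p (4K Ultra HD)": 3840 * 2160,  #  8,294,400
    "4320p (8K Ultra HD)": 7680 * 4320,  # 33,177,600
}
for name, pixels in formats.items():
    print(f"{name}: {pixels:,} pixels")

Each step from 1080p upward quadruples the pixel count of the tier below it, which helps explain the IMAX comparison above.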

It’s hard to say definitively why nearly all of the major TV manufacturers (LG, Samsung, Sony) switched from calling their TVs “4K” to calling them “Ultra HD,” but we have some guesses. As 4K makes the transition from “wildly expensive product of the distant future” to “regular consumer purchase,” the term may be a bit hard to parse for the average customer.

For instance, if one is asked whether he’s in the market for an HDTV or a 4K TV, 4K sounds like overkill—he’ll just go with HD, thanks for asking. A customer who is asked to decide between an HDTV and an Ultra HDTV is made to feel like there’s a cohesive, continuous transition between the two resolutions. One is simply a natural evolution of the other.

We hate to see a level of informational detail lost to marketing, especially since we already know what it’s like to deal with the ambiguity of the term “HD” and whether it refers to 720p or 1080p. Some manufacturers refer to 1080p as “full HD,” but that usage still isn’t consistent or abundantly clear. And when 8K displays roll around, here’s hoping no one burdens consumers with the term “full Ultra HD.”

4K, or Ultra HD, or whatever you want to call it, still faces the same problems that plagued HD when it first hit store shelves: first, there’s little to no content for such displays (and that content, when it does become more prevalent, will be expensive to create and distribute due to the sheer amount of information packed into 4K video). Second, the displays remain wildly expensive: LG’s 84-inch 4K monster is priced at $19,999. Since we can’t yet afford such finery, we’ve stocked up all we can on mental and digital images of these displays at CES while we await the price war.

Promoted Comments

Plain and simple, I assume the names have been focus group tested, and "Ultra HD" tests substantially better than "4k". Not surprising really, and I think the article nails it:

Customer: What is that?
Salesmonkey: It's a 4k TV!
Customer: What does the 4k mean? Is that the price?
Salesmonkey: Ha hah, you wish! No, it's the TV resolution, which is 2160p, also defined as...
Customer: zzzzzz, I'm sorry, I must have nodded off there.

Or

Customer: What is that?
Salesmonkey: It's an Ultra HD TV!
Customer: Sweet, I like my things in ultra!

Still, until these beasts start coming down to just a bit more than normal HDTV prices, I don't think they're going anywhere. I think the transition will look more like the one from DVDs to Blu-rays than the one from VHS to DVDs, for the same reasoning: the discernible quality difference between VHS and DVD, or between standard-definition TVs and HDTVs, is huge compared with the next leap up to Blu-ray and 4k TV. So I don't think many people will see a need to upgrade quickly. Plus, as one of the people who spent $1,400 on a new TV a few years back, I'm in NO hurry to go buy the new shiny at 10x that cost. Let me get 10 years out of that thing (I hope), and I'll get back to you.

I think the term 4K is confusing. Until very recently, I thought that 4K meant a resolution of around 4,000 lines. Since we always refer to current resolutions by the number of lines (720p, 1080p), I simply assumed 4K also referred to the number of lines, not to four times the pixel area, which is actually the case.