Television resolutions have become a moving target in the last ten years—every time a consumer decides to jump in with both feet and buy what appears to be the latest model, better screens show up on the shelves within weeks. TVs took decades to go from standard to high-definition resolution. Only a few years have gone by, and now “Ultra HD” is the new benchmark.

At CES 2012, a few companies showed “4K” displays, which have four times the pixel count of a full HD display: 2160 lines of resolution (usually 3840x2160) versus 1080 lines (usually 1920x1080).
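The “four times” figure is easy to verify with the common consumer dimensions above; a minimal sketch:

```python
# Quick check of the "four times the pixel count" claim, using the
# common consumer dimensions cited above.
full_hd = 1920 * 1080    # 2,073,600 pixels
ultra_hd = 3840 * 2160   # 8,294,400 pixels

print(ultra_hd / full_hd)  # 4.0 -- exactly four times the pixels
```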

This year, all companies seem to have transitioned to showing “Ultra HD” displays instead of “4K” ones. Where did 4K go? Why are we back to describing displays in terms of HD?

Ultra HD, like vanilla HD, is a term defined by the International Telecommunication Union (ITU), a body that has existed since 1865 and, as an agency for the United Nations, acts as the allocator for global radio spectrum. One of the ITU’s sectors sets standards in areas like networking, signaling protocols, and telecommunications (which includes television resolutions).

The terms “Ultra HD” and 4K have co-existed for some time. The first Ultra HD prototype was developed by NHK Science and Technical Research Laboratories in Japan (the same lab that developed HD) back in 2003, for which they had to create a special camera to make sufficiently detailed footage. But just as the term “HD” before it technically covers both 720p and 1080p-resolution screens, “Ultra HD” describes two resolutions: 4K, or 2160p, as well as 8K, or 4320p, which is visually detailed enough to compare to IMAX.
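For reference, the tiers mentioned here map to these common pixel dimensions (a quick summary of the consumer formats, not an exhaustive list of broadcast variants):

```python
# Common consumer pixel dimensions for each tier discussed above.
resolution_tiers = {
    "HD (720p)":             (1280, 720),
    "Full HD (1080p)":       (1920, 1080),
    "Ultra HD / 4K (2160p)": (3840, 2160),
    "Ultra HD / 8K (4320p)": (7680, 4320),
}

for name, (w, h) in resolution_tiers.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")
```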

It’s hard to say definitively why nearly all of the major TV manufacturers (LG, Samsung, Sony) switched from calling their TVs 4K to calling them Ultra HD, but we have some guesses. As 4K makes the transition from “wildly expensive product of the distant future” to “regular consumer purchase,” the term may be a bit hard to parse for the average customer.

For instance, if one is asked whether he’s in the market for an HDTV or a 4K TV, 4K sounds like overkill—he’ll just go with HD, thanks for asking. A customer who is asked to decide between an HDTV and an Ultra HDTV is made to feel like there’s a cohesive, continuous transition between the two resolutions. One is simply a natural evolution of the other.

We hate to see a level of informational detail lost to marketing, especially since we already know what it’s like to deal with the ambiguity of the term “HD” and whether it refers to 720p or 1080p. Some manufacturers refer to 1080p as “full HD,” but its usage is still not consistent or abundantly clear. And when 8K displays roll around, here’s hoping no one burdens consumers with the term “full Ultra HD.”

4K, or Ultra HD, or whatever you want to call it, still faces the same problems that plagued HD when it first permeated shelves: first, there’s little to no content for such displays (and that content, when it does become more prevalent, will be expensive to create and distribute due to the sheer amount of information packed into 4K video). Second, the displays remain wildly expensive: LG’s 84-inch 4K monster is priced at $19,999. Since we can’t yet afford such finery, we’ve stocked up all we can on mental and digital images of these displays at CES while we await the price war.

Promoted Comments

Plain and simple, I assume the names have been focus group tested, and "Ultra HD" tests substantially better than "4k". Not surprising really, and I think the article nails it:

Customer: What is that?
Salesmonkey: It's a 4k TV!
Customer: What does the 4k mean? Is that the price?
Salesmonkey: Ha hah, you wish! No, it's the TV resolution, which is 2160p, also defined as...
Customer: zzzzzz, I'm sorry, I must have nodded off there.

Or

Customer: What is that?
Salesmonkey: It's an Ultra HD TV!
Customer: Sweet, I like my things in ultra!

Still, until these beasts start coming down to just a bit more than normal HDTV prices, I don't think they're going anywhere. I think the transition will look more like DVDs to Blu-rays than VHS to DVDs, for the same reasoning. The discernible quality difference between VHS and DVDs, or standard definition TVs versus HDTVs, is huge compared with the next leap up to Blu-ray and 4k TV. So I don't think many people will see a need to upgrade quickly. Plus, as one of the people who spent $1,400 on a new TV a few years back, I'm in NO hurry to go buy the new shiny at 10x that cost. Let me get 10 years out of that thing (I hope), and I'll get back to you.

I think the term 4K is confusing. Until very recently, I thought that 4K meant around 4,000 lines of resolution. Since we always refer to current resolutions by the number of lines (720p, 1080p), I simply assumed it also referred to the number of lines and not to four times the pixel area, as is actually the case.

It’s hard to say definitively why nearly all of the major TV manufacturers (LG, Samsung, Sony) switched from calling their TVs 4K to calling them Ultra HD, but we have some guesses. As 4K makes the transition from “wildly expensive product of the distant future” to “regular consumer purchase,” the term may be a bit hard to parse for the average customer.

Guess? They invested a crap load of money beating the "HD" term into various human languages. They can start fresh or pick a name that is obvious to consumers and already has marketing value.

I would suspect that marketing is the answer to the question. TV manufacturers don't want consumers to see a "4K TV" and an "8K TV" in the future, they want something that is "Ultra HD" which implies it's better than HD but related. It also provides some ambiguity to help sell what will eventually be 'lower end' 4K sets when 8K sets become the norm. "Hey, they are both Ultra HD!"

Damn near impossible to see a difference between 1080p and 4k without having a large set and sitting far closer than you'd ever comfortably want to at home. When you're pushing the limits of what the human eye is capable of detecting, I think it's time to pack it in.

We need more than just higher resolutions for TVs. The Hobbit is being shown at 48FPS, while Avatar sequels will be 60FPS. People *can* see the differences and many will prefer the higher frame rates on their programming.

Whatever the updated delivery mechanism (physical or streaming) to these higher resolutions should also include higher frame rate possibilities as well.
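To put rough numbers behind that, here's a sketch of how resolution and frame rate multiply the raw, uncompressed data rate (assuming 8 bits per channel and three channels; real codecs compress this dramatically, but the ratios carry over to delivery requirements):

```python
# Rough uncompressed video data rates (8 bits x 3 channels per pixel, no
# chroma subsampling, no compression) -- an illustration of scaling only.
def raw_mbps(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps * 8 / 1e6  # megabits/second

print(raw_mbps(1920, 1080, 24))  # ~1,194 Mbps for 1080p24
print(raw_mbps(1920, 1080, 60))  # ~2,986 Mbps -- 2.5x from frame rate alone
print(raw_mbps(3840, 2160, 60))  # ~11,944 Mbps -- another 4x from resolution
```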


What will they call the next standard after that? Super Awesome Exclusive Deluxe Limited Edition Ultra HD?

Not quite. Industry standards call for one added descriptor/adjective per generation. Next one would be the Limited Edition Ultra HD, followed by the Deluxe Limited Edition Ultra HD, and so forth. The TV you described would be an eighth generation unit if factored for Standard Definition and HD.

Nitpick, 8K is NOT 7680p, it's 4320p, in other words: 7680x4320. (Since 1920x1080 is 1080p, etc.) I was doing the math in my head and realized that 8k is either badly named, or something else is up (7680p, at 16x9 aspect ratio, would be ~13654x7680, or 12 times the pixels of 4k, rather than 4x the pixels).
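The arithmetic behind that nitpick checks out; a quick sketch (the "7680p" frame is hypothetical, just to show how far off the name would be):

```python
# 8K is 7680x4320 -- four times the pixels of 4K. A hypothetical "7680p"
# frame at 16:9 would be something else entirely.
four_k = 3840 * 2160                       # 8,294,400 pixels
eight_k = 7680 * 4320                      # 33,177,600 pixels
print(eight_k / four_k)                    # 4.0

hypothetical_7680p = round(7680 * 16 / 9) * 7680   # ~13653 x 7680
print(hypothetical_7680p / four_k)         # ~12.6 -- roughly the "12 times" cited
```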

Isn't the goal to not see the pixels? There is a distance beyond which you will not see the difference between 720 and 1080, and likewise a distance beyond which you will not see the difference between 1080 and 4K, but that's not about image quality, it's about getting value for expense.

We have the TV too far away, which has the result that the viewed image is smaller as a proportion of field of view, but it looks clear enough.
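One way to put numbers on "a distance beyond which you will not see the difference" is the common rule of thumb that 20/20 vision resolves about one arcminute of detail. A rough sketch under that assumption (the 60-inch screen size is just an example, and this is a small-angle approximation, not a vision-science result):

```python
import math

# Rough estimate of the farthest distance at which adjacent pixels are still
# distinguishable, assuming ~1 arcminute of resolving power (20/20 vision).
# Beyond this distance, extra resolution is hard to see.
def max_useful_distance_ft(diagonal_in, horizontal_pixels, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width in inches
    pixel_pitch_in = width_in / horizontal_pixels   # size of one pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch_in / one_arcminute / 12      # small-angle approx, in feet

print(max_useful_distance_ft(60, 1920))  # ~7.8 ft for a 60" 1080p set
print(max_useful_distance_ft(60, 3840))  # ~3.9 ft for the same size at 4K
```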

We shouldn't bump up the resolution again until we can trivially capture, store, transmit, and display 1080p at 4:4:4, with lossless compression.

What's the point of increasing the resolution of a noisy luminance channel while crushing the ever living hell out of the chrominance channel?

It'd be so awesome to see a nature documentary where any frame you pause on (where the camera isn't moving) is camera quality (allowing for the fact that the video camera is doing autofocus and isn't optimizing exposure/depth of field like a photographer would).

As is, the best Blu-ray, when paused, looks like you took a JPEG and compressed it four more times.
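For a sense of what the 4:4:4 complaint above is about, here's a rough sketch of raw per-frame data under full chroma versus the 4:2:0 subsampling most consumer video uses (8 bits per sample assumed; real codecs are far more complicated, but the ratio is the point):

```python
# Raw per-frame sample data for 1080p under different chroma subsampling
# schemes (one byte per sample assumed) -- showing how much color
# information 4:2:0 discards relative to 4:4:4.
width, height = 1920, 1080
luma = width * height  # Y samples: always full resolution

def frame_bytes(chroma_fraction):
    # chroma_fraction: chroma samples per luma sample, per chroma channel
    return luma + 2 * luma * chroma_fraction

print(frame_bytes(1.0))   # 4:4:4 -> 6,220,800 bytes per frame
print(frame_bytes(0.25))  # 4:2:0 -> 3,110,400 bytes per frame (half the data)
```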

Awesome. No one really cares, because there is no content, and the ridiculous re-tooling of all our infrastructure is just not going to happen nearly as fast as the ITU (or whomever) is bumping TV set resolutions.....

Awesome. No one really cares, because there is no content, and the ridiculous re-tooling of all our infrastructure is just not going to happen nearly as fast as the ITU (or whomever) is bumping TV set resolutions.....

There is content but not from the major studios and usual providers yet.

There is no such thing as "enough" when it comes to screen size or resolution. No matter how big it is, your TV is not big enough until it fills your vision, and then it's just time to move further away. At 6' away, 60" is too small. As for the ITU, allocators gonna allocate.....

Damn near impossible to see a difference between 1080p and 4k without having a large set and sitting far closer than you'd ever comfortably want to at home. When you're pushing the limits of what the human eye is capable of detecting, I think it's time to pack it in.

There is some weight to that sentiment but I don't think it's entirely true since it mostly depends on the content. While video will likely see small benefit to most users, the jump is quite significant when you're dealing with vectors and computer generated images like you would see in console gaming. Under those scenarios, 4k resolution offers the same benefit as the 'retina' displays in current Apple products. It's effectively 4x SSAA without the downsampling.

I would agree that televisions aren't the prime candidate for such resolutions but they make the most sense from the marketing perspective of a panel manufacturer. The best actual application lies in those computer displays that will follow in a few years.

Isn't the goal to not see the pixels? There is a distance beyond which you will not see the difference between 720 and 1080, and likewise a distance beyond which you will not see the difference between 1080 and 4K, but that's not about image quality, it's about getting value for expense.

We have the TV too far away, which has the result that the viewed image is smaller as a proportion of field of view, but it looks clear enough.

That's what I mean. In order to see a noticeable decrease in quality on my TV, I need to sit closer than is comfortable. My seat is about 7-8 feet from my screen, but it would need to be about 5-6 feet away before the image degrades. An Ultra-HD screen would need to be absolutely massive in order to justify the number of pixels when sitting 8 feet away.

When do we get uncompressed HD? Most "HD" stations look like utter crap. I shouldn't be seeing artifacts at the price they charge for cable/fios.

Just because a lot of video is compressed too much doesn't mean getting rid of compression is the answer. Cranking up the bit rate will improve quality significantly, but after a certain point the returns are extremely diminished. Getting rid of compression entirely might then improve perceived quality from 99.99% to 100% (if at all, given digital noise at the sensor level), but it would explode the file size so much that it just isn't a good deal. There are always better uses for bandwidth.
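To illustrate the "explode the file size" point with rough numbers (the bitrates below are ballpark illustrations, not any particular service's figures, and the uncompressed case assumes 8-bit 4:2:0 at 24 fps):

```python
# Ballpark sizes for a two-hour 1080p24 movie at different bitrates, versus
# fully uncompressed -- showing why "no compression" isn't a practical goal.
seconds = 2 * 60 * 60

def gigabytes(mbps):
    return mbps * 1e6 * seconds / 8 / 1e9

print(gigabytes(5))    # ~4.5 GB  -- typical streaming-quality bitrate
print(gigabytes(30))   # ~27 GB   -- high-bitrate Blu-ray territory

uncompressed_mbps = 1920 * 1080 * 1.5 * 8 * 24 / 1e6  # 8-bit 4:2:0, 24 fps
print(gigabytes(uncompressed_mbps))  # ~537 GB uncompressed
```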

We need more than just higher resolutions for TVs. The Hobbit is being shown at 48FPS, while Avatar sequels will be 60FPS. People *can* see the differences and many will prefer the higher frame rates on their programming.

Whatever the updated delivery mechanism (physical or streaming) to these higher resolutions should also include higher frame rate possibilities as well.

I thought that human eyes were not able to see the difference at frame rates over 25-30 fps? In the case of the Hobbit, I think the 48 fps is for the 3D.

That's mostly myths and misconceptions, as the human eye doesn't really see in frames per second. Research I remember reading years ago found that eyes could adjust to almost any frame rate they were given, ignoring frames as needed. The human eye doesn't visually register every frame, as far as I understand; it might see frame 1 and frame 5 while the brain fills in the frames in between. That gives the appearance of a smooth moving object.

Increasing the frame rate past a point seems to have a hyper-realistic and almost jarring effect on some people watching the new Hobbit. I would speculate it's something like your eyes seeing more frames than they want: while they're trying to simulate the frames in between, they're also seeing the real frames, and the two aren't matching up.

I've only done casual reading about the subject, and it was mostly a while ago, but I actually did some tests myself at the time. After playing some first-person shooters at 120 frames per second for a month or two and then switching down to 60 frames per second, I noticed a difference visually, but it's hard to explain; it's like your eyes are trying to re-figure out how to simulate the frames in between since they aren't used to it anymore.

(Edit) I wish I remembered the sites I read at the time; it was always a huge debate in the first-person shooter communities, especially Half-Life and Counter-Strike. I ran a few dedicated Counter-Strike 1.x servers and we talked about it a lot, with some saying it doesn't matter past 30 or 60, and some saying they notice a difference even between 120 and 140 frames per second. Human eyes are crazy things.

Historically, video worked in "lines" and used to be interlaced. Hence, the HD standard named resolutions as "lines" and added "i" or "p": 720p, 1080i, 1080p. Nobody cared that 720p was 1280x720 and 1080p was 1920x1080. That's because on analog video the number of lines was fixed, but the horizontal resolution was defined by the bandwidth of the (recording/transmission) system up to a theoretical maximum.

Graphics resolutions are measured as an array of pixels, and the longest dimension (the width) was selected for marketing monikers: 1K (1024x768), 2K (2048x1536), 4K (4096x3072), all in 4:3 aspect ratio.

Somehow, when the DCI set about defining the resolutions for Digital Cinema, they decided to use the VESA widths (2048, 4096) and the broadcast heights (1080, 2160). This way they got an aspect ratio very close to the regular 35mm flat widescreen (1.89 is very close to 1.85) and easy conversion of broadcast material to Digital Cinema (just put very small pillars on the sides).
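The aspect-ratio arithmetic in that last point is easy to verify with the dimensions mentioned above:

```python
# DCI widths over broadcast heights land very close to the 1.85:1
# "flat" widescreen cinema ratio, while broadcast Ultra HD stays 16:9.
print(2048 / 1080)  # ~1.896 (2K DCI)
print(4096 / 2160)  # ~1.896 (4K DCI)
print(16 / 9)       # ~1.778 (broadcast Ultra HD)
# Standard flat widescreen for comparison: 1.85
```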

Another case of TV manufacturers trying to sell us features we don't need so they can keep prices high. Sadly for them, the added value to us keeps getting lower and lower. We just got switched over to "full HD" sets with 1080p, so 4k has quite a wait before consumers care about it.

Why are consumer electronics companies trying to move away from the "4k" terminology to "Ultra HD"? For the same reason Apple calls their screens "Retina Displays" and not "326 Pixels Per Inch Displays."

I predict 4K or ultra HD or whatever else it is called will never catch on. Very few people want their living rooms dominated by a monstrous screen. And even fewer people watch a movie and focus on the weave of the leading lady's knickers or the grain on the protagonist's leather sofa.