4K Resolution Does Matter – Here’s When

4k resolution televisions are now widely available and potential buyers are wondering if the extra resolution is worth it. In some cases it is, but in most, it’s not. The details below can help you decide.

4K (and 8K) Resolution Defined

The older 1080p HDTV standard has a resolution of 1920×1080 (2.1 million) pixels. The UHD resolutions are multiples of this base 1080p resolution.

4k resolution is named for the approximately 4,000 (4k) pixels that make up the horizontal resolution across the image. More specifically, the resolution is 3840×2160, which gives 8.3 million total pixels – 4 times that of 1080p. (4k is sometimes called 2160p, and is also known as QFHD – Quad Full High Definition.)

8k resolution has about 8,000 horizontal pixels. The resolution is 7680×4320 (33.2 million) pixels, which is 16 times that of 1080p. 8k is also called 4320p.

The ITU and the Consumer Electronics Association have officially dubbed both 4k and 8k resolutions as “Ultra High-Definition”, but to complicate things, these resolutions are also commonly called Ultra HD, UHD, UHDTV, and even Super Hi-Vision.

HDMI 2.0 (or later) is required to fully support the 4k specification. (The older HDMI 1.4 spec has partial 4k support, but is limited to a frame rate of 30 frames per second. But most components with HDMI 1.4 don’t contain the electronics to support 4k resolution, even though the HDMI interface does.)

How to Tell if You Will Notice the Additional Resolution

To be able to detect the additional resolution of 4k (or 8k), the screen must be quite large and you must sit fairly close. So how do you know if your particular setup would benefit? Here’s your answer…

Based on the resolving ability of the human eye, it is possible to estimate when 4k resolution will become apparent. A person with 20/20 vision can resolve 60 pixels per degree, which corresponds to recognizing the letter “E” on the 20/20 line of a Snellen eye chart from 20 feet away. Using the Home Theater Calculator spreadsheet as a base, I created a chart showing, for any given screen size, how close you need to sit to be able to detect some or all of the benefits of a higher resolution screen. (Click the picture below for a larger version.)
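The geometry behind the chart is simple enough to reproduce. The sketch below is my own Python approximation of the spreadsheet's math (not taken from the spreadsheet itself), assuming a 16:9 screen and the 60-pixels-per-degree acuity figure above:

```python
import math

def max_viewing_distance_ft(diagonal_in, horiz_px, aspect=16 / 9):
    """Farthest distance (feet) at which one pixel still subtends
    1 arcminute -- the 60 pixels-per-degree limit of 20/20 vision.
    Sit closer and the extra resolution is visible; farther and it isn't."""
    # screen width from the diagonal, assuming a 16:9 aspect ratio
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_in = width_in / horiz_px
    # distance at which one pixel subtends 1/60 of a degree
    return pixel_in / math.tan(math.radians(1 / 60)) / 12

print(round(max_viewing_distance_ft(84, 3840), 1))  # ~5.5 ft
print(round(max_viewing_distance_ft(55, 3840), 1))  # ~3.6 ft
```

These match the distances quoted in the article for 84-inch and 55-inch 4k screens.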

(If you’re not used to reading charts, just jump to the calculator below.)

What the chart shows is that, for an 84-inch screen, the full benefit of 4k resolution isn’t apparent unless you sit 5.5 feet or closer to the screen. For a “tiny” 55-inch screen, you’ll need to be 3.5 feet or closer. Needless to say, most consumers aren’t going to sit close enough to see any of the extra resolution 4k offers, much less 8k.

It’s important to note that research by Bernard Lechner (former VP of RCA Laboratories) found the average viewing distance of American TV viewers is 9 feet. This is substantially farther than the 5.5 foot distance required to fully resolve normal-sized 4k screens. I don’t imagine people rearranging their living rooms to take advantage of the otherwise unnoticeable UHD resolution benefits.

Verification of Calculations by Sony and THX

Sony lists identical required viewing distances in the Frequently Asked Questions section of their product description. Check out the Amazon.com product description FAQ for the Sony 65X900A 4k Ultra HDTV. It shows the same distances I have calculated (i.e., 3.6 feet for a 55″ screen and 4.2 feet for a 65″ screen). If you don’t believe my numbers, confirmation from Sony should help convince you.

Quote from Sony FAQ:

How close to the TV must I sit to appreciate 4K?

The short answer is that between 5 and 6 ft. is the ideal viewing distance for a 55” or 65” Sony 4K Ultra HD TV. However, on a 55“, you can now sit as close as 3.6 ft and enjoy a visibly smoother and more detailed picture (e.g you won’t see the individual pixels). On a 65“ TV, you can sit as close as 4.2 ft. to appreciate 4K.

On a 50-inch 1080p HD display, most consumers can begin to distinguish individual pixels only when standing within six feet of the screen. Therefore, if your viewing distance is 10 feet or greater, an Ultra HD 50-inch display will likely have little perceived benefit in terms of image clarity and sharpness. [source]

Availability of 4k and 8k Content

If you are among the rare few who have a giant screen and sit close enough to it to benefit from 4k resolution, you still need UHD content. Here’s a summary of your options:

Highest Quality Options (less compression, highest bitrate):

Ultra-HD Blu-ray players and discs become available starting in 2016. This will be the highest-quality offering, with bitrates of up to 128 Mbps, giving the best audio and video quality possible. Though discs don’t offer the convenience of streaming, they will be the best source of 4k video in 2016 and beyond, and the quality of Ultra-HD Blu-ray will likely remain ahead of online streaming options for years to come.

Video download boxes such as the Sony FMP-X1 4K Ultra HD Media Player and the FMP-X10 4k Ultra HD Media Player support 4k. These devices download a limited set of movies from Sony Pictures in 4k resolution to an internal hard drive. Due to the limited amount of content, high price, and low adoption rate, this would seem to have only marginal impact on availability of UHD content.

Kaleidescape Strato Players download full-bitrate 4k movies from the Kaleidescape online movie store. These are identical in quality to Ultra-HD Blu-ray. The company has had some recent financial issues, but appears to be up and running again. The hardware is expensive, but the quality is excellent.

Moderate quality options (more compression, lower bitrate):

The built-in Netflix and/or Amazon Prime Video apps on most 4k smart TVs will play 4k for the few titles they stream in that format. The bit rate is only about 16 Mbps, compared to 48 Mbps for 1080p Blu-ray. What this means is that picture and sound quality are sacrificed in other ways (color depth, contrast ratio, frame rate) to achieve the 4k resolution, so don’t expect perfection.
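To put those bitrates in perspective, a quick back-of-the-envelope calculation (my own, using the round numbers from the text) converts each bitrate into the data size of a two-hour movie:

```python
def movie_size_gb(mbps, hours=2):
    """Approximate data size in gigabytes for a stream running at
    `mbps` megabits per second for `hours` hours."""
    seconds = hours * 3600
    return mbps * seconds / 8 / 1000  # megabits -> megabytes -> gigabytes

print(round(movie_size_gb(16)))   # 4k stream: ~14 GB
print(round(movie_size_gb(48)))   # 1080p Blu-ray: ~43 GB
print(round(movie_size_gb(128)))  # Ultra-HD Blu-ray: ~115 GB
```

A 16 Mbps 4k stream carries roughly a third of the data of a 1080p Blu-ray despite having four times the pixels, which is why something has to give elsewhere in the picture.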

The Microsoft Xbox One and Sony PlayStation 4 (and later versions) have hardware capable of 4k resolutions. Streaming video apps such as Netflix will be able to play 4k on these platforms. However, most games can’t be rendered in full 4k.

The Sony PlayStation 3 can display static 4k pictures (not moving video) using the HDMI 1.4 connection at 24 or 30 Hz refresh rate. This may be worthwhile for photographers, but probably not for anyone else.

Cable and Satellite: Cable and satellite companies are offering some 4k content on their new boxes. The quality is better than their 1080p channels, but it’s still highly compressed compared even to 1080p Blu-ray, substantially lower than Ultra HD Blu-ray, and generally comparable in quality to streaming services.

Roku and NVIDIA Shield both offer models with 4k output and apps that support 4k streaming.

Dubious quality options (upscaling of lower resolution content)

Most 4k UHD TVs advertise the ability to “upscale content to 4k”. The highest-end, stand-alone video processors offer only moderate improvements in quality. The video processors inside HDTVs are generally low-end, offering very little improvement in quality, and can make some up-converted content look worse. Don’t count on video processor upscaling to deliver any significant picture quality improvement.

Conclusion

The benefits of 4k and 8k are marginal. You have to sit unrealistically close to see the full detail and you need 4k source material, which is not readily available. If you use a 4k display as a computer monitor to view high resolution source material, you could benefit. Other than that, save your cash and purchase 1080p instead.

My recommendation for achieving the best picture quality for the lowest price is to focus on contrast ratio and look for these features:

Look for the HDR (High Dynamic Range) feature: HDR adds a much more perceivable picture quality improvement than does higher resolution. HDR increases the contrast ratio between the brightest and darkest regions of the screen, which is the most beneficial thing you can do for image quality. Keep in mind that HDR source material is required for this to work, but I expect this to be much more broadly available because it can be “backwards applied” to existing 1080p content.

Look for OLED instead of LED/LCD: the near infinite contrast ratio of OLED will offer a superior quality image. A 1080p OLED TV will have an overall better picture than a 4k LED/LCD. OLED is more expensive, but the prices are starting to come down.

ISF states that the most important aspects of picture quality are (in order): 1) contrast ratio, 2) color saturation, 3) color accuracy, 4) resolution. Resolution is 4th on the list, so look at other factors first. Also, be sure to calibrate your display! I recommend the following calibration tools.

“Just tell me what resolution HD TV to get”

If you don’t like reading charts and are looking for a quick answer, enter your screen size below to see how close you’ll need to sit to fully appreciate various screen resolutions.

Enter screen size: inches diagonal

For 480p (720×480) resolution, you must sit: ___ feet or closer to see all available detail

For 720p (1280×720) resolution, you must sit: ___ feet or closer to see all available detail

For 1080p (1920×1080) resolution, you must sit: ___ feet or closer to see all available detail

For 4k (3840×2160) resolution, you must sit: ___ feet or closer to see all available detail

For 8k (7680×4320) resolution, you must sit: ___ feet or closer to see all available detail

Note about “or closer” viewing distances calculated above: if you sit closer than the distances shown above, you will be able to see some (but not all) of the detail offered by the next higher resolution.
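The calculator above can be sketched in a few lines of Python. This is an illustrative stand-in for the page's form (my own code, not the site's), applying the same 1-arcminute acuity assumption across the five resolutions:

```python
import math

# horizontal pixel counts for the resolutions in the calculator above
RESOLUTIONS = {"480p": 720, "720p": 1280, "1080p": 1920, "4k": 3840, "8k": 7680}

def distance_table(diagonal_in):
    """For each resolution, the farthest distance (feet) at which all of
    its detail is visible to 20/20 eyes (16:9 screen assumed)."""
    width_in = diagonal_in * (16 / 9) / math.sqrt((16 / 9) ** 2 + 1)
    arcmin = math.tan(math.radians(1 / 60))  # radians-per-pixel budget
    return {name: width_in / px / arcmin / 12 for name, px in RESOLUTIONS.items()}

for name, feet in distance_table(42).items():
    print(f"For {name}, you must sit {feet:.1f} feet or closer")
```

Each doubling of resolution halves the required distance, which is why the 8k figures are so impractically close.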

194 Comments

To Jeff and all home theatre die-hards,
Let’s get one thing straight from the beginning. We can throw around plenty of scientific figures, but at the end of the day I have yet to see any home theatre specialist store give a true 1080p vs. 4k demonstration: a 1080p screen and a 4k screen of the same size and make at the high end of the sale spectrum, with the same brand of cables and player, playing the same Blu-ray movie, so you can witness with your own eyes whether there is any noticeable difference. One other thing to remember: don’t get caught up in what the brands feed you through their TVs! Also, 8k is still to come out, and eliminating any blurring will only be done with OLED. Cheers!

I think the optimum viewing experience is sitting as close as possible before starting to notice the pixels, combined with getting a grand, cinematic, large-screen experience. That’s what this explanation is about: if you can see the pixels, sit six inches further away until you can’t.

I’m a scientist who has been working with digital images since the late 70’s. I find the propagation of errors on the subject of TV screen size amazing. Resolution is not the same as sampling. The sampling theorem (the Nyquist–Shannon theorem) states and proves that to separate two line pairs, twice as many sampling elements are needed.

To verify this, apply the same test you propose to determine the human visual acuity (http://carltonbale.com/visual-acuity-viewing-distance-test-it-for-yourself/) to a TV screen or monitor. You will see that a 1080p screen is unable to resolve 540 horizontal lines. Each pair of lines requires at least 4 pixels to be resolved. The maximum chequered board that can be resolved with a 1080p screen has 480×270 black squares.

In fact the resolution is worse after taking into account that colour reproduction includes adjacent pixels.

In other words, it is wrong to assume that the viewer wants to see the individual pixels on the TV screen (sampling); what is needed is to match the TV resolution (minimum two pixels) to the viewer’s eye resolution.
Therefore the viewing distance diagram above is wrong by a factor of two. This means the optimum distances for matching the resolutions are twice as large.

The article at clarkvision is correct. In order for a display to be limited by human visual acuity pixel spacing must be 0.3 arc minutes apart. That means if you were to print a grid of black-and-white lines spaced 0.6 arc minutes apart (alternating black and white pixels) you’d just be able to perceive them as distinct lines.

Apple’s marketing people made this mistake when they first started promoting their retina iPhones. They claimed 300 pixels per inch at a 1.5-foot viewing distance was the “retina limit” because normal human vision can resolve 300 line pairs per inch at that distance. However, in order to actually draw 300 pairs of lines in an inch you need at least 600 pixels. High-quality photo prints are often done at ~600 pixels per inch for this reason.
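The arithmetic behind these two comments is easy to check. This small sketch (my own, not from either commenter) converts a display's pixel density and viewing distance into pixels per degree of visual angle, which can then be compared against the 60 px/degree single-pixel acuity figure and the 120 px/degree line-pair (Nyquist) figure:

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Angular pixel density a viewer sees: pixels per degree of
    visual angle for a display of `ppi` viewed at `distance_in` inches."""
    deg_per_px = math.degrees(math.atan((1 / ppi) / distance_in))
    return 1 / deg_per_px

# The 300 ppi / 1.5 ft figure from the comment above:
print(round(pixels_per_degree(300, 18)))  # ~94 px/degree
```

At 300 ppi and 18 inches the display delivers roughly 94 pixels per degree: comfortably above the 60 px/degree single-pixel acuity limit, but short of the 120 px/degree needed to render 300 distinct line pairs per inch, which is exactly the commenter's point.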

The thing is, even getting close (i.e., within a factor of two) to that kind of display resolution is pretty much pointless for video playback, where your scenes are going to be limited by things like motion blur, and you’ll practically never be portraying scenes with extremely high contrast.

IMO there’s a far more appreciable difference in picture quality moving from 24 FPS to 48 FPS film (e.g., the latest Hobbit films) than between native 1080p content and 4k content. At 24 FPS, motion blur limits the resolution of most scenes, not the eye and not your display.

Is the world going crazy? Why is there such hostility and resistance to 4k?

When 3D TV came out, I read nothing bad about 3D technology, even though in my opinion it was a total waste of money and a sales gimmick.

I looked at a demonstration of a 4K TV, it looked impressive. When you have a football match, your eye can focus and pick out a face in the crowd. If your eyes zoom into the detail, yes you can see the detail.

I agree with Michael on his remark about picture quality at 24 FPS vs. 48 FPS. I can see motion blur, especially in action shots.

I am no expert, but digital video, be it SD or HD, works by compressing video, and my eyes keep picking up defects in certain places. There must be ‘bugs’ in those compression algorithms. It irritated me so much that I clung onto my old analogue CRT TV for years!

I have stood in front of many expensive flat screens and said “NO” and walked away without buying.

It has nothing to do with hostility or resistance. It has to do with science. It has to do with human optical capabilities. It has to do with expectation bias. It has to do with what you actually see rather than what you believe you see.

I can take 100 people and show them two identical 1080P televisions and label one 4K. I bet that the majority of people tell me that the 4K television has a better picture. I’ve done listening trials before and watched this happen there.

You complain about artifacts from compression algorithms for SD and HD and yet you want to embrace 4K, which requires four times the amount of data. You are kidding yourself if you believe that content providers will just launch more satellites and happily quadruple your bandwidth to accommodate 4K. They will employ even more drastic compression with a likelihood of even worse artifacts that are more visible.

I am someone who loves progress — but I want real progress that can be proven beneficial, not just a way to divert money from products that actually improve the home theater experience (better speakers, better electronics, better acoustical treatments) to ones that just enrich manufacturers.

I agree that 4k is great – if you sit close enough and if the source isn’t overly compressed. Given the general limits to bandwidth (antenna, cable, satellite, streaming), I see no benefit to 4k content when 1080p content is already over-compressed and full of color banding and artifacts. 4k Blu-ray on a giant screen will be noticeable. Over-compressed 4k streams on a small screen viewed from a far distance is worthless…

Hi. The problem I have with 4k or UHD is, firstly, that there is little if any content on disc or in broadcast transmissions. And last but not least, if 4k is to be transmitted via cable and streaming to homes, they would require cables capable of transmitting 100 terabits (250 Blu-ray movies’ worth of data) in 1 second. That is not out there at the moment, and the Australian broadband rollout that is costing us taxpayers billions does not cater for such high transfer rates. So in my eyes I will never see it, not in my lifetime. So what is the other solution? There is none!

Paul, the Ultra HD Blu-ray spec has been published and we’ll start seeing discs and players by the end of 2015. But I don’t think there will be enough movies available to make it a worthwhile investment until at least 2017.

For streaming, the bit rate will be lower than discs, but h.265 compression is more efficient and will lessen the impact on bandwidth. Still, I think the bandwidth would be much better utilized by focusing on 1080p HDR instead of 4k. Better pixels vs. more pixels… and 4k is more pixels than most people need in most situations. So hopefully HDR gets the attention it deserves. It’s included in the new HDMI spec and I think every TV will include it by year end.

I quickly tried your Excel calculator. Using it to get a PPI close to the 530-PPI, 20×13-inch photo used as an example on that site results in an optimum viewing distance of less than half the 20 inches that Clark suggests.

In the past, I fully supported Carlton’s calculations. 4K only matters if you sit ridiculously close to a television — too close to enjoy a theater experience. And I wrote about it in my own Blog: http://awildduck.com/?p=2755

But, this week, I have jumped horse. With 50″ UHD TVs at just $399, all bets are off. Even if you sit close enough only occasionally, I say “Go for it!” My comment about the cost trade off appears below. But separately…

Separately, I am beginning to suspect that a demonstration of visual acuity is not the supreme test of resolution enjoyment. Let’s say that the average person cannot discern a pattern tighter than 0.3 arc minutes. But acutance perception often goes far beyond visual perception as measured by a subtended angle. Acutance is the subjective perception of sharpness related to edge contrast. It is the reason that humans peering through a microscope can distinguish when two hairlines cross paths, even if the width of the converging hairs is considerably below the eye’s resolution limit.

Similarly, it is the basis of the printed vernier bars on calipers. The user looks at the millimeter lines of two slightly different scales (one is stretched by a few percent) and reads the calipers by finding the opposing lines that form one smooth, longer line. The technique yields highly precise and consistent readings, yet the lines are too small to provide an accurate read without the acutance trick.

Just as with edge enhancement in a photo (it makes the image “pop”), I am beginning to believe that 4K may provide tangible benefits beyond the classic observation of a tiny arc angle.
_______________

OK Readers. It is just over 1 year since I advised against buying a 4K television. I said:

I will not spend $1 more to get a TV that goes beyond 1080p
—and—

Thundering bass, contrast and black level are [more germane to] an immersive and a more exhilarating entertainment experience.
Retractions in my Blog are uncommon. I cannot recall the last time that I completely reversed my opinion on an issue, even one that simply reflects personal taste in entertainment gear. But this is one such situation. This week, TigerDirect is selling a 50″ UHD (4K) TV with decent specs for just $399. Since this post will live past the current market, let me point out that this is less than ¼ of the market price last year and about ¼ the cost of a smaller 42″ HDTV just 3 or 4 years ago.

With the advent of cheap and ubiquitous Netflix dongles from Google, Roku and Amazon, I don’t care about so-called Smart TV features. What I do care about is contrast, motion index, sound and black level. If these things are on par with major brands, then we have only one question to face: can the eye discern the tight-grain pixels of a 4k TV? As we discuss above (and as Carlton Bale has explained in detail), very high resolution is only discernible at a close distance…

…But this ridiculously low cost skews my past arguments. Even if you rarely sit close enough to enjoy the additional 6.2 million pixels, I say “Go for it!” At this price, all bets are off.

If TVs are designed to take advantage of it, there can be an advantage to having pixels that are significantly smaller than the resolution limit of the eye.
The line “full benefit of 4k visible” refers to the distance at which you can resolve individual pixels. However, you need at least 120 pixels/degree to be able to display image content with 60 line-pair/degree features (Nyquist theorem). This is perfectly analogous to how CDs have a sample rate of 44.1 kHz so that they can play back audio content with frequencies up to 20 kHz (the hearing limit).
In practice you still see perceptible benefits from pixel densities even beyond the Nyquist limit, because real-world anti-aliasing algorithms are imperfect and can still result in perceptible degradation of edge contrast even at 120 pixels/degree.
With pixels below the resolution limit, dithering techniques can also be used to improve the effective dynamic range of a display. For 1080p content on a 4k TV, four 8-bit pixels ganged into a single superpixel could be used to display amplitude with 10 bits per color.
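The superpixel idea in that last paragraph can be made concrete. The helper below is a hypothetical sketch (not from any actual TV firmware): it splits one 10-bit level across four 8-bit pixels so that their spatial average approximates the higher-precision value:

```python
def superpixel_levels(level_10bit):
    """Map a 10-bit level (0..1023) onto four 8-bit pixels (0..255)
    whose average reproduces the 10-bit value (exact up to 1020,
    after which the 8-bit pixels saturate)."""
    base, extra = divmod(level_10bit, 4)
    # give `extra` of the four pixels one additional count
    return [min(base + (1 if i < extra else 0), 255) for i in range(4)]

print(superpixel_levels(513))  # [129, 128, 128, 128] -> average 128.25
```

Since the four values sum to the original 10-bit level, a viewer far enough away to blend the 2×2 block perceives two extra bits of tonal precision.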

Screens are getting bigger and bigger and we are not sitting farther away. As 4k screens become larger and more affordable we will notice the difference. 4k is becoming more and more viable as costs go down and media content becomes more available. 4k may indeed be our limit: screens can only be so big before they’re too big, and 1080p did not reach that limit.

Well, go to a store that has two 50-inch screens side by side, one at 1080p and one at 4k, and sit between them at a normal viewing distance. In my experience, there was a photo of a newspaper, and in 4k it was much more readable than in 1080p. And I don’t mean near the TV; I mean at a nice couch distance, the difference was big.

I have a fairly large family room, thus my primary viewing distance will be 12 to 13 feet from the screen. I am considering increasing my current 65-inch to a 75-inch. I am a little confused regarding the advantages of 4k’s additional pixels. Will the additional pixels of 4k be advantageous at a 12-to-13-foot viewing distance?

My biggest reason for not bothering with 4k just yet is that there simply isn’t enough media to enjoy it fully. Sure, I could hook it up to my PC for 4k gaming, and perhaps enjoy photos, but as for movies and whatnot… there just isn’t anything worth watching yet. The 4k Blu-ray encoding standard has only just been finalized, so we may start to get something interesting soon, but I think I’ll wait a year or so before jumping on the 4k bandwagon.
Besides, I find that at the distance I sit (7 feet from a 50-inch plasma), contrast and colour balance are far more important to image quality.

I agree. And you have to upgrade everything to get 4k: source device, receiver, display device. Pretty big investment for marginal return. But eventually it may be cheap enough and common enough to justify.

With the Oculus Rift, it’s important to note that the screen might physically be 5 cm from the eye, but optically, it’s much farther away. The eye cups within the headset contain 1.5-inch diameter lenses that “zoom out away” from the screen. I can’t find the exact optical specifications, but the lenses effectively make the screen look like it’s farther away. My guess is that if each eye had 1920×1080 worth of resolution, it would be near the resolution needed for the best picture quality. A 4k screen, with each eye receiving half of those pixels, would give roughly double the resolution of a 1080p screen to each eye and look fantastic. I don’t think there would be any benefit to 8k or higher resolution.

Thank you very much, Carlton. All is short and straight to the point. A reasonable approach always leads to the same conclusion – more is not always better. Though other things being equal (contrast ratio, color saturation, color accuracy, price) the higher resolution won’t be superfluous. Waiting for OLED 4K TVs for the price of $2000 for a 55″ screen. And until this moment hasn’t come, I’ll be happy to watch videos on my plasma TV Panasonic 50″ FullHD 😉

I entered 42 inches into the calculator above. Here is my result: “For 1080p (1920×1080) resolution, you must sit: 5 feet or closer to see all available detail.” This is misleading. If you sit closer than 5 feet the image on the TV will show pixels and the image quality will be terrible and nearly unwatchable.

Another example: with 1080p on a 70-inch TV, you have to be 9 feet away to see all available detail. That means you should NOT sit less than 9 feet from the TV, not “9 feet or less”! If you sit less than 9 feet from a 1080p 70-inch TV you will see pixels. At 70 inches it’s important to get a 4k TV. Why? Because a 4k 70-inch TV will be great from 4 feet to 9 feet. Perhaps you don’t need to sit 4 feet away, but if you did, it would still look good! If you sit closer than 4 feet from a 4k 70-inch TV, you will see pixels! If you want to sit closer than 4 feet, you need a smaller TV anyway.

It is time to upgrade our Sony analog TV. Our room is small, so 55″ is the max size. The Sony 3D 4K Ultra HD TV XBR-55X950B was suggested (is this the newest model?), or is it better to wait a bit for OLED technology?
Please advise re: the best 55″ for now or later. Thanks!

OLED isn’t going to happen, not at any near point in time. Only LG has stayed on the OLED horse for now.
It’s interesting to read Carlton’s blog on this topic; theoretically he’s correct. In practice, he’s not.
I’ve been shooting 4K and now 5K, for nearly a decade. My eyes are “attuned” to it, and it’s what I prefer. Even “crummy” 4K from a cheap Sony F1000V camera is far superior to great HD from a similar-grade palmcorder shooting HD, whether displayed at HD or 4K.
For those who are measurebating (getting excited over numbers): go to a store that has a decent viewing area. View a 4k stream vs. an HD stream on any 4k monitor. Only a blind man could fail to see the difference. Further, and more importantly, 4k is less stressful on the eye and reduces fatigue. It’s closer to what our eyes expect to see in daily life. Forget the retinal pixels per inch resolving micropoints of luminous or chrominous superpixels at subatomic parsecs per gillion.

Ok! Let’s look at this from a different angle. Firstly, OLED has yet to be proven by the test of time and to last the distance against a top-of-the-line Sharp backlit LED; the blue pixel apparently fails earlier than the R&G pixels (and, for those of us lucky enough to have a Sharp screen, the yellow). Secondly, with Sharp releasing an 8k set and Japan now streaming 8k, who in their right mind would go out and buy a 4k anyhow? What a waste of money. Thirdly, there is no content, nothing worth watching in 4k; not even sport, as the world’s main networks don’t telecast past 720p. So why anyone would go to 4k with nothing really out there is beyond me. But this is the real cruncher, and I have left it till last: LED backlighting with local dimming, as lab results on Sharp’s Elite TVs that came out in the States a number of years ago proved, is far superior, with perfect blacks and faultless motion, contrast and colours, even surpassing the Pioneer Kuro plasma TVs. When using top-line equipment like a Cambridge Azur 751R AV receiver and Oppo players, and when calibrated by a professional, you cannot tell the difference. Trust me, as the old saying goes.
So for those who are thinking of spending their hard-earned cash: don’t waste your money. Your eyes cannot tell the difference at the lower end of the sales scale anyhow! From a home theatre enthusiast who reads a lot.

I have little doubt that the brightness and lifespan of OLED will continue to improve over the next 2 years as well. I wouldn’t hesitate to purchase one (if the price was right.) As fast as technology is changing, I don’t think a multi-decade lifespan is as important as it was with CRT TVs 30+ years ago.

I still don’t see a lot of benefit for 4k and definitely not 8k. There’s not enough content available for either, and average screen sizes will have to become substantially bigger (or people will have to start sitting closer) to fully appreciate the detail. Some time in the future, TVs could very well become flexible film displays that are delivered rolled up in a tube, then unrolled and attached to the wall (like a projector screen is today). That would likely dramatically increase average screen sizes by eliminating current size / weight constraints.

I have a 123-inch projector screen, and I think that’s where you start getting to the size where 4k becomes useful. I’ll probably upgrade to a 4k projector (with better contrast ratios, brightness, etc.) at some point in the future, but it isn’t a high priority for me right now.

I’m about to pick up a Samsung 60-inch plasma (still in the box), the F8500. I have been trying to sort through all the choices; very confusing. The plasma is two-year-old technology, but it was top of the line at the time and the picture blows most LEDs away. Am I crazy to make this purchase? I’d love some feedback.

Hi, when you say that a plasma will blow most LEDs away, it’s like asking which supercar is quicker around the Nürburgring in Germany, a Bugatti Veyron or a McLaren P1. There are numerous factors that need to be considered before making such a statement. What brand of LED are you comparing it to? “There are LEDs and there are LEDs.” Have they been calibrated professionally? What cables are they using? What feed or hardware source are you using? It goes on and on…
All I will say is, and I have been in the home theatre industry for over 20 years, since before LED was a twinkle in a home enthusiast’s eye, you need to compare apples with apples.
When you see a professional review in a home theatre mag, they only give you the results on what they receive in the lab. They are giving 5 or 4 stars to products that they received from manufacturers in the last few years; they don’t receive other high-end source products to run these tests with. Keep that in mind also. You will not find one person out there who can categorically say that a plasma is better than an LED, because they would need to be set up side by side, with all of the same testing equipment and the right environment, simultaneously, so the eye can see it first hand. Figures from a magazine, to a point, mean absolutely nothing.
But I will say one thing on a final note: when LG or whoever brings out OLED in the future with 8k, and once they get the lifespan of the blue pixel right, then you will have the mother of all screens. But until then I will stick with my 80″ Japanese-made panel with a 200 Hz refresh rate and the four colours RGBY.

The other thing to keep in mind: if plasmas were that great and so popular, then why have Pioneer (who had the Kuro), Hitachi and Panasonic stopped making them?
Good luck shopping!

Thanks for your insight. All too true. I do not have the money for an OLED at present, and the 4K’s seem to be still being tweaked, though I have seen some impressive detailed brand demos in stores. My thinking is that I should buy 1080 HD now for such a great price, and wait for OLED to be perfected before I drop major cash.

Paul, I am a little shocked someone with ‘20 years in the home theater industry’ is asking why they stopped making plasmas. Simple economics, amigo. They could produce LCD/LED panels for less money, and the demand in the marketplace for what plasma can provide wasn’t there. Higher profit margins in the big-box retailers kept more and more of the focus on cheap LCDs especially. Also, the unfortunately poor reputation of some of the earlier plasmas continued, undeservedly, with the later crop, which (as you know) were quite outstanding, nearly matching or arguably exceeding Kuro-level PQ. We won’t even get into the technical issues of making plasmas thinner, which is what the marketplace was also demanding. Nothing surprising here that hasn’t been discussed or established elsewhere regarding LCD vs. plasma.

Pepe, I don’t need anyone to tell me why plasmas have pretty much left our shores: cheaper LED/LCD imports, the small size of our market, plus the quality of LED overtaking plasma. But I am quite surprised that you would think consumers were after a thinner plasma. As you are aware, plasmas can be wall mounted too. The thickness of a screen is not really a credible reason why plasmas went out of production; it is the size of our market plus cheap Korean imports. I feel sorry for future buyers if Sony decides to pull the pin in this country.
Sharp and Sony will always produce the best quality screens. They have the technical history to prove it. Remember, Sharp designed and introduced LCD to the world market.
Cheers! Paul

Paul, let me be frank. Only a blind man (not necessarily implying you cannot see) would assert that their *similarly priced* LED (not OLED) looks better than either my Panny or Sammy plasma. It’s still no contest. Even my ‘lowly’ F5300 looks better in an average-lit viewing space than most of the crap LEDs people are buying now at Target & Wally World.

You state that nobody wanted thinner plasma panels. That’s hogwash. Just look at the numbers on what people were buying at the time: they wanted a thinner and *lighter* panel which was easier to move, mount, and maneuver. It’s also harder to engineer a plasma to be as thin as an LCD. LCD fit that bill because of (for the reasons you mention) the cheap nature of our throwaway consumer society. We will pay $5 for Starbucks but won’t shell out $1K for a decent HDTV that lasts a long time. Oh well.

Hi. Let’s get one thing straight, Pepe: I always go by the old adage “you get what you pay for”, for starters. Secondly, besides their higher energy costs compared to LCD/LED, plasmas could never match the brightness level when calibrated properly. I have a backlit LED Sharp (Japan-built screen). None of the plasmas on the market at the time, or even now, come close in picture quality. OLED might be the way to go at the high end of the market, but not until they can prove to the consumer that the blue pixel does not die a lot sooner than the red, green, and yellow. I would not touch an OLED with a 40-foot barge pole, as the saying goes. Watch this space.
Cheers!

I’ve never seen an LCD that comes close to plasma/OLED in black-level performance. Similarly, I’ve never seen a plasma/OLED that can reach the same brightness levels as LCDs. The best screen technology is going to depend on the viewing environment and user requirements; there is no absolute best for all situations.

Wide color gamut and HDR are the biggest enhancements to picture quality of the past several years. There’s definitely more picture-quality benefit from these technologies than there is from 4k resolution alone, especially for smaller screens.

After re-reading what you wrote above, I kindly disagree with what you say, at least partially regarding plasma vs LED.

The comparisons have traditionally been between models at similar price points, not a sub-$1K plasma vs a $2500 LED. The idea that somehow professionally calibrating the Vizio or TCL LED might make it perform better than many of the last plasmas that were produced is almost silly. I had plenty of opportunities to compare many and rarely did you find an LED that could come close to even some of the last 1080p Sammies, such as the F5300 & F8500, let alone the last Panny 1080p models. Yes, professional calibration certainly helped some of the Vizios & Sammies I tested look better than out of the box, but in no way did they look as good overall- even with some challenging motion content. YMMV, I suppose.

If the price is right, the 60″ plasma is probably a great option. In general, OLED and Plasma will give the best black levels and ANSI contrast ratios. LED LCD will give the highest brightness. If the TV is going in a sun room, go for LED LCD. If it’s a dark room with not much lighting, go for plasma. If the room is somewhere in between these two extremes, figure out which performance criteria is most important to you and you’ll probably be happy with either option. My guess is that for the price, the 60″ plasma will be a very competitive option.

Stick with the plasma, my friend. I have a Pana 42-inch model a few years old and its image quality is beautifully natural, clear, and sharp, with good blacks and high detail. I can’t believe the latest LED TVs haven’t surpassed this yet, although it’s about 4-5 years old. I would love to buy a 55-inch 4K plasma with HDR but they just don’t exist, which is a pity. The nearest comparison is OLED, which I’m not prepared to pay the high cost of just yet.

One minor aspect of this is color depth, which isn’t covered well in the article. It can be summarized by two specs: Rec. 709 and Rec. 2020. Rec. 709 defines the color space that HD TVs can reproduce. When a movie director digitizes (or shoots digitally) a movie, they adhere as closely as possible to the 709 standard so that, when viewed on a calibrated screen, the viewer sees the same colors the director intended.

The same goes for Rec. 2020; the difference is in what the two cover. Rec. 709 does not have anywhere near the color depth (the number of different color shades that can be reproduced) that Rec. 2020 has. This means that even if you do not get the benefit of the resolution while watching 4K on a smaller screen (a screen smaller than 70″ is of little use if resolution is what you are going for), you will still get the increased color space.

So, when 4K content or even 1080p content with higher color depth becomes available, even on smaller screens the image will look SIGNIFICANTLY better than HD. From the increased color space alone.

Interestingly enough, increasing the color depth also makes the image “clearer” giving the impression of much higher resolution.

All aspects of a TV experience should be considered, and color is generally more important than resolution. Your 4K TV with 4K content at 250Mb/s will blow your HD content out of the water. Even on small screens.
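For what it’s worth, the “number of shades” side of the comments above can be sketched numerically. This is just an illustration (not from the article): the shade count comes from bit depth per channel, with HD/Rec. 709 content typically delivered at 8 bits and Rec. 2020/HDR content commonly at 10 or 12 bits; the function name is mine.

```python
def distinct_colors(bits_per_channel):
    """Total number of distinct RGB triplets at a given bit depth."""
    levels = 2 ** bits_per_channel   # shades per channel (R, G, or B)
    return levels ** 3

print(f"{distinct_colors(8):,}")    # 16,777,216     (8-bit, typical HD delivery)
print(f"{distinct_colors(10):,}")   # 1,073,741,824  (10-bit, e.g. HDR10)
print(f"{distinct_colors(12):,}")   # 68,719,476,736 (12-bit, e.g. Dolby Vision)
```

Going from 8 to 10 bits per channel multiplies the available shades 64-fold, which is where the “billion colors” marketing figure comes from.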

I own the Sharp 80″ with 200Hz refresh rate, backlighting, and Quattron, calibrated by a professional.
Looking at the “viewing distance where resolution becomes noticeable” chart above: for 1080p I would have to be sitting 10′ away from my 80″ screen, and for the maximum benefit of viewing 4K content I would need to be sitting 5′ away. I don’t think so! It would be like sitting in the front-row seats of a normal cinema.
Only with something like a 152″ screen could you enjoy the clarity and contrast of 4K at the same 10′ distance that suits an 80″ at 1080p.
Read the heading above, “Will I be able to notice the additional resolution?”, plus the first paragraph under the chart below it, and you will understand where I am coming from. The human eye can only pick up the extra depth and contrast if sitting around 5′ away from an 80″ screen, which is not practical for any home viewing. Unless you have a small LCD as a computer monitor and are sitting closer than 2′. Then and only then will you notice a difference between 1080p and 4K. Oh, there is one other alternative: having better than 20/20 vision, which is highly unlikely!

So please forget this theory about colour space and depth, because at the end of the day the human eye can only physically pick up a certain amount of detail.
To prove my point, go and ask a home theatre specialist that sells 80″+ sizes to put, say, an OLED/LED with 4K against a Sharp- or Sony-manufactured (high-end model) TV with 1080p, side by side at, say, 13 feet away, with the same content, cables, and player running through them both simultaneously, calibrated by the same video expert. THEN AND ONLY THEN CAN YOU TELL IF THERE IS A DIFFERENCE. But I am a betting man; this will never happen, because the manufacturers want you to buy their products only, so they set up with just the default factory settings. The home theatre specialist will never give you the opportunity to make a comparison. They only want to sell LEDs, OLEDs, and plasmas, not allow the unwary consumer to make comparisons. YOU EITHER KNOW OR YOU DON’T!
From a home theatre enthusiast of over 20 years.
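The chart readings quoted above can be sanity-checked with a quick calculation. This is only an illustrative sketch using the 60 pixels-per-degree acuity figure from the article; the 16:9 aspect ratio assumption and the function name are mine.

```python
import math

def pixels_per_degree(diagonal_in, distance_ft, horiz_pixels):
    """Pixels of horizontal resolution per degree of visual angle,
    for a 16:9 screen of the given diagonal viewed from distance_ft."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # screen width from diagonal
    distance_in = distance_ft * 12
    # Total horizontal visual angle subtended by the screen, in degrees.
    angle_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horiz_pixels / angle_deg

# An 80" 1080p panel at 10 ft lands right at the ~60 px/degree acuity limit:
print(round(pixels_per_degree(80, 10, 1920)))  # ~59
# The same size in 4K requires roughly half the distance to reach that limit:
print(round(pixels_per_degree(80, 5, 3840)))   # ~64
```

Once the result exceeds about 60 px/degree, a viewer with 20/20 vision can no longer resolve individual pixels, which matches the 10′ (1080p) and 5′ (4K) figures quoted for an 80″ screen.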

I think wide color gamut and HDR are huge improvements in picture quality. They offer much more benefit than 4k does. These improvements don’t improve the perception of detail from increased resolution, but they do improve picture quality in other ways.

Hi Carton,
I know you have looked at 4K and 1080p, but put them side by side, like my last paragraph states; then and only then can you get a better perspective on which has better quality close up.
I recently watched Pirates of the Caribbean: On Stranger Tides on Blu-ray, and I can tell you, when you look at those facial shots of Johnny Depp and see the pores and wrinkles on his face, the detail looks amazing. Skin tones and colours, plus depth. How much better than perfect can you get?
Regards

Hi Again Carlton,
Congratulations; this is a great forum site for information.
I agree with your comment about buying a 4K for those who have the money. But the million-dollar question is this: with broadcasters, Blu-ray movie producers, and internet service providers like Netflix not yet having spent the money on providing 4K content, how much money will the consumer have to spend buying into 4K, if at all? I know broadcasters here in Australia are only feeding 720p content; none are doing 1080p! And with the internet, you will find the average consumer struggling to stream 4K content unless they have optical fibre at their doorstep, which would be the minority. The majority of the world’s movie lovers would be in the same boat. Plus, with only a handful of 4K movies available, who is going to spend thousands replacing their 1080p Blu-ray and DVD movie library, especially a collector? I surely won’t be. Not for that marginal increase in close-viewing quality, if any!
Not to mention that the large consumer electronics giants have given no indication of how long it will take to roll out 8K.
On a final note: given the marginal difference between 1080p and 4K, I will wait for 8K,
and that will only be when screens are larger than 100″. Then it might be worth buying. But that is a while away from the mainstream market.
Keep up the great feedback.
Best Regards Paul

Ok, getting ready to buy a new LED flat screen TV. We sit between 12 and 14′ away from where our TV will be located. I was thinking of a 75″ Samsung or Sony.
Question is: should I go with 4K or stay with 1080p?
Currently have a 50″ Pioneer plasma (9+ years old) that we are watching at this distance.

Rod, if you are used to the great Pioneer Kuro PQ, I would seriously consider holding off until you find time to look at a HDR compatible display in 2016. Even at that screen size, your distance is still too far away for the increased resolution to be significant for 99% of the content you will be viewing. HDR is a bit of a game changer IMNSHO, and could provide better PQ for panels at the just-as-effective 1080p resolution.

Hi Rod, I have the 80″ LED backlit Sharp 1080p 940X and I sit 4m from the face of the panel. I tried it at 3m but it was too much strain on my eyes because everything looked so big. But everyone is different. You need to try at least 3m for a 75″, but also look at HD and standard-definition content and make your own mind up over time. Hope it helps. Regards, Paul

One other thing: how long has 1080p been around? For years now. 4K is just new on the market. There are questions about content (Blu-ray etc.), live broadcasts, cables that can carry such content, players that can play and record, plus how it handles fast-motion scenes. Not to mention, with all the 1080p and standard-definition content out there, how does the TV handle content of a lesser resolution or quality? These are all questions that need to be asked and answered before parting with your hard-earned cash.
Hope this helps. Paul

At Best Buy, they had a 4K TV and I saw about 4 times more of the soccer field compared to the 1080p TV. This means the cameras will record more of the field and we will get a greater field of view on a 4K TV! All of the discussions above are pretty much missing the point of what 4K is really about!

Eddy, go into a store and ask them to demonstrate 4K vs 1080p with backlit local dimming (say Sharp or Sony), at say 65″+ screen size, and then you can make a judgement on what detail looks better, and on field of view. You will find no store can accomplish such a demo! They would need to play the same Blu-ray (movie) content simultaneously through the same player using the same quality HDMI cables. It would be near impossible for such a demo to take place. I can tell you from experience, the only reason you would buy a 4K over a 1080p at a screen size of 65″+ would be if you wanted to see the detail in, for example, brickwork in the distance. But in reality, how obvious is that? Otherwise, close facial shots are much the same and not worth paying the extra money.
Your comment about recording more of the field is not strictly correct either. The camera might have a higher resolution, and therefore you might get a smidgen more of the field, but through my eyes, paying for 4K is not really worth it, and not really noticeable. I have witnessed in the past what I described up top, with one resolution next to another; it really is not noticeable. Major companies need to push new technology to maintain their existence.

I watch the majority of my TV via cable (CenturyLink Prism), so the feed is at 1080. My TV room will have me at around 7′ from the TV at the closest. I was thinking that I should buy the LG 55EG9100 based on the reasons you stated above. I went to Best Buy to see one in person; wouldn’t you know, they didn’t have one. The sales guy tried to sell me an E6. He had one with a 4K feed and one with a 1080p feed. Both looked great. He told me the 1080 was being upscaled to 4K. I found 9100s online for about $1500, which is around $2000 less than the E6. Now that the 2016s are available, does your logic above still apply? I’m still leaning toward the 9100. Your input would be greatly appreciated.

I’d definitely recommend going for the lower priced option. Save the $2000, wait 5 years, and buy something substantially better for less than $2000. And enjoy a great (and lower cost) TV in the meantime.

It is interesting to note that with all the advancements in visual displays, the other key ingredient in the experience (audio) has been decades behind in resolution.
I am the inventor of H-CAT (Holographic Cloning Amplifier Technology), which is the only process made for high-resolution audio. With virtually zero harmonic distortion, the natural side effect is 3D holographic (acoustic) imaging. It will soon be used with 3D movies and television. It will be possible to “register” the sound-source location with the projected (3D) object.

It is unprecedented and during demonstrations of full orchestras it leaves a room full of audiophiles weeping and shocked that audio could achieve this level of accuracy.

“4k resolution is named for the approximately 4,000 (4k) vertical lines of resolution it contains. More specifically, the resolution is 3840×2160 (8.3 million) pixels, which is 4 times that of 1080p. (4k is sometimes called 2160p, and is also known as QFHD – Quad Full High Definition.)”

4k has 2160 lines of vertical resolution. Your statement about 8k is wrong for the same reason. Neither 4k nor 8k is named for its vertical lines of resolution.

It is surprising the errors have lived this long, given I see date stamps from 2012.

Robert, the statements are correct, but it can be a little counterintuitive. Vertical lines run from the top of the screen to the bottom, so the number of horizontal pixels across the screen (3,840) defines the number of vertical lines of resolution, and each vertical line is made up of 2,160 pixels. Likewise, there are 2,160 horizontal lines of resolution, each containing 3,840 pixels.

3840 horizontal dots is to the point. “Vertical Lines” or “Horizontal Lines” obfuscates the intent of the message.

Our new display is 3840 x 2. Marketing is saying the 3840 lines of vertical resolution is “still” a 4k display, but at this new low price point this thing will sell like hotcakes. (yes, that was satire)

Carlton,
Knowing this info and applying it to my setup (I’m 10-20 feet from the screen depending on the seat), I would not see a difference in picture quality, sharpness, etc. if I go from my 1080p to a 4K 50″ TV? My 50″ flat-screen TV dominates the room as it is; I can’t go bigger. How about a curved screen?

There is no way that you would notice the resolution increase to 4k for a 50-inch TV from that distance. 4k is not a good reason to upgrade.

I don’t like curved screens because they make viewing from the side more difficult. But they can be better for rejecting light and reflections, especially if you’re sitting directly in the center.

HDR would be the only reason to upgrade. Much more vivid colors. And the black levels and brightness are both better on newer panels. LG or Sony OLED are the best; Samsung Quantum Dot LCD is also very good.

Hi Carlton Bale, firstly, happy new year. I have an 80″ Sharp with 200Hz refresh, backlit. It was calibrated by a professional for all the sources connected to it. In effect, the brightness of the LED went down after calibration, compared to the default factory setting, for better viewing. So no matter what I view, and whatever the source (cable, DVD, Blu-ray, etc.), the picture is perfect. Skin tones are what they should be. Grass looks the way it should, not an oversaturated bright dark green. Blacks are as black as the ace of spades. The reds and the yellows are great, given the Quattron screen. No matter what screen you choose, I highly recommend it be calibrated; then be amazed. This is something no retailer will tell you. So, to answer the question of a 4K upgrade: I would not, when I am getting the perfect picture from my 1080p. I don’t want to see a blade of grass in the distance as a comparison to a 4K. Why would I? Regards, Paul

Paul, I completely agree with you. Calibration makes a huge difference. There are some calibration discs listed in the article that are a great place to start. Professional calibration is the next level above that.

Look for someone who is ISF (Imaging Science Foundation) certified. There should be a “search for local certified calibrator” on their website. You can also check with local home theater installation companies.

It depends how close you’re going to be to it. For a laptop screen, you’re typically closer and can benefit from 4k on a 13-inch screen. For external monitors, the distance is farther and I recommend 24 inches minimum. I personally use a Dell 27-inch 4k monitor.

Interesting to see this conversation morph over time. I have a dedicated room with no external light sources when the doors are closed, and the front-row seating is 14′ from a 133″ screen, with the projector ceiling-mounted 12.5 feet from the screen. I bought 1080p primarily because I understood two things: first, a 4k projector would cost at least 3x what a decent 1080p projector would cost; second, all the sources I read told me my eyes would not be able to discern a difference even from my front-row seats (the back row is 17 feet away, so even less likely from there). Given all the source-material discussion points, and knowing my wife won’t buy Blu-ray unless I specifically request it, I am mostly watching compressed sources. I did a calibration, and despite all the pundits saying it wouldn’t make a difference on a $1k projector, they were very wrong. Getting the black level sorted out made a huge difference when viewing Blu-ray sources, particularly in darker scenes or on dark objects in brighter scenes.

I am glad I found this discussion (the Di2 stuff brought me here…) and from what I can tell, even though I have 20/12 vision, I will be very unlikely to see a distinct difference with a 4k (or 8k) projector in my environment. So I’ll spend that money on an Anthem receiver instead. I bet I’ll be happier.

This article is full of errors. I have a Ph.D. in vision and have been a consultant for one of the top TV companies. The notion that people can resolve 60 pixels per degree is dead wrong, as others have said. What matters is resolution acuity, not maximum resolvable angle. Humans can resolve 60 cycles per degree *under optimal conditions*, where each cycle is a pair of on/off pixels. Think of vertical stripes, each one pixel wide, alternating white and black. A 4K set has 1,920 such cycles across its width. At 60 cycles/degree, the resolution limit is reached when the image is 1920/60 = 32 degrees wide. For a 55-inch diagonal (which is 48 inches, or 4 feet, wide), that limit is at about 7 feet, because the visual angle works out to 2*arctan(4 / (2*7)) ≈ 32 degrees. Sit farther away and any benefit of 4K is lost. This is straight visual physiology and not a debatable point, despite what many misinformed commentaries on the internet try to claim.

HOWEVER: the acuity limit is 60 cycles/degree only under optimal conditions, and optimal conditions don’t often occur in natural viewing. Optimal conditions mean maximum contrast, no background clutter, and no motion. In fact, real pixels live in an image whose background lowers contrast and creates masking clutter, and when images move, acuity drops further. Just to show the effect: if real viewing lowers the acuity limit to 50 cycles/degree, then 4K offers no benefit beyond a viewing distance of only about 6 feet. This is probably closer to the real limit.

I won’t even get into the contrast and color BS about TVs. Theoretically, the eye can only see about a million colors under laboratory conditions. In natural conditions, the number is closer to 50K-100K, and even the best TVs can’t produce a large number of these. The new notion that TVs can create a billion colors is utter nonsense. Same goes for contrast: people don’t see much more than 1000:1 even under ideal conditions.
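The distance limit described in the comment above can be written as a small function. This is only a sketch of that reasoning, assuming a 16:9 panel and treating the acuity in cycles per degree as a parameter; the function name and defaults are mine.

```python
import math

def max_useful_distance_ft(diagonal_in, horiz_pixels=3840,
                           cycles_per_degree=60):
    """Farthest viewing distance (feet) at which a viewer with the
    given acuity can still resolve the panel's full resolution."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 width from diagonal
    cycles = horiz_pixels / 2          # one cycle = a black/white pixel pair
    angle_deg = cycles / cycles_per_degree   # required total visual angle
    half_angle = math.radians(angle_deg / 2)
    distance_in = (width_in / 2) / math.tan(half_angle)
    return distance_in / 12

# 55" 4K panel, optimal 60 cycles/degree acuity: benefit gone past ~7 ft.
print(round(max_useful_distance_ft(55), 1))
# More realistic 50 cycles/degree acuity: benefit gone past ~5.7 ft.
print(round(max_useful_distance_ft(55, cycles_per_degree=50), 1))
```

Lowering the acuity parameter pulls the useful distance in, which is the commenter's point about real-world viewing conditions.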

Hi, very interesting article. If a camera sensor or a monitor has 4K resolution, that is the size in pixels of the resulting images. Image size is also often given in megapixels, a designation especially common when specifying the resolution of digital camera sensors. A brief note about refresh rate, which applies to screens of any resolution: what does it give you? Roughly speaking, and simplified, it is the flicker rate of your screen and the associated delay between image frames. The lower the rate, the worse the change of frames is perceived: motion becomes “ragged”. Conversely, the higher the rate, the smaller the pause between frames, and the smoother the image appears.