Archive for July, 2015

I’m prompted to express my view on this topic by the number of absurd comments I’ve read in various forums over the past few years. As a brief background, Apple introduced the iPhone 4 in 2010 with a screen pixel density that far exceeded anything offered by the competition. Since products with this feature had a competitive advantage, Apple created a marketing term, “Retina Display”, to qualify its own products that met the necessary criteria. An industry expert offered an opinion on Apple’s claims, which in turn prompted a misinformed member of the tech press to challenge them. This led to even more misinformed comments in technical article forums. This article attempts to explore the topic in more detail.

“Retina Display is a marketing term developed by Apple to refer to devices and monitors that have a resolution and pixel density so high – roughly 300 or more pixels per inch – that a person is unable to discern the individual pixels at a normal viewing distance.”

Apple’s own support FAQ on the topic explains it as having “a pixel density that is so high, your eyes can’t discern individual pixels at a normal viewing distance.”

How do we evaluate Apple’s claims and critics’ challenges?
In order to evaluate criticism or challenges, it is important to understand a few key items.

1. Apple’s marketing term
First and foremost, the term “Retina Display” is a marketing term invented by Apple, and likewise it can mean whatever Apple wants it to mean. Some people see the word “Retina” and feel the need to test the term against the absolute limits of every retina that exists in every human. That’s not what the term means.

2. Retina is a relationship between pixel density and viewing distance
Visual acuity is measured as an angle. Specifically, 20/20 vision is defined as the ability to discern detail of 1 arc minute. As such, we discern more detail as we get closer to an object and less detail as we move further away. This is why a phone requires a significantly higher pixel density than an HDTV in order to qualify as a “Retina Display”: we view our phones (at distances measured in inches) much closer to our eyes than we view large-screen HDTV sets (at distances measured in feet).
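This relationship is easy to check numerically. Below is a minimal sketch (my own arithmetic, not Apple’s published math) that converts a viewing distance into the pixel density at which a viewer with 20/20 vision can no longer resolve individual pixels:

```python
import math

def retina_ppi(viewing_distance_inches, arc_minutes=1.0):
    """Pixel density at which one pixel subtends the given visual angle.

    At or above that density, a viewer with the corresponding acuity
    can no longer resolve individual pixels at the given distance.
    """
    # Size (in inches) of the smallest resolvable detail at this distance.
    detail = viewing_distance_inches * math.tan(math.radians(arc_minutes / 60.0))
    return 1.0 / detail

# 20/20 vision (1 arc minute) at typical phone and TV viewing distances:
print(round(retina_ppi(10)))      # phone at 10 inches -> ~344 PPI
print(round(retina_ppi(12)))      # phone at 12 inches -> ~286 PPI
print(round(retina_ppi(8 * 12)))  # TV at 8 feet       -> ~36 PPI
```

Note how the threshold falls to a few dozen PPI at living-room distances, which is why an HDTV qualifies easily while a phone needs roughly 300 PPI.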

3. Difference between discerning pixels and discerning any perceptible improvements
Jobs was very clear that his claim was that with a “Retina Display”, the pixel density is fine enough that you can no longer distinguish the individual pixels. He illustrated this with the letter “a” on a grid, demonstrating the difference between a Retina and a non-Retina display. On the Retina display, the lines were smooth and you couldn’t see any sign of the jagged edges visible on the non-Retina display. The more advanced critics bring up various forms of “hyperacuity” in their challenges. While this is a reasonable argument, the key point here is that it doesn’t contradict Apple’s actual claims.

4. What was actually claimed
Most of the controversy I see on this topic seems to originate from people making assumptions about what Apple has claimed rather than verifying the facts for themselves. I reviewed the iPhone 4 announcement again on YouTube and made a transcript of the part where the specific claims were made. On stage, Steve Jobs said the following:

“It turns out, there is a magic number, right around 300 pixels per inch, that when you hold something around 10 or 12 inches away from your eyes, is the limit for the human retina to differentiate the different pixels.”

The specifics of this claim are very important, as the criticism I generally encounter, particularly in article forums, always comes up short when compared to the actual claims made. As an example, the most recent challenge I encountered involved a forum poster claiming “Apple claimed that after 300 PPI the eye could no longer see pixels and that was false”. As we can see, this is not what Apple claimed. Apple never expressed visual acuity in terms of pixel density alone. Rather, Jobs established both a pixel density and a viewing distance as the basis of his claim. He declared the normal viewing distance for a phone to be 10 to 12 inches. Anecdotally, I held up my phone as I usually do, took a ruler, measured the distance, and found this claim to be accurate – for me.

To successfully challenge this claim, a critic would need to do at least one of the following:

a. effectively dispute the definition of 20/20 vision (the ability to discern the detail of 1 arc minute) or
b. effectively dispute the assumption that normal corrected vision is 20/20 or
c. effectively dispute the normal viewing distance for a phone

Raymond Soneira, the industry expert in question, offered the following opinion:

““Retina Display” is a great marketing name, and it’s the sharpest smartphone display available, 23 percent sharper than the nearest competitor, but objectively it does not meet the quantitative criteria for being a true Retina Display – it’s about a factor of two lower than the acuity of the human Retina. Rather, the iPhone 4 has a “20/20 Vision Display” because when it is held more than 10.5 inches away, a person with 20/20 Vision will not be able to resolve the iPhone 4 screen pixels, which are at 326 ppi (1 arc-minute resolution). But 20/20 Vision is the legal definition of “Normal Vision,” which is at the lower end of true normal vision. There are in fact lots of people with much better than 20/20 Vision, and for most people visual acuity is limited by blurring from the lens in the eye. The best human vision is about 20/10 Vision, twice as good as 20/20 Vision, and that is what corresponds to the acuity of the Retina. So to be a “True Retina Display” a screen needs about 652 ppi at 10.5 inches, or 572 ppi at 12 inches. Unfortunately, a “20/20 Vision Display” doesn’t sound anywhere near as enticing as a “Retina Display” so marketing and science don’t see eye-to-eye on this…”

In essence, Soneira’s comments actually verify the math I’ve provided above and confirm that, for 20/20 vision, there is no benefit to higher-density displays at a distance of 10.5” or greater.

The basis of Soneira’s challenge is that some small percentage of human beings (the actual percentage is debatable and likely considerably less than Soneira suggests) can achieve 20/10 vision. Since Jobs used the word “Retina” and didn’t specify “20/20” vision, I can see how his claim was left open to such challenges. Still, 20/20 vision is the commonly accepted norm for human vision. Even with corrective lenses or eye surgery, most humans are not capable of achieving acuity greater than 20/20. Finally, even Soneira himself acknowledges that 20/20 is the “legal definition” of normal vision.
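Soneira’s figures can be reproduced with the same arc-minute geometry. A quick sketch (assuming 1 arc minute of resolvable detail for 20/20 vision and 0.5 arc minute for 20/10):

```python
import math

def limit_ppi(distance_inches, arc_minutes):
    """PPI at which one pixel subtends the given visual angle."""
    return 1.0 / (distance_inches * math.tan(math.radians(arc_minutes / 60.0)))

# 20/20 vision resolves ~1 arc minute; 20/10 resolves ~0.5 arc minute.
print(round(limit_ppi(10.5, 1.0)))  # ~327 PPI -- the iPhone 4's 326 PPI clears this
print(round(limit_ppi(10.5, 0.5)))  # ~655 PPI -- close to Soneira's 652 figure
print(round(limit_ppi(12.0, 0.5)))  # ~573 PPI -- close to Soneira's 572 figure
```

The 20/10 numbers land within a few PPI of Soneira’s 652 and 572 figures (the small differences come down to rounding), confirming that his challenge rests entirely on substituting 20/10 vision for 20/20.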
In Soneira’s “iPhone 6 Shoot out” article, he reaffirms his position by saying:

“iPhone 4: Their most famous and aggressive innovation came with the introduction of the Retina display in 2010 for the iPhone 4, where Apple doubled the pixel resolution and Pixels Per Inch (ppi) up to where the screen appeared perfectly sharp for normal 20/20 Vision at typical Smartphone viewing distances of 10.5 inches or more. It was a brilliant technical and marketing innovation, and the competition was left in the dust…”

The point is, all criticism of Apple’s claims is rooted in Raymond Soneira’s original opinion on the topic. His opinion was picked up by Brian X. Chen, writing for Wired magazine, in an article entitled “IPHONE 4’S ‘RETINA’ DISPLAY CLAIMS ARE FALSE MARKETING”.

Devoid of any actual insight or analysis, the article simply regurgitated Soneira’s opinion, and the “false marketing” meme spread across the internet echo chamber.

Yet within the scientific community, Soneira’s criticism mostly stands alone. In fact, his opinion was put in check almost immediately. The first rebuttal came from Phil Plait in an article entitled “Resolving the iPhone resolution”.

“As it happens, I know a thing or two about resolution as well, having spent a few years calibrating a camera on board Hubble. Having looked this over, I disagree with the Wired headline strongly, and mildly disagree with Soneira. Here’s why.”

In short, Plait challenges Soneira’s use of perfect-vision acuity (0.6 arc minute or better, approaching 20/10) in his calculations rather than 1 arc minute (20/20 vision). The point is that there is a difference between “perfect vision” and “normal vision”. Soneira unsuccessfully attempts to conflate the two terms as if they were one and the same in order to lend validity to his challenge.
Another challenge to the Wired article came from William Beaudot. His article, “Apple “Retina Display” in iPhone 4: a Vision Scientist Perspective”, goes through another technical analysis and makes the following claims:

“This controversy started with the Wired article “Apple’s Retina Display Claims Are False Marketing” with which I disagree on 2 points:

2) its expert’s conclusion that the “Retina Display” is a misleading marketing term.”
and…
“In these conditions, refuting Apple’s marketing claim would be unfair and misleading. In my opinion, Apple’s claim is not just marketing, it is actually quite accurate based on a 20/20 visual acuity.”
and…
“As such, my second take-home message is that Apple new display can be called without dispute a “Retina” display.”

The bottom line is that the noise from Raymond Soneira has been put in check and the ridiculous Brian X. Chen Wired article has been soundly debunked.

What about concepts such as Hyperacuity?
This is by far the most interesting form of challenge. So, what is hyperacuity? Wikipedia defines it as follows:

“The sharpness of our senses is defined by the finest detail we can discriminate. Visual acuity is measured by the smallest letters that can be distinguished on a chart and is governed by the anatomical spacing of the mosaic of sensory elements on the retina. Yet spatial distinctions can be made on a finer scale still: misalignment of borders can be detected with a precision up to 10 times better than visual acuity. This hyperacuity, transcending by far the size limits set by the retinal ‘pixels’, depends on sophisticated information processing in the brain.”

Unlike traditional visual acuity tests, Vernier acuity is a type of visual acuity that measures the ability to discern a misalignment between two line segments or gratings.

In simple terms, concepts like this go beyond seeing jagged lines or individual pixels. Hyperacuity relies on both the visual capabilities of your eyes and actual processing in your brain to detect minute differences or misalignments between objects. The question then becomes: can people discern the difference between two images even if they can’t see individual pixels? This is an interesting question for which I have not seen sufficient evidence one way or the other. Anecdotally, I have compared various phones with varying pixel densities side by side. The problem with doing this is making sure you understand the variables. For example, if the screens belong to different platforms, they could be using different fonts and font weights, different rendering algorithms, and so on. To that end, I tried viewing both the iPhone 6 (326 PPI) and the iPhone 6 Plus (401 PPI) using several different programs (web pages in Safari, iBooks, etc.). My focus was primarily on text, including italics where possible. The point of this test was to limit the variables as much as possible such that the only difference would be pixel density. Even at close viewing distances (less than 10 inches), I was not able to discern a difference in quality.

DPI is not the same as PPI
A study published in the Journal of the Society for Information Display concluded that at a distance of 300 mm (11.8 inches), users can discriminate between 339 and 508 PPI.
When I first heard of this study, I was very intrigued, as it seemed to contradict my understanding of reality. However, even a cursory review of the study made it clear that there were several major problems with drawing the conclusion they did from the test they performed.

The rebuttal to this test would be the following:

1. (question) The actual eyesight rating of the 49 people in the study was not made clear. Did they have better than 20/20 vision to begin with?

2. (major issue) The test was simulated. They didn’t have actual screens with such resolutions. Instead, they printed samples onto transparencies and projected a backlight in order to simulate a smartphone screen. There are obvious variables relating to printing technology that were not discussed or accounted for.

3. (major issue) The rules for print resolution are often different from those for display resolution. Why? Because most pixels are actually composed of 3 sub-pixels (RGB). Electronic displays leverage this by performing anti-aliasing at the sub-pixel level. Likewise, a printed image could need up to 3 times as many DPI to match the detail a display can achieve at the same PPI. Furthermore, there are no standards for the size or shape of printed dots. As such, there are printers with lower DPI ratings that produce higher-quality output than higher-DPI printers.

4. (observation) In the best-case scenario, the study suggests upper limits of human visual acuity, but it does not contradict Apple’s claims, as those claims were specific to having smooth edges and not being able to see individual pixels.
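The sub-pixel argument in point 3 can be made concrete. A simplified sketch (my own illustration, assuming an RGB-stripe panel with 3 sub-pixels per pixel and ignoring the color-fringing limits that reduce the usable gain):

```python
def addressable_per_inch(ppi, subpixels_per_pixel=3):
    """Horizontal addressable elements per inch when rendering
    (e.g. sub-pixel anti-aliasing) can target individual sub-pixels."""
    return ppi * subpixels_per_pixel

# A 326 PPI RGB-stripe display offers up to 978 addressable positions
# per inch of horizontal detail; printed output would need roughly
# that many DPI to carry equivalent horizontal detail.
print(addressable_per_inch(326))  # 978
```

This is the “up to 3 times” factor above; in practice color fringing keeps the usable gain smaller, which is why it is an upper bound rather than a fixed conversion.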

What about display technology? What about Anti-aliasing?
Many articles go into some level of detail on display technology. Some provide examples and even magnify the screens so that you can actually see the individual sub-pixels (RGB). I consider this largely outside the scope of this article, but there are a couple of key points to be aware of.

1. Displays are not created equal. As an example, some companies use an inferior PenTile-based sub-pixel layout. This method uses 1/3 fewer sub-pixels. At the same pixel density, PenTile-based displays have inferior image quality. To compensate, some companies use PenTile-based displays with significantly higher PPI ratings. Why? Marketing. Consumers are trained to look for specifications, and higher is usually considered better. If you’re impressed by a PPI rating for a device, you would be best served to learn more about the type of pixels you are comparing.

2. Anti-aliasing is a technique to help images and curves appear smoother than the raw resolution itself allows. There are several techniques to accomplish this. Some take advantage of contrasting colors, such as black text on a white background, by using a few grey pixels around a curved edge to help fool the eye. Other methods manipulate the image at the sub-pixel level in order to achieve a sharper result. For this reason, you can’t directly compare DPI from a printed source to PPI on a screen display.
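To make the grey-pixel technique concrete, here is a minimal grayscale anti-aliasing sketch (my own toy example, not any platform’s actual renderer): each pixel’s shade is set by how much of it the ideal diagonal edge covers, approximated by supersampling.

```python
def antialias_edge(size=8, samples=4):
    """Render the region below the line y = x on a size x size grid.

    Each pixel is sampled at samples x samples points; its coverage
    (0.0 fully outside .. 1.0 fully inside) becomes a gray level,
    which is what smooths the jagged diagonal.
    """
    grid = []
    for py in range(size):
        row = []
        for px in range(size):
            inside = 0
            for sy in range(samples):
                for sx in range(samples):
                    x = px + (sx + 0.5) / samples
                    y = py + (sy + 0.5) / samples
                    if y > x:
                        inside += 1
            row.append(inside / samples ** 2)
        grid.append(row)
    return grid

grid = antialias_edge()
# Pixels on the diagonal get intermediate gray values instead of 0 or 1:
print(grid[3][3])  # 0.375 -- a partially covered edge pixel renders as gray
print(grid[5][2])  # 1.0   -- fully inside the shape
print(grid[2][5])  # 0.0   -- fully outside
```

Without the fractional gray values, every edge pixel would be all-or-nothing and the diagonal would look like a staircase – exactly the jagged edges Jobs demonstrated with the non-Retina “a”.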

Are there disadvantages to higher pixel density screens?
Generally speaking, higher pixel density results in better screen quality in terms of clarity and sharpness. However, there is a point of diminishing returns beyond which any increased resolution is no longer perceptible, and the increased pixel density comes at a cost. For LCD screens, higher pixel density allows less light to pass through. Likewise, achieving the same level of brightness requires a more powerful backlighting system, which is problematic for mobile devices. IGZO-based LCDs have helped mitigate this, but the issue still exists. Regardless of display technology, higher pixel density means more work for the GPU, which in turn leads to heat and battery-life issues. If there is little or no perceived clarity benefit, it’s hard to rationalize the continued pixel density race as anything other than a marketing ploy.
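The GPU cost scales with total pixel count, which grows with the square of pixel density. A sketch using hypothetical 5-inch 16:9 panels (the size and densities are illustrative, not specific products):

```python
def panel_pixels(diagonal_inches, ppi, aspect=(16, 9)):
    """Total pixel count for a panel of the given diagonal and density."""
    w, h = aspect
    diag_units = (w ** 2 + h ** 2) ** 0.5
    # Physical width/height follow from the diagonal via the aspect ratio.
    width_px = round(diagonal_inches * w / diag_units * ppi)
    height_px = round(diagonal_inches * h / diag_units * ppi)
    return width_px * height_px

# Hypothetical 5-inch 16:9 panels at increasing densities:
for ppi in (326, 401, 572):
    print(ppi, panel_pixels(5, ppi))
```

Going from 326 to 572 PPI roughly triples the pixels the GPU must fill (and the backlight must illuminate) every frame – a steep price if the extra density is imperceptible.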

Conclusion
Display screens have come a long way over the years. The original iPhone’s 163 PPI was considered very good at the time. Various Android-based phones pushed quality even further. Then, with the iPhone 4 at 326 PPI, screen technology reached a point of diminishing (if any) returns in terms of clarity and sharpness. The difference between a 163 PPI screen and a 326 PPI screen is striking and dramatic; I have yet to discern a difference with higher pixel density screens, even within the same vendor’s product line. These statements assume “all things are equal”, which isn’t actually the case. The smallest element on a screen is the sub-pixel. Samsung in particular uses inferior PenTile-based displays, which use 1/3 fewer sub-pixels. As such, these displays require a higher pixel density to achieve equivalent sharpness. Other than marketing, it’s hard to rationalize a legitimate reason for this approach.
The following conclusions can be drawn:

1. No source has presented evidence that directly discredits Apple’s claims regarding its Retina Displays.

2. It is possible that some people can benefit from resolutions higher than what is required for Retina under certain conditions. Examples include:
• The user has better than 20/20 vision.
• The user holds the device closer to their eyes and is able to focus properly at that closer distance.
• The display uses PenTile or similar technology, which requires higher PPI densities because it uses 1/3 fewer sub-pixels.
• Hyperacuity comes into play, whereby some people can (potentially) see a difference in an image even if they can’t discern the individual pixels.

3. The “Retina Display” threshold is widely considered the point of diminishing returns. If there is a quality difference beyond that point, few (if any) would be able to tell, and even then they’d have to study the display very closely, relying on hyperacuity.

4. We’ve reached a point where increases in pixel density alone no longer make a display better. Manufacturers should focus more on concepts such as color accuracy, color gamut range, improved dynamic range, and improved display efficiency.