[Update: Some people in the comments are complaining I didn’t use metric units in this article. I’ll assume they’re new to my blog; I usually do use metric. However, note the original quote by Steve Jobs is in Imperial units, not metric, so I used those for consistency. Also, the units don’t matter, since I could have used pixels per hogshead if I felt like it. What matters is the way the numbers compare to each other, as long as the units are consistent.]

With much brouhaha, Steve Jobs and Apple revealed the new iPhone 4 yesterday. Among other features, Jobs said it has higher resolution than older models; the pixels are smaller, making the display look smoother. To characterize this, as quoted at Wired.com, he said,

It turns out there’s a magic number right around 300 pixels per inch, that when you hold something around to 10 to 12 inches away from your eyes, is the limit of the human retina to differentiate the pixels.

In other words, at 12 inches from the eye, Jobs claims, the pixels on the new iPhone are so small that they exceed your eye’s ability to detect them. Pictures at that resolution are smooth and continuous, and not pixellated.

However, a display expert has disputed this. Raymond Soneira of DisplayMate Industries was quoted both in that Wired article and on PC Mag (and other sites as well) as saying that the claims by Jobs are something of an exaggeration: "It is reasonably close to being a perfect display, but Steve pushed it a little too far".

This prompted the Wired article editors to give it the headline "iPhone 4’s ‘Retina’ Display Claims Are False Marketing". As it happens, I know a thing or two about resolution as well, having spent a few years calibrating a camera on board Hubble. Having looked this over, I disagree with the Wired headline strongly, and mildly disagree with Soneira. Here’s why.

First, let’s look at resolution. I’ll note there is some math here, but it’s all just multiplying and dividing, and I give the answers in the end. So don’t fret, mathophobes! If you want the answers, just skip down to the conclusion at the bottom. I won’t mind. But you’ll miss all the fun math and science.

1) What is "resolution", really?

Imagine you see a vehicle coming toward you on the highway from miles away. Is it a motorcycle with one headlight, or a car with two? As the vehicle approaches, the light splits into two, and you see it’s the headlights from a car. But when it was miles away, your eye couldn’t tell if it was one light or two. That’s because at that distance your eye couldn’t resolve the two headlights into two distinct sources of light.

The ability to see two sources very close together is called resolution. It’s measured as an angle, like in degrees. For example, the Hubble Space Telescope has a resolution of about 0.00003 degrees. That’s a tiny angle! I’m simplifying here a bit, but you can think of this as saying that two stars farther apart than that are seen as two objects; if they are closer together, even with Hubble they appear as a single object.

Since we measure resolution as an angle, we can translate that into a separation in, say, inches at a certain distance. A 1-foot ruler at a distance of about 57 feet (19 yards) would appear to be 1 degree across (about twice the size of the full Moon). If your eyes had a resolution of 1 degree, then the ruler would just appear to you as a dot.

At a given distance, two objects closer together have a smaller angle separating them, making them harder to distinguish from each other. Note that in the image above, the circles on top are farther apart, with a bigger angle between them (imagine you are looking at them from the left, where the black lines intersect). At some point, the objects are so close together, and the angle so small, the two merge into one object as far as your eye is concerned. That’s your resolution limit.

What is the resolution of a human eye, then? Well, it varies from person to person, of course. If you had perfect vision, your resolution would be about 0.6 arcminutes, where there are 60 arcmin to a degree (for comparison, the full Moon on the sky is about 1/2 a degree or 30 arcmin across).

To reuse the ruler example above, and using 0.6 arcmin for the eye’s resolution, the 1-foot ruler would have to be 5730 feet (1.1 miles) away to appear as a dot to your eye. Anything closer and you’d see it as elongated (what astronomers call "an extended object"), and farther away it’s a dot. In other words, more than that distance and it’s unresolved, closer than that and it’s resolved.

This is true for any object: if it’s more than 5730 times its own length away from you, it’s a dot. A quarter is about an inch across. If it were more than 5730 inches away, it would look like a dot to your eye.

So you can think of this 5730 number as a scale factor; multiply an object’s size by that, and, if your vision is perfect (OOOooooo, foreshadowing!) you get how far away you can see it as more than a dot.
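If you want to play with these numbers yourself, the scale factor is easy to compute. Here’s a minimal Python sketch (the function name is mine, not from any library): the scale factor is just the reciprocal of the tangent of the resolution angle.

```python
import math

def scale_factor(arcmin):
    """Distance, in multiples of an object's own size, beyond which an
    eye with the given resolution (in arcminutes) sees it as a dot."""
    angle_rad = math.radians(arcmin / 60.0)  # arcminutes -> degrees -> radians
    return 1.0 / math.tan(angle_rad)

# Perfect vision (0.6 arcmin) vs. typical 20/20 vision (1 arcmin):
print(round(scale_factor(0.6)))  # ~5730
print(round(scale_factor(1.0)))  # ~3438
```

For small angles the tangent barely matters (tan θ ≈ θ), which is why the scale factor is essentially 3438 divided by the resolution in arcminutes.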

2) Power to the pixel

So what does all this mean for the iPhone? First, here are the claims.

Jobs claims the iPhone held at 12 inches from your face has pixels too small to be resolved by your eye. Soneira, the display expert quoted in the magazine articles, disputes that. He uses the 0.6 arcmin resolution for the human eye (so we use the scale factor = 5730). Let’s use that and run the numbers.

At 12 inches, an eye with 0.6 arcmin resolution can just resolve features 12 / 5730 = 0.0021 inches apart. So if the pixels on the iPhone are smaller than 0.0021 inches in size, then Jobs is right. Your eye won’t resolve them. If the pixels are bigger, Soneira is right, and your eye can resolve them.

The actual iPhone 4 has 326 pixels per inch (the display is 960 pixels high, and about 2.9 inches in length). You have to flip that to get the size of the pixel in inches:

1 / 326 = 0.0031 inches

Uh oh! Things look bad for Jobs. The iPhone pixels are too big! At one foot away, your eye can resolve the pixels, and Jobs must be lying!

Or is he? Remember, Soneira used the 0.6 arcmin resolution of the eye, but that’s for perfect eyesight. Most people don’t have perfect eyesight. I sure don’t. A better number for a typical person is more like 1 arcmin resolution, not 0.6. In fact, Wikipedia lists 20/20 vision as being 1 arcmin, so there you go.

If I use 1 arcminute instead, the scale factor is smaller, about 3438. So let’s convert that to inches to see how small a pixel the human eye can resolve at a distance of one foot:

12 inches / 3438 = 0.0035 inches

Aha! This means that to a more average eye, pixels smaller than this are unresolved. Since the iPhone’s pixels are 0.0031 inches on a side, it works! Jobs is actually correct.

[Note: in the articles about all this, they used units of pixels per inch, whereas I’ve used the size of the pixels themselves. You can flip all these numbers to convert. The iPhone 4 has a resolution of 326 ppi (pixels per inch). Soneira says the eye can resolve 1 / 0.0021 = 477 ppi. However, normal vision can see at 1 / 0.0035 = 286 ppi. So the density of pixels in the iPhone 4 is safely higher than can be resolved by the normal eye, but lower than what can be resolved by someone with perfect vision.]
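To tie the numbers together, here is the whole comparison in a few lines of Python (a sketch; the function and variable names are mine, and the 326 ppi figure comes from the article above):

```python
import math

def resolvable_ppi(distance_inches, arcmin):
    """Finest pixel density (pixels per inch) that an eye with the given
    resolution can still resolve at the given viewing distance."""
    pixel_size = distance_inches * math.tan(math.radians(arcmin / 60.0))
    return 1.0 / pixel_size

iphone4_ppi = 326  # 960 pixels over about 2.9 inches

print(round(resolvable_ppi(12, 0.6)))  # ~477 ppi: perfect vision out-resolves the display
print(round(resolvable_ppi(12, 1.0)))  # ~286 ppi: typical 20/20 vision does not
```

Since 286 < 326 < 477, the display beats a typical eye at 12 inches but not a perfect one, which is the whole argument in one line.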

3) So what does all this mean?

Let me make this clear: if you have perfect eyesight, then at one foot away the iPhone 4’s pixels are resolved. The picture will look pixellated. If you have average eyesight, the picture will look just fine.

So in a sense, both Jobs and Soneira are correct. At the very worst, you could claim Jobs exaggerated; his claim is not true if you have perfect vision. But for a lot of people, I would even say most people, you’ll never tell the difference. And if you hold the phone a few inches farther away it’ll look better.

So in my opinion, what Jobs said was fine. Soneira, while technically correct, was being picky. So I mildly disagree with him about that. I had to laugh, though: his dismissal (near the bottom of the Wired article) of the Quattro TV’s use of a fourth, yellow, pixel is dead on. When I first heard of that I knew right away it was a silly claim.

Still, the headline used by Wired.com was clearly incorrect; Jobs wasn’t falsely advertising the iPhone’s capabilities at all. I’ll note that I like Wired magazine quite a bit, and what we have here is most likely just an overzealous editor. But a lot of people read the headlines and it taints their view; someone reading that article may be more likely to think Jobs, once again, has overblown a product to excite people. He didn’t.

Interestingly, I can’t even resolve the pixels on the 3GS screen from 12″ away, and that has half the ppi of the iPhone 4. Can anyone else? I can only resolve them if I take my glasses off and hold the phone about three inches from my face (glasses off because my prescription makes me a little farsighted).

So how come I can tell the difference between papers printed out on a 600 dpi printer and those printed out on a 1200 dpi machine? It seems like I shouldn’t be able to distinguish them at reading distance, but I easily can. And even a 1200 dpi printout doesn’t have print as good as what is in a hardback book. Can someone explain? Is there more than just resolution that makes a difference?

300 dpi is the standard color photo print resolution in magazines, chosen exactly because for the average human it is beyond the limit of visible resolution. Most photo print companies print photographic prints at an even lower 225 ppi, as for many people even that does not really look pixelated.

Isn’t that true for everybody who argues about technical correctness? Oh, and thanks for the info on the Quattro TVs, too bad Sulu is the huckster in that ad campaign. Guess that’s why they didn’t hire the science officer, he couldn’t have delivered the lines in good conscience.

I’ll note that I like Wired magazine quite a bit, and what we have here is most likely just an overzealous editor.

I don’t even own a smartphone yet, and I’m waiting to see what generation 2 of the iPad brings, but I love Apple products for one main reason: they make the self proclaimed hardcorepoweruser!!1! types froth with incomprehension at the success of the products. You have to love people (preferably from a distance) who cannot comprehend that someone *doesn’t* want to run a Python interpreter on their phone.

For the non-technical: no, a Python interpreter is not a parseltongue translator App.

I think Soneira’s main beef with Jobs was his use of “the limit of the human retina”. Eyesight depends on both the ability of the lens and cornea to properly focus the image and the resolution that the retina can resolve. People with perfect eyesight have eyes with perfect optical geometry as well as a perfectly working retina. By claiming that the display exceeds the “limit of the human retina to differentiate the pixels”, you should be using the perfect-eyesight resolution of 0.6 arcminutes. Perhaps Apple should not have used hyperbole, but then again, that’s what they do best.

It’s still an irrelevant and overstated claim: for the most part, as in computer monitors and really any LCD display, the main limiter of image quality is actually, well, the quality of the image itself. A youtube video in sub-standard definition will still look pixelated. A picture with tiny dimensions will still look like crap if you zoom in at all. Basically, as far as I’m concerned, Jobs has just added a sixth blade to the disposable razor.

If Jobs is “falsely advertising”, here, it’s mainly by attempting to convince us that the iPhone’s display is cutting-edge, top-of-the-line, or even a stand-out among smartphones, when really there is no difference a consumer could easily discern. Much more interesting and noticeable stats to state would be, say, contrast ratios or response times — but I doubt these are anything special, either.

Despite the fact that Jobs’ is advertising a razor that shaves, I somehow doubt it will stop the MSM (and much of the non-mainstream press, for that matter) from continuing to unabashedly hawk Apple products at every turn.

Always foiling the plans of the Rebel Alliance Against Big [flavor of the day]. We had the perfect guide to global perfection laid out before you ruined it. 1) Exaggerate Jobs’s exaggeration. 2) We’re still working on this step. 3) WORLD PEACE!

That’s right, Mr “I’m so cool because I can reason” Plait. You’ve ruined the chances for world peace.

On to plan AeeVrR372tfW: release an article explaining how spotted owls are carnivores, and carnivores eat babies. Then, WORLD PEACE!

“Despite the fact that Jobs’ is advertising a razor that shaves, I somehow doubt it will stop the MSM (and much of the non-mainstream press, for that matter) from continuing to unabashedly hawk Apple products at every turn.”

And it won’t stop any of the Apple hater/bashers like you from making unfounded critiques against the company and its products no matter what :LOL:

It has been reported widely by those who actually had the new iPhone 4 in their hands (e.g., Gizmodo, who first reported the leaked iPhone 4) that the display is absolutely unlike anything they’d seen before, and that the difference between it and, e.g., the current iPhone 3GS is like day and night.

All this fails to take into account anti-aliasing, which is the use of shades of grey to smooth out jagged edges. (See the two illustrations in this article for a perfect example of jaggies!) This is why we’re able to tolerate current display resolutions, despite them being much lower than the ‘magic’ 300 dpi. In fact, most modern LCDs – the iPhone included – will use ‘sub-pixel’ anti-aliasing, which actually uses the individual red, green and blue pixel components to get even higher apparent resolution.

Don’t forget that the iPhone 4 will have sub-pixel anti-aliasing AS WELL as a 326 ppi screen. I’m confident that it’s going to be well past the resolution of even good eyesight.

Man, forget the hype, I’m just impressed that they can pack pixels at such density. Especially while (hopefully) maintaining decent contrast, black levels, color gamut, refresh rate, etc. People will be playing video on these outside in the sunshine, so all those factors matter.

Now when can I get a 30-inch, 300+ ppi monitor for under $500? That would be good stuff. Not sure what kind of beefy graphics card it would take to power games on that baby though!

Basically, as far as I’m concerned, Jobs has just added a sixth blade to the disposable razor.

Anyone else here old enough to remember the old SNL “commercial” for a 3-blade razor? The catch-line was basically “3 blades… because you’ll believe anything”. I get a chuckle at the current “it’s got so many blades, it _has_ to give you a close shave” line of commercials.

So the iPhone 4G might be slightly sucky for Chuck Yeager, though better than any other smartphone out there at the moment. Woopdiwoo. So if you can’t tell which way the horns of a crescent Venus are pointing, then for you, Steve Jobs is right; your eye won’t be able to resolve the pixels.

More important, though, is the meaninglessness of this. It’s cool, but no consumer’s gonna notice the difference, especially since the majority of the images they look at will be lower resolution anyway.

If anybody wants to get all hot and bothered about exaggerated cell phone claims, they should look at the *cameras*. The Droid Incredible boasts an 8 megapixel camera. But with a lens less than a centimeter across, how on earth is the increased resolution in the CCD going to do a damn thing? “The poor focus and chromatic aberrations of your camera phone will now be preserved in even greater fidelity!” (Assuming, of course, that the smaller CCD pixels aren’t now considerably more prone to interfering with one another, which they probably are.)

I have better than 20/20 vision. At a *distance*. I also have presbyopia, so I can’t focus at 12″. I generally view a screen at at least double that. And frankly, watch anyone using an iPhone (or Blackberry, or iPod Touch or whatever). How many people hold the thing 12″ from their face? Most people hold them 18+″ away, so the whole 12″ argument is silly anyway.

In addition, there is the issue of local contrast. If the image quality is good, adjacent pixels will tend to blend together smoothly, except at sharp, high-contrast edges, and then only in still shots. In video, the blurring caused by movement tends to smooth out the pixelation anyway.

Corey, I don’t understand your argument that the higher resolution is useless because pictures and YouTube videos are commonly at lower resolutions. A large amount of what people look at on their phones is text, and this will be rendered at the native resolution of the display.

Also, if even a low-res image is embedded in a Web site, the user may choose not to zoom in to the pixel-for-pixel level (using only, say, 25% of the display for the particular image) and so can gain an advantage in clarity, compared to the same zoom level on a lower-res display.

Since full-screen viewing of low-definition images and video is not really the primary use of an iPhone, the increased resolution will provide significant improvements for end-users.

I have 20/15 vision, meaning that what I can resolve from 20 feet, average people can only resolve from 15. So for me, holding this iPhone at 12 inches is like most people holding it at 9. Using your 3438 factor:

9 inches / 3438 = 0.0026 inches < 0.0031 inches

I would be able to resolve the pixels at 12 inches, and I don't have perfect vision – just better than average. I may be in the minority, but that means that Jobs' claim is not true for me, and therefore not true in general.

The yellow channel on the Quattros is primarily bogus only because the original signal sent doesn’t have any more information. However, I can see where image processing might be able to make use of it. After all, we can create colors we’d never actually see under ordinary conditions: just red-fatigue an eye, and show it pure green. One thing I haven’t been able to find, though, is the frequency response curve of the rods. I expect that during color vision they’re normally so over-saturated that they don’t affect things, but might there be conditions of lighting where even a normal eye is tetrachromatic?

And it won’t stop any of the Apple hater/bashers like you from making unfounded critiques against the company

Head over to Wired or Ars and watch the hardubercorepower users try to spin AT&T’s email info leak into Steve Jobs’ personal failing.

A company is having a lot of success in the middle of an economy stricken with the vapors, and people are enjoying a nice set of products they can actually use without a 200 page manual. How HORRIBLE! (eyeroll)

So let me see if I understand correctly. If we’re talking about recent discoveries about Titan, then it is okay to be picky about titles like “evidence of life” rather than the more appropriate “evidence of possible life.” But, when it comes to Steve Jobs and an iPhone claim, then being misleading is okay?

It is clearly an exaggerated claim. 20/20 Snellen visual acuity is a reference standard used as a cut-off for the lowest level of normal vision — not as an average visual acuity for the human population.

Elliott, Yang and Whitaker (1995) published Visual Acuity Changes Throughout Adulthood in Normal Healthy Eyes. In this paper, they reported that the mean VA for 18-24-year-olds was 20/15 (6/4.5 metric). That’s the MEAN visual acuity for young adults, so a significant number likely had better VA than 20/15. The 25-29-year-old mean actually _improved_ to ~20/13 (6/4 metric). The mean VA then declined (approaching 20/20, or 1.0) from that point on, until reaching a mean VA of 20/20 (6/6 metric) in the 75-year-old group!

20/20 is the wrong VA to use for average human visual acuity. In addition, R.N.Clark at Clarkvision.com reports that people up to 50 can reliably tell the difference between 300 ppi and 600 ppi printouts.

The editor at Wired may have gone for a bit of a sensational headline but Wired is correct. The iPhone resolution isn’t so high resolution that the _average_ person can’t resolve individual pixels at 12″.

I agree with your methods, however, I’m going to have to disagree with your conclusion of
“Still, the headline used by Wired.com was clearly incorrect; Jobs wasn’t falsely advertising the iPhone’s capabilities at all.”

since when Jobs marketed the device he didn’t specify that it was only true for a portion of the population (the size of that portion doesn’t really change my argument), he was definitely misleading. Being misleading in your marketing does constitute false advertising.

Let’s play a logic game.

If I were to say that a new vitamin I’m selling is safe and say:
“It turns out there’s a magic number right around 300 mg of dosage, that when you take this pill, you will live forever.”

but it comes out that for some portion of humans this dosage is fatal, and I know this… am I guilty of false advertising?

@gss_000 (39): Well there’s a difference; we actually are near the brink of making pixels that are unresolvable to 100% of people, even if this iPhone isn’t it. The tech is just not that far off. As Bipedal Tetrapod says, 12″ isn’t much; I care more about 18″ for handhelds (or more like 24″ for monitors), which will take a bit longer, but it’s still not unforeseeable.

Discovering genuine evidence of life on Titan? That’s hard to even define, short of recovering a sample. AFAIK there are zero plans yet to even try that. We’re currently relying on hints from spectroscopy and geologic & atmospheric composition. Doesn’t seem particularly close to conclusive. Maybe we’ll get lucky and image a bright green Titanian “algae bloom” that stands out in the landscape? I’d hope we would have caught something that obvious already.

All that said, sure, in terms of pure logic, Jobs’ literal statement was wrong. But it’s not hard to infer what he meant, nor surprising that a company’s founder is hyping his own product.

@Ken B #28: That SNL skit was before my time, but I do recall The Onion’s article on the subject back in ’04, which was a curse-laden “CEO commentary” about how Gillette planned to release a five-blade model, and put blades in weird places if necessary – and this predated the Gillette Fusion by three years. No link, due to offensive language in the URL, but Google “The Onion Gillette” and you’ll find it.

@Tim H #34: If all we’re considering is text, then who cares what resolution the screen’s at? Do you really care if pixels in the text on the screen you’re reading are indistinguishable? I argue that you don’t need a high-res screen to read text; you can get perfectly readable text at some pretty crappy resolutions. For resolution to really be that big a product driver, you need to talk about images and video.

I have wondered about the Quattro. On the one hand, the signal sent to the Quattro does not include a separate yellow channel. Indeed, since it is exactly the same signal with the same information, the knee jerk reaction is to say that it cannot be any better.

On the other hand, having a new colored pixel on screen undoubtedly changes the gamut of the screen. We also know that the gamuts of different screens are already different and that no screen accurately portrays all colors, since some are outside their gamuts. So, if the Quattro extends the gamut, then it could very well be better without any change in the signal, assuming that the signal is processed in such a way as to extract a new yellow channel signal to re-create the required color. This would be like converting between RGB signals and YUV or CMYK.

Of course, there is also the question of the pixel layout on the screen. It is hard enough getting the three RGB pixels on the screen in a reasonable manner. What does having a fourth pixel in a pixel group do to the apparent resolution?

So, I don’t think that you can just blithely dismiss the Quattro a priori. I haven’t seen one myself, but I have a friend who has one, and she swears by it.

Now that we’re on the subject of resolution and electronic gadgets, can someone come up with a standard for measuring the effective resolution of cameras? Years ago I bought a phone with a 1.3 Megapixel camera. That may have been the number of elements on the CCD but due to spherical aberration, chromatic aberration, diffraction, noise, image compression, etc it would do no better than an honest 0.3 Megapixel camera.

The iPhone 4 may have better resolution than previous iPhones, but it still has the same serious design flaw that it’s had since its inception: AT&T. Until they fix that particular bug, the phone will remain useless to most people.

#7 David: Because printers advertise DOTS per inch. A dot is one color component of a “pixel”. Depending on the architecture of the printer, that 600dpi could be as low as 75 effective “pixels” per inch.

I do have better than 20/20 vision, but I also just checked something: I typically hold the iPhone between 20 and 22 inches away from my face. Holding a foot away seems absurdly close to me. If the new iPhone looks sharper and clearer from that distance than the gen 1 iPhone, then it’s doing what it said it does from my perspective.

ASFalcon13, on the contrary I would argue that a higher-resolution display is even more important for text than it is for images. That’s because text is usually higher-contrast and high in detail compared to photographs. Note that televisions (which display little text) are usually substantially lower in resolution compared to computer monitors. Compare the difference between a 720p movie and a 1080p movie, versus the same resolution increase for text — the text will show much more improvement than the movie.

Human visual processing works by seeking edges, as an essential first step, and text is heavy on contrasty edges, including curved and diagonal edges. On a low-resolution display curves and diagonals show visible stair-step artifacts which confuse the eye/brain with extra edge details that are meaningless. Though these details are filtered out, they represent extra work for the brain and make reading more taxing. This is one reason (of several) why people prefer to read large amounts of text on print rather than on a display. In print, 300dpi is generally considered a bare minimum for text.

Now if Apple had only put a 300 dpi screen into the iPad it would have saved a lot of “meh whats so special about it” hate comments. It would have had SOMEthing special about the hardware itself for marketing purposes.

Would be nice to see the math in your article for how far away the iPhone needs to be held for people with perfect eyesight to resolve the pixels. If it’s at a reasonable distance I see no issue with Jobs’ claim. Would also be nice to see the math for the required DPI if held literally right in front of your eyes.

I used to play Texas Hold ’em on my iPod Nano. Addictive game/implementation. So when I bought an iPhone I naturally purchased Texas Hold ’em for that too. I hated it. The pixelation not only looked horrid but it gave me a headache because the card graphics weren’t as clear. For me the resolution on the original iPhone was what I disliked most about it. I usually wait for reviews for gear, even from Apple, to see if it’s worth my money, but in this case I can’t buy the iPhone 4 soon enough. All the other features are just icing on the cake to me.

@Bob: I played with two Android handsets yesterday to see if they could tempt me away from the iPhone 4, before I get sucked into a contract. I found the UI extremely convoluted frankly. They wouldn’t tempt me away from the iPhone 3GS let alone the iPhone 4. To each their own I guess. But if AT&T coverage in your area is poor I wouldn’t turn my nose up at an Android handset – they’re the next best thing. Here in the UK we tend not to have that issue though because the iPhone will be available on all five networks here, O2, Orange, Vodafone, T-Mobile and Three.

#46 Ken (a different Ken): That is only true of color printers. There must be something else going on, because, like #7 David, my eyesight is none too good, but I can tell the difference from a foot away between 300 dpi text, 600 dpi text and 1200 dpi text. Perhaps that is what #48 Tim H is on to. It’s the edges?

@7 & @46: actually, this is partly due to color depth. A printer can’t really print greyscale (modern printers can kind of vary dot size at best), so for quality halftones, edges and color mixing, printers need a higher resolution. A true color device like an LCD or OLED uses its true color depth. Professional digital typesetters were using 2400 dpi resolution in the 1980s…

Exploiting subpixels, however, also makes a significant difference in some applications, such as Microsoft’s ClearType technology for improving text on LCDs.

The eye is an analog, not a digital device, and I think the real story is a bit more complex: people can spot some differences between a 300, 600, and 1200 dpi image at a foot distance, and there is a grey area between “cannot resolve two dots” and “cannot tell any difference at all”.

The tradeoff between spatial resolution and color depth is kind of a question of how much information you can pack into a square inch.

Thank you Zombie and Ken. Every professional printer/photographer knows there’s a difference between dots/DPI and squares/PPI as displayed on a screen. Mixing the two in the article leaves room for doubt on the accuracy of the article’s conclusion IMO.

Apple keeps rejecting the totally awesome music streaming website grooveshark.com from making an app for iTunes. Because why buy music when you can listen to whatever you want, whenever you want on your iPhone/Pod touch/Pad for $3/month? Or whatever you want, whenever you want from your computer for… $0/forever. Well, sucks for iPhonesers. You can get it on the Android market. Which I plan to. So :-P, Jobs!!!!1 You can keep your stupid iTunes with DRM where I have to “authorize” computers.

stompsfrogs: First of all, iTunes hasn’t had DRM in quite some time, so you’re wrong there. And furthermore, both Spotify and Pandora have been approved into the App Store. Maybe you should look at why that happened instead of ranting against Apple? Especially when your facts are off by a mile.

2 days ago, I was sick, I wasn’t thinking clearly, I pulled the blankets and sheets off the bed and washed them, got up and took a shower.

My iphone goes everywhere with me. Like a puppy, it was in bed with me. I was in the shower and realized my 9pm alarm hadn’t gone off.

Oh %$&^*)@^*%$$!!!!!!

Yes, my iphone is dead. Yes, I tried the bag of rice in a warm place. It’s dry but it’s dead. So I don’t care about the pixels of the 4g. It had longer battery life and it was a working phone with my blue-hour app and my grocery list and my calendar and my alarms and my emergency camera for festive pictures of boba tea that I could post immediately to Facebook or Life Journal. I’m also halfheartedly trying to run a photography business without a phone; luckily, I can still claim illness for another day or 2, but I NEED A NEW PHONE AND I NEED IT 2 DAYS AGO! I can’t wait to get on the list on the 15th, and I will be there 1st thing in the morning of the 24th to pick up a new iphone 4g.

Thanks for the reasonable argument Phil. I too found Soneira’s argument to be a bit too harsh. Jobs is almost certainly engaging in a bit of hyperbole, but it’s not a huge bit, rather like calling every new feature “revolutionary”, when in fact most of them are pretty evolutionary. The real take away is that further increases in pixel density are probably not necessary because under most viewing conditions, they aren’t distinguishable from the display we have.

Another thing to note is that these resolution measures are based upon the Rayleigh criterion, which judges your ability to discern two point sources as distinct. They are considered distinct when they are separated by the radius of the first minimum of the Airy disc. This is actually a pretty stringent requirement, and probably has very little to do with comfortable ordinary viewing conditions.

Phil,
I have a dispute with your calculation of pixel size. The pixels themselves are a certain size, but there is also space between each pixel, isn’t there? So in reality, a pixel plus a little bit of space is 0.0031 inches across, and the pixels themselves should be smaller.

And also since pixels are made up of RGB subpixels, I would think a certain pattern of pixel colors would be easier to resolve than pixels of the same color since the space between different colors of subpixels is greater than the pixels themselves. For instance an alternating pattern of red and blue.
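For what it’s worth, the 0.0031-inch figure is the center-to-center pixel pitch (1/326 of an inch), which already includes the gap between pixels; the lit aperture is indeed a bit smaller. A quick sketch of the pitch and the angle it subtends at 12 inches (326 ppi is the published iPhone 4 density; the rest is small-angle geometry):

```python
import math

PPI = 326          # published iPhone 4 pixel density
DISTANCE_IN = 12   # viewing distance from the Jobs quote

pitch_in = 1 / PPI                                        # center-to-center spacing
arcmin = (pitch_in / DISTANCE_IN) * (180 / math.pi) * 60  # small-angle approximation

print(round(pitch_in, 4))  # ~0.0031 in
print(round(arcmin, 2))    # ~0.88 arcmin: above the 0.6' "perfect vision" limit,
                           # below the 1.0' typical-eye limit
```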

The human eye uses an averaging algorithm and does see things the same as photographic film. Digital cameras have been made that mimic the way the eye sees things. The simple limit of resolution does not fully explain the appearance of a complex image vs. printing. The only way to resolve the question at hand would be for Steve Jobs to give all readers of this blog a new iphone so that a panel of super experts can decide.

We need the actual display specs. I suspect that the resolution is not 300 pixels per inch, but rather 300 dots per inch as discrete R G and B elements.

I’m a graphic designer and I can confirm that 300 dots per inch printed resolution is very poor quality. To reproduce smooth curves and fine detail, an imagesetter typically outputs 2700 dots per inch. The difference is easily seen in reflected artwork reproduction. Even finer resolution is required for high-end printing processes like stochastic screening.

hey geniuses, you forgot something. BIG error here, how did you miss something so obvious?

Your article fails for one simple reason.

People who are nearsighted (a larger share of the public than the farsighted) basically see better than 20/20 up close. Thus, they can distinguish pixels smaller than 0.6 arcmin, in fact more like 0.2 arcmin.

So no, you are incorrect. Lots of people are still going to see this as pixelated, because a large portion of the public is nearsighted.

As for comparing, if you are looking at a standard iPhone display, with normal content on it, you won’t notice the individual pixels until it is *REALLY* close. That’s because with smooth patterns, the individual pixels are harder to discern. The solution is to display a pixel-for-pixel black/white “checkerboard” (or similarly every-other-pixel in at least one direction) pattern.

I have close-to-20/20 eyesight, which is to say 20/20 with glasses. If I hold my iPhone exactly 12″ from my eyes, displaying an every-other-pixel pattern (http://www.lagom.nl/lcd-test/clock_phase.php specifically, which on my iPhone I need to zoom in until it hits pixel-perfect), I can still make out the individual pixels. When I move it to about 18″, they start to lose individual definition, and by 24″ they are gone. As the new iPhone has twice the ppi of mine, that means I should not be able to make out individual pixels at 12″.
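A test pattern like the lagom.nl one is easy to generate yourself. A minimal stdlib-only sketch (the plain-text PBM format is chosen only because it needs no imaging library; view the file pixel-for-pixel, without any scaling):

```python
def checkerboard_pbm(width, height):
    # Plain PBM: "P1" magic number, dimensions, then rows of 1 (black) and
    # 0 (white) pixels, alternating every single pixel in both directions.
    rows = [" ".join(str((x + y) % 2) for x in range(width))
            for y in range(height)]
    return "P1\n{} {}\n{}\n".format(width, height, "\n".join(rows))

# Dimensions are arbitrary; match them to your screen for a 1:1 test.
with open("checkerboard.pbm", "w") as f:
    f.write(checkerboard_pbm(320, 480))
```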

How is that 0.6-arcmin resolution determined? By experimental testing, or by calculation based on the size of the retina and the density of rods and cones?

I remember reading that we can achieve higher resolution than our detectors alone would suggest because our eyes are always moving slightly; this “wobble” lets the image jump rapidly from one detecting cell to another, and our brains can use that. Plus, we have two eyes, letting us resolve more than would be possible if our eyes really functioned like a CCD of the same size and resolution.

This article is based on a fundamental mistake. The single ‘visual acuity’ ratio mixes accommodation (focusing) with other kinds of defects such as astigmatism. One person may have 20/200 vision, yet be able to focus perfectly at 12 inches with minimal astigmatism, and be able to resolve to 0.6 arcmin. Someone else may have 20/20 vision, yet because of astigmatism be able to resolve only to 1.0 arcmin, even at 12 inches. Also, Jason in 74 makes excellent points about why one cannot simply calculate the effective resolution from the physical parameters of a human eye.

Though I hate to say it, Jobs is actually correct: for an object to be RESOLVED as opposed to DETECTED, there should be at least two resolution elements across the object (Nyquist sampling). One can discern the difference in these two words on some satellite photos of the earth: you can see a bright streak that you know is a road, say, that is only one pixel wide. Thus you’ve DETECTED the road, but you have not RESOLVED it. Thus the figure of merit to use is HALF the size of the pixel, not the full width, as everyone is using. There really is a difference in the meaning of the two words, and if people are going to get picky, they should use the accepted definitions. And Ed_CO makes a very good point, at least for camera images. I’m not sure if display images actually use the 2×2 Bayer matrix of 2 green, one red, and one blue pixel, though?

I can’t find the link now, but some research group (whether university or corporate I can’t remember,) made a piece of software that uses video footage to generate a very-high-resolution still photo by doing exactly what you suggest. They use the temporal differences to infer the higher detail. (Something like 10 seconds of 640×480 video was sufficient to generate a higher-quality 4 MP photo than a native 4 MP camera, IIRC. The demo was a video of a mountain in Washington, so I think it was a Microsoft experiment.)

As an engineer and ophthalmologist, I would like to add my 2 cents. The author did a great job explaining resolution and how it pertains to the optical system of the eye. As previously mentioned, “20/20” vision is not “average”. It was taken as “average” by Snellen himself, the father of the standard eye chart, when he asked a group of people who self-reported their vision to be normal ~120 years ago! Although most docs don’t bother checking vision further down the eye chart after a patient reads the 20/20 line, most people can easily do this, assuming their refractive error has been corrected. This includes adults; I’m in my late 20’s and read 20/10 in one eye and 20/13 in the other. The crux of the author’s argument is that 20/20 is average, and it simply isn’t. The best documented vision I’ve seen reported is 20/8, which is close to the theoretical optical and retinal limit of visual acuity.

If you can’t resolve two adjacent pixels, there’s no reason to anti-alias. Anti-aliasing reduces jagged edges, but results in a much softer edge. You can really see a difference. Draw a vertical black line on a fairly low-res LCD display, and compare it with an anti-aliased diagonal line.

You may know your business around Hubble, but visual acuity is something else.
Jobs’s claim was that you were not going to be able to see the individual pixels, and that is wrong. Look up Vernier acuity.

From 12 inches away, I can see the black spacing between 2 illuminated pixels on the 3GS iPhone, and following your logic that would be a lot smaller than 0.6 arcminutes (and my vision is a little bit worse than 20/20).

Take a white/black alternating grid viewed from far away. As it gets closer, at some point it becomes exactly 1 white / 1 black every 2 pixels.

Before it reaches that size, of course it looks gray. Then it reaches 1×1 to the pixel size, and that is when the timer starts.

Then, as it gets even bigger, one white square takes up more than 1 pixel, then 4 pixels. Can you tell when it hits 1 square per pixel? If you have perfect vision, it sounds like you could tell immediately. If you don’t have perfect vision, it will probably be zoomed in to more than 1 pixel per square before the gray starts looking like a checkerboard pattern.

Regarding the yellow pixels on the Quattro, RGB phosphors are imperfect and their saturation is not maximal, limiting the reproducible color gamut. Don’t forget that typical XYZ (perceptually-based) color space conversion to the standard RGB can result in _negative_ numbers! So it is quite possible to extend a TV’s gamut by introducing a non-primary color, since the RGB phosphors used in the TV are offset from true primaries.

Of course, the encoded chroma information the TV receives only contains the limited TV gamut, but that could be upconverted to fill out the expanded gamut analogously to superresolution approaches that use statistical methods, deconvolution, or machine learning heuristics to increase (pixel) resolution (there are papers in the computer graphics community that use information from multiple frames in a video, for example, to achieve visually excellent resolution increases, but even single-frame approaches can do very well for on the order of 2x increases).

Having said that, I don’t know details of the specific implementation in the Quattro, but I was just addressing the issue that there can in fact be a practical benefit to adding an additional pixel color.

All of these direct scientific and mathematical comparisons are flawed. We do not actually “see” what the retina “sees.” Visual cognition is a fascinating thing: our brain smooths jagged lines, detects borders and edges, uses motion cues, etc. Even if the human eye is physically capable of perceiving pixels in print, there is a range where our brain will kick in and smooth the jagged edges. “Resolution” of the retina is not the entire equation. Font and graphics rendering engines, anti-aliasing (on the device side), and visual cognitive tricks of the brain are also a factor.

Apple-bashing, plain & simple. You can tell by the language he chooses. Very fashionable now that Apple is the largest market cap in the world, and everything they do is golden. What a difference 10 years makes! Yeah, expensive, but the Apple logo attracts hot women…who can put a price on that?
KP

Sorry, Phil … it’s not that simple. The eye is very sensitive to SPATIAL DERIVATIVES. Discontinuity in the derivative will cause obvious visual artifacts with very small pixels, especially when only 256 gray levels are displayed. This is called Mach banding, and it is very disturbing.

So … regardless of pixel size, the iPad has a display that is FAR from perfect. You need something like 1000 dpi and 14 bits per color to get that.

What I want to know about the Quattro is, is there any evidence that someone comparing an RYGB monitor to an RGB one can tell the difference in a blind comparison? I’d buy into it if I saw evidence that people could pick correctly more than 50% of the time.

If you look at or listen to the quote, we see that Jobs says “around 10 to 12 inches” and “around 300 pixels.” Others then infer a definitive statement that the display definitely exceeds all human perception. When I heard that statement, with two “arounds” in it, I interpreted it as “being damn close to, if not exceeding, the limit.” However, when making a statement this vague, you can guarantee the spec nerds will go nuts, once again missing the point that the exact specs are not what’s important, but the effect on the user experience.

The funny thing is that these “expert” nerds are actually less perceptive than the casual non-tech user, when the nerds never see the forest with their nose against the tree.

Haha, “Wikipedia said so.”
I like that excuse.
of course, 10 years ago, maybe they didn’t know as well as we do now?
“I don’t believe you because 10 years ago McGraw Hill said something else”
that’ll get you far in life.

ASFalcon13 Says:
If all we’re considering is text, then who cares what resolution the screen’s at? Do you really care if pixels in the text on the screen you’re reading are indistinguishable? I argue that you don’t need a high-res screen to read text; you can get perfectly readable text at some pretty crappy resolutions. For resolution to really be that big a product driver, you need to talk about images and video.

Clearly, you have no idea what you are talking about and have never seen a high-resolution cellphone display. My friend’s Motorola Droid (800×480, 3″ display) has a far sharper screen than my iPhone 3GS (480×320, 3.5″), and it makes a WORLD OF DIFFERENCE READING SMALL TEXT. It is like night and day, particularly with modern smartphone web browsers rendering “fullscreen” web pages, forcing people to zoom in farther and farther to read the text.

You obviously need to actually SEE a high-DPI device before you comment any further…

The author should have gone another step further: software used to display the images purposely “anti-aliases” them (for the layman, it blurs the image edges), so even a superhuman eye gets a blur, a normalizing factor, removing even a superhuman eye’s ability to discern individual pixels at 10″. (Remember, the lines between the pixels are even tinier than the pixels themselves, yet the PhD used the pixel dimension. Also, your eyes naturally move constantly even if you try to hold them still, further blurring the lines between pixels; even the shape of your eye is slightly changing, even in superhuman eyes.)

So not only was the PhD all wet, he had no clue. NO ONE CAN DISCERN individual pixels on an iPhone 4 screen, even from 10″ away, and you can simply prove it to yourself by looking at the screen using glasses that can nearly double your natural abilities.

It is obvious you are “pro” Mac. I hope you did not do this article for free.
“…And if you hold the phone a few inches farther away it’ll look better.” – Right, you keep it on the moon! From there you won’t see the BS Mac throws in the face of “average”-witted people.
Mac is whack! Long live the freedom not to choose Mac products!

When you look at an 1slam1c website with your iPhone, or a tenth-amendment site for that matter, and the NSA spooks at AT&T note it and send you to Bagram without judicial recourse (or resolution), it won’t matter what your eyesight is.

This article could have been readable by a much larger audience if it had used metric units. How many inches to a mile? I don’t care! I know there are 1000 millimeters in a meter and 1000 meters in a kilometer. Don’t forget that the use of non-metric units may hurt. It has even caused the loss of a spacecraft (http://en.wikipedia.org/wiki/Mars_Climate_Orbiter#The_metric.2Fimperial_mix-up).

I am working on a late 2008 15″ MacBookPro
It has a resolution of 1440 x 900 and approximate dimensions of 13″ x 8″.
So my horizontal resolution is 1440 / 13 ≈ 110 ppi,
and my vertical resolution is 900 / 8 ≈ 112 ppi (so the pixels are square after all; my rough dimensions just aren’t exact).

If I work approximately 24″ away, the smallest detail a perfect eye can resolve is 24 / 5730 ≈ 0.0042 inches,
and the actual pixel size on my MBP is 1 / 110 ≈ 0.009 inches.
So my MBP’s pixels are more than double the size that a person with perfect vision could resolve.

Here’s to LCDs with a resolution of 225 ppi, or 2925 × 1800.

p.s. I’m an English teacher… so I suspect there are flaws in my calculations.
But I am aware that I would not be able to see text with present specs and that my dimensions are a bit off for a 16:9 or 16:10 screen ratio
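The comparison above checks out, modulo rounding. Here it is as a sketch, using the commenter’s own numbers plus the article’s 0.6-arcminute factor (smallest resolvable size ≈ distance / 5730):

```python
panel_px_wide = 1440   # MacBook Pro horizontal resolution
panel_in_wide = 13.0   # approximate panel width from the comment
viewing_in = 24.0      # working distance

ppi = panel_px_wide / panel_in_wide   # ~110.8 pixels per inch
pixel_in = 1 / ppi                    # ~0.0090 in per pixel
resolvable_in = viewing_in / 5730     # ~0.0042 in: 0.6 arcmin at 24 inches

# The pixels are more than twice the size a "perfect" eye can resolve,
# so they are comfortably visible at this distance.
print(round(pixel_in / resolvable_in, 1))  # ~2.2
```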

To bring this back to astronomy, it explains why I see the International Space Station as a line when I watch it go overhead. My wife never believes me when I say that. The reason: her eyesight is terrible, while my long-distance vision is very good.

I just did some quick math. The ISS is 240 ft long by 356 ft wide (if Wikipedia is to be believed), and it’s 189 miles up. So, 189 miles × 5280 ft/mi = 997,920 ft. Take the “reasonable” factor of 3438 and multiply it by the width, and I get 1,223,928, comfortably more than the distance. Bingo: proof that I can resolve the ISS from the ground, and that’s assuming “average” eyesight.
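The same check in angular terms, using this commenter’s figures (356 ft width, 189-mile altitude, and an overhead pass so the range is roughly the altitude):

```python
width_ft = 356           # ISS width from the comment
range_ft = 189 * 5280    # 189 miles in feet = 997,920 ft

# Small-angle approximation: the angle in radians times 3438 gives arcminutes.
arcmin = (width_ft / range_ft) * 3438
print(round(arcmin, 2))  # ~1.23 arcmin, just above the 1-arcmin typical-eye limit
```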

What about the visual acuity of elite-level athletes such as NFL quarterbacks and MLB hitters? Besides having strong bodies, I’m guessing their visual systems must be well above average. Have any studies been made on them as a group?

i’m an ophthalmologist (technically still a resident) and i figured i’d weigh in…

i think you have to take into account a lot more variables to say scientifically what is what.

the eye’s resolution is of course different between people.

i think the idea that the “perfect vision” eye and the “normal eye” are different is true but i would clarify that the “perfect vision” eye doesn’t exist. we correct the length of the eye (too long = myopia) and abnormal curvature of the cornea (astigmatism) with glasses or contacts and if all else is healthy then the eye should be in the ballpark of “the normal eye” listed above. certainly without optical correction, the resolution takes a nose dive. still, we are only correcting a few types of optical aberration and many more affect people to degrade the optical clarity of the image going to the retina (see Wavefront analysis and some advanced LASIK stuff).

these aberrations would also vary regardless if people were wearing glasses and cause a difference between people.

and all this is still ignoring the fact that the retina’s photoreceptors are also spaced differently in different people. and then these photoreceptors are summated etc…

so ok, you take all that into account and come up with an average arcmin resolution for the eye for this theoretical calculation for an average 20/20 eye with glasses or contacts….

when you look at an image closer to your eye, your lens accommodates (or you wearing reading glasses) which adds around 6% magnification to the image (2% per diopter i think, +3 diopters at 33cm away) and that would need to be considered…

and you didn’t consider Luminance!

the contrast and luminance of an image influence the resolution of the eye too…

i’m not about to do a lick of math on this partly because i’m lazy and partly because my iphone 2G is going to get replaced regardless because the resolution is high enough for me to check my email and play pacman.

This makes a huge difference. Assuming (big assumption, yes) that the 330 dpi claim is for the “macro” pixels (r, g, and b subpixels together), the display actually has 330 dpi in one direction and 990 dpi in the other.

Since resolution is most important for text, and since text benefits the most from sub-pixel rendering, the claim of better-than-eye resolution seems to hold up, even when held closer than a foot to the eye.

(and I tend to use my eye-phone about 18 inches from my face, so even more so)

If you look at or listen to the quote, we will see that Jobs says “around” 10- 12″ and “around 300 pixels”… However, when making a statement this vague, you can guarantee the spec nerds will go nuts, once again not getting the point that the exact specs are not what is important, but the effect on the user experience.

The practical effect is that text will be sharp and readable at normal handholding distances. Humans clearly have a WIDE range of visual acuity – it’s enough to say “it’s better, here’s why”.

Another thing that’s bugging me: comparisons with laserprinters. Many others have said it, but 300dpi black text on a page does look jaggy – there’s no antialiasing. Can you imagine if the older iPhone screen was a 1-bit laserprint? It would look awful. Antialiasing makes it readable on screens.

Lots of great comments here about the math, cognition, and visual acuity – I learned a lot today.

As I recall, bits used for grayscale (or color) are approximately as effective as bits used for pixel density. That’s why some of the early b/w grayscale displays looked pretty good even at 72dpi.

I doubt that it’s fair to count RGB as 8+8+8 bits per pixel, though, but it is certainly true that the intensity of a pixel is important for determining its effective resolution.

The eye apparently considers the intensity of adjacent pixels to determine the effective location of an edge. This is what makes antialiasing work.

Many printing mechanisms lack grayscale, and have to fake it by depositing more or fewer dots of ink at each point of the “screen”. That’s why good digital printing requires very high resolution, like at least 2560 dpi. If one wants 256 gray levels, you need to ink 0-255 dots for one pixel. That’s a square 16 on a side, which means 2560 corresponds to 160 pixels per inch. Not too fine. Color does this separately for each of CMY and K, and those have to be staggered so the ink (which is generally opaque) dots don’t often land on top of one another.
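The halftone arithmetic in that last paragraph can be sketched directly. A minimal example (the function name is mine, not any standard API), assuming a square halftone cell of on/off dots, where an n×n cell reproduces n²+1 tone levels:

```python
import math

def effective_lpi(printer_dpi, tone_levels):
    # An n x n cell of binary dots reproduces n*n + 1 tones (0..n*n dots inked),
    # so the cell side must be at least ceil(sqrt(levels - 1)).
    cell_side = math.ceil(math.sqrt(tone_levels - 1))
    return printer_dpi / cell_side

# 256 gray levels at 2560 dpi needs a 16x16 cell: 2560 / 16 = 160 lines per inch
print(effective_lpi(2560, 256))  # 160.0
```

This is the trade-off the comment describes: for a fixed printer dpi, more tone levels means a bigger cell and therefore a coarser effective screen.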

Thank you for the article, Phil, but like some other visitors to your site, i cannot believe that, as a scientist, the units you used in your article were not translated into metric, for the benefits of —i don’t know what—the rest of the world, maybe?
The inch-feet-small toe model does not mean a thing to many of your readers, and it’s too bad these readers were left out. i hope that the next time you use units, a translation will be provided (it’s no fun to read you with a calculator in hand!).

“And even a 1200 dpi printout doesn’t have print as good as what is in a hardback book. Can someone explain? Is there more than just resolution that makes a difference?”

Yes. I’ll ignore the resolution numbers because there’s all kinds of trickier aspects to it (some of it is marketing b.s. and other stuff is just really technical).

The key difference there is the *mechanism* by which ink is getting on the paper. Your inkjet or laser printer is converting image data into machine dots and applying them to the page. That hardcover book was printed using offset lithography: a metal plate is coated with ink and pressed to paper. That plate was created by developing its photo-sensitive emulsion coating using film.

The combination of ink pressing onto the page (with various level of dot gain) and the natural smoothness of photo-developed imagery gives you great detail on solids. So for things like solid black text you end up with much smoother, crisper edges.

The quality of the paper also impacts the final results (for all printing types). But that’s enough babbling from me. 😉

There’s all sorts of variables in printing. Many magazines use a rough rule of 300 dpi resolution for photos because they screen them at 150 lpi (the size of the halftone dots used in color separations). I think higher-quality publications like National Geographic go higher.

For text or black & white lineart the dpi used would be much higher. Newspapers might go 1,200 dpi and higher quality publications more than double that.

This is ridiculous. Do you think we live in some kind of world where people wear corrective lenses or have LASERS fired into their eyes and have PERFECT vision? What an absurd claim. Obviously most people have 20/20 vision. And people with worse vision OBVIOUSLY wouldn’t hold the phone even CLOSER to their face!

To do Soneira justice, he closed his observation with “It is reasonably close to being a perfect display, but Steve pushed it a little too far.” It was the WIRED editors who exaggerated it with the headline “iPhone 4’s ‘Retina’ Display Claims Are False Marketing”.

Sometimes, especially with Apple coverage (where I do have all the public information available at hand), it’s scary how the press “interprets” information. This is not what journalism should do.

Sorry Bad Astronomer but you can put those science books away in that drawer with your logic. This is a religious dispute, and there is nothing you can do to persuade the worshipers at the shrine of An Droid that Jobs is not the son of Beelzebub, and as such completely incapable of telling the truth.

“Don’t forget that the iPhone 4 will have sub-pixel anti-aliasing AS WELL as a 336 DPI screen. I’m confident that it’s going to be well past the resolution of even good eyesight.”

No, the iPhone doesn’t use sub-pixel anti-aliasing. The problem is that the iPhone can be reoriented between portrait and landscape, and sub-pixel anti-aliasing only works well when increasing the horizontal resolution of text. The iPhone does however use regular anti-aliasing, which helps.

Printer resolution is different. The listed figure is the size of an individual dot of ink/toner. You can’t change the brightness of the ink, so you instead change the size or density of the dots. A good analogy would be to think of advertised printer resolutions as referring to sub-pixels.

P.S. I too know a bit about resolution, having worked for a firm that specialised in colour calibration for proofing between screen and print.

@Cory,
You are quite correct that most YouTube videos and inferior quality media will only look worse on this display — every pixel will be revealed because it will be represented by a block of at least 4 iPhone pixels.

However the display comes into its own for movies (real ones), games, slideshows of our own images and text rendering, among other things. There is no exaggeration there, and it will be plainly evident how much more compelling the quality of this screen will make these things.

Do you watch YouTube videos for their production quality or immersive, cinematic experience? Neither are consumers going to buy the iPhone for YouTube.

okay you metric minded imperial measure haters, how about he publishes the article in Mandarin? that way about 1.5 billion more people can enjoy it, especially the lucky folks who actually get to make iPhones and what not.

An excellent counter-argument to Soneira’s piece. If he indeed claims to be an industry expert, it is slightly curious that he would assume everyone had perfect vision. I wish I never had to use my glasses!

People should be very careful making assumptions about the quality of various printouts at claimed resolutions. Digital imaging is my business, and I can tell you that if you’re picking apart printed artifacts from a 1200 dpi printer, it’s probably printing at an optimized 600 dpi. How? You read the marketing on the box, and it said 1200 dpi! But its firmware defaults to 600 dpi so that it can raster and print at a higher rate. Unless you’ve asked for 1200, you’re not getting it. This is before we talk about substrate, hard inks versus wet inks, etc.

As well, very few of the people on this board have ever held in their hands true image-setter film output at 1200dpi and 2400dpi (never 2700…). It’s beautiful. It’s unique and nearly gone from the face of the planet. Nearly all printed matter is printed straight to aluminum plate at this point. Long live the imagesetter.

300 dpi images are never printed at an actual 300 dpi; there is always a raster image processor between that image and the printed page that will interpret that resolution and make use of it based on the various settings requested at print time. A trained user applying extensive quality control can indeed attain a true 300 dpi output; however, that person took an early-retirement buyout 5 years ago and is either on disability from being exposed to various printing chemicals for 30 years or is working at your local gas station. Or both. Regardless, if you have an image that claims to be 300 dpi and you print it on your 600 dpi inkjet, you’re likely getting a 144 dpi image represented by the much smaller 600 dpi dots via some sort of, often proprietary, interpretive composition system. Be careful making claims here. You might end up under a headline you don’t like on Wired.

Nowhere in here does anyone discuss your eyes’ adaptive characteristics. I have 20/20 vision and challenge my optometrist to beat me every time I’m in; I don’t wear glasses yet, according to my Dr. Now, I bet there are professors on here doing a lot of math who might have better vision than even me; however, math is what they do. I do pixels. EVERY DAY. That means I know exactly what to look for, quickly, and how to get to the bottom of it. Late at night I actually become agitated by the rough edges of type on my monitor. I’d argue that after a long day my eyes achieve a hyper-sensitive level of detail. You can do all the math that you want, but the eye is an organic device prone to error and hyper-realism at any given moment. Not to mention emotional reaction. “I hate Steve Jobs! Now where’s all these pixels? THERE! There they are! He’s WRONG!”

All of this aside, I’ve been examining pixels and halftones for well on 20 years, and I can tell you that resolution is hogwash. A myth to be flogged by the self-righteous. A good image is a good image is a good image is a good image, at ANY resolution.

Apple excels at involving high-quality image processing as part of its operating system. They know that a solid imaging engine provides a much more reliable, consistent, and pleasing user experience. Whether you like Apple or not, they are better than everyone else at this. They are. One of the few things I can say definitely is that Apple is better at imaging. Hands down.

So, we have an experience-obsessed CEO talking about his precious screen at a press event, and he made an exaggeration. I’m the son of a scientist; my father never lied, and everything he said was accurate and provable. Barking spiders are invisible, yet stinky, insects.

Sure Steve made broad claims, but I doubt many of you have made better claims from bar stools, living room couches, lecterns… or god-forbid… website comments…

@PT: A single white pixel on a black background is not really a good test, as the light will bleed across several light-sensing cells in the retina. The opposite test should be better: a single black pixel on a white image. If the black pixel can be identified, the bleed from the neighboring pixels is low enough to distinguish it uniquely…

I’m surprised no one really explained very well the difference between printer resolution and screen resolution. A 300 dpi printer is not remotely the same as a 300 ppi screen.

To simplify, let’s take a black-and-white printer with only one ink — black.

Because the ink is black, this printer can print only black dots; it can’t print gray. It has to simulate gray by spacing out tiny, tiny black dots.

A screen, however, can display levels of gray — the higher the bit depth, the more shades of gray.

So a printer has to “build” gray pixels by using a grid of even tinier dots (usually measured in lines per inch). It’s this grid, these bricks that make up each “pixel” (the discrete box of gray that the printer is trying to represent), that is the listed resolution of the printer.

The fewer dots in a given pixel area (line), the lighter the gray; the more, the darker.

We can adjust the size of the pixel (technically, printers don’t have pixels, but we can imagine they do) to make them smaller, so we get more pixels per inch, but since the dots that build the pixels remain the same size, we lose the ability to create shades of gray. (Fewer dot combinations can fit into a reduced area.)

So, again, the resolution listed by the printer is NOT the “pixel” resolution; it’s the maximum number of dots — the building blocks that make pixels.

Most professional magazines use presses that run at about 2700 dots per inch and use about 150-175 lines per inch. High-end jobs might use 225 lpi and possibly somewhat higher. THAT is your equivalent resolution (to the degree that you can compare the two).

Obviously, there are many other factors involved in whether the image/text looks nice and is readable — and paper/ink and screen are very different media.

– Impressive vs Important –
Or, as never more aptly put than by Tripper (aka Bill Murray) in “Meatballs”: “It just doesn’t matter… It just doesn’t matter!”

I have stopped spending hours calibrating my camera, monitor and printers, because the vast majority of my customers will hang the wedding photo under a deep yellow incandescent light, riddled with reflections off of the glass in the dollar store frame. At the start of the season, I take a few shots, send them off to the local printer, get my monitor to be pretty close to the print, and then I move on and never think about it until next year.

While this scientific roundtable is far more illuminating and impressive than the 99.99% of the other websites’ excuse for a blog, none of it is important if you don’t factor in the iPhone user’s brain. With a pair of 20/20 eyes, they still manage to miss the 6 bright red traffic lights scattered all over the intersection and travel full speed into the rear end of a brightly painted truck that was visible 4 blocks away. The eyes may have a resolution of 1 arc second, but their brains, apparently, couldn’t resolve a truck in the middle of the road directly in front of them.

Good article, but here is what I got out of Mr. Jobs’s speech: “the iPhone 4 has a great display; we’re getting to the point where we won’t be able to get them looking significantly better.”

The details are interesting, but here is all the audience of Mr. Jobs’s presentation wants to know:

– is the iPhone 4 display noticeably better than the iPhone 3GS display?
– is it the best display out there today?
– will I say “wow, nice display” when I see it?
– is it getting close enough to what the eye appreciates that I won’t likely want a different smartphone in the near future based on quality of display?

If the statements above are true, then any technical arguments are misplaced; this was a marketing show by a CEO, not the presentation of a PhD thesis by a doctoral student.

There are three points that strike me in this thread:
1. The article says “Jobs is actually correct,” but many, many commenters dispute Jobs’s accuracy. Taken in its context, his claim is quite straightforward. (See below.)

2. Some commenters use the false controversy to assert Android superiority. I couldn’t find a DiscoverMag article to this point, but to claim that the recent Android devices have “800 X 480” screens is to ignore the fact that any attempt to achieve that resolution results in horrible, horrible color problems. (Again, see below.)

3. Defining the iPhone screen in terms of spatial resolution is like claiming that voting is all about putting lines on a piece of paper. The nit-pickers, and a couple of people who maybe don’t use their phones much, have utterly missed what a wonderful, wonderful improvement this will be for many of us.

OK? Here’s my supporting info.

1. Jobs is correct in the claim — even more than the article says.
I don’t have a link to the presentation handy, but when I watched, I did see a very pixelated, lower-case “a” compared to a visually-perfect one on-screen; I thought, “why use a misleading graphic?”

But what Jobs then said, not cited by Wired although in the immediate context, was, “at that point [300 ppi], things start to look like continuous curves, and text looks like it might on a fine-printed book.”

Again: Jobs’s actual claim was that at ppi counts below the iPhone’s resolution, the eye starts smushing the dots enough that the square pixels start to look not like bumpy squares, but like continuous curves.

Jobs is perhaps guilty of imprecise language in his presentation, but was, by the evidence above, utterly factually correct.
The Wired article is guilty of taking his speech out of context and claiming that Jobs said something that he did not. Ditto, the DisplayMate source focused on an incomplete claim before saying it was a stretch.

2. Android is much, much worse.
The Google I/O conference, a few days before Apple’s, featured repeated comparisons to Apple. At least in their minds, there’s a war going on, and we see in comments here, how Truth has become a casualty.

Unlike the Android fans and Google execs, Apple made very little mention of its competition, beyond a bit about how Apple was later to multitasking but preferred its own style of it. I did NOT hear Apple bash Google about the screen.

But many Nexus users are rather unhappy about how fuzzy a supposedly 800 X 480 screen is. Sure enough, when the DisplayMate president looked at the 800 X 480 screens that are going into many Android devices, he found that trying to actually GET that resolution produces weird color problems (moiré), because no pixel has all three colors. A hacker was actually able to use this bug to create black-and-white pictures that look like washed-out color versions. The Nexus AMOLED screen, with several color artifacts, was labeled “unacceptable,” IIRC. You cannot get both 800 X 480 resolution and the promised (rather weak) color quality at the same time. As of the reviews I read, Google was promising a fix for some obvious rush-to-market faults in the display; I haven’t heard whether they’ve delivered.

Summation: despite lots of comments above about how Apple lies, Apple put forward a reasonable claim that some have managed to twist into an overstatement, while Google gets a free pass on a claim that’s factually incorrect.

3. The point of it all.
I am amazed that a bunch of nit-pickers pretend to pass professional judgement on a small part of what Jobs said, to the extent of (in some cases, intentionally) distorting the meaning of the speech, and of utterly losing the point: the display is dramatically sharper than anything else available and, as a result of that extra quality, much more useful.

I have very good (corrected) vision, but both near-sightedness and presbyopia. 9″ away from my sans-glasses eyes is my sweet spot; it’s how I read in bed. When I use my Kindle program or other book-like features for extensive reading on this phone, I will absolutely love this achievement. Those of us who handle a lot of email will appreciate being able to hold the thing at whatever comfortable distance and see lots of excellent, crisp text on-screen, even if it’s only as sharp as my wonderful laptop at 18″. It’ll make the extremely busy front pages of WSJ.com, FT.com and NYT.com much more accessible. Even the condensed photos will be crisper at their non-zoomed view.

Sharp, correctly-lit text is a Big Deal in preventing eye fatigue; clear letterforms speed reading and keep the eye from having to scan back as often, and they are up there with appropriate line length and other typographic niceties in making reading a pleasure rather than something you do in spurts before you need a break.

= = = = =

I’m actually a bit amazed at how so many people have focused [wink] on so many things — especially, false claims of hucksterism; the claimed-superior but actually-inferior attributes of a competitor — when this screen is a very important part of a big productivity and pleasurability enhancement. Why are people so focused on supposed negatives, to the extent that they can’t see the many important positives?

Disclaimer: my interest in Apple is only as a consumer of some of their products; I also use Google and many other competitors’ products, too.

“Interestingly, I can’t even resolve the pixels on the 3GS screen from 12″ away, and that has half the ppi of the iPhone 4. Can anyone else?” —Ray Merkler

I can easily resolve the pixels on the screen of my original iPhone, but your situation is a perfect example of what the author of this article is talking about: The average person doesn’t have perfect vision. Far from it. Just look around you and notice how many people wear glasses!

The article fails to mention another reason why it’s hard to resolve the individual pixels in a computer display: Unless your eyes are trained to look for the jagged edges, they have to be not just visible but fairly obvious before the average person notices them.

Nice post, thanks for that. But, please, could you switch to SI units next time? It would make reading your post substantially easier (for me and for the rest of the people from Europe :). Feet, inches, arcminutes: they drive me nuts.

Even if, with good eyesight, you could resolve the image as separate pixels as stated in the article, I think something may be missing.

If two pixels next to each other were on and all others off, could you tell there were two? Since the distance between the pixels is much less than the size of a pixel itself, would it not appear as one thing, not two?

Also, the calculation 1/326 = 0.0031 inches assumes that there is no space between the pixels.
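For what it’s worth, the angle in question is easy to check. Here’s a small sketch under the same no-gap assumption the comment questions (the function name is mine, for illustration):

```python
import math

# Angle subtended by one pixel, in arcminutes, assuming the pixels tile the
# screen with no gaps between them (the assumption behind 1/326 = 0.0031 in).
def pixel_angle_arcmin(ppi, distance_inches):
    pixel_pitch = 1.0 / ppi  # inches per pixel
    return math.degrees(math.atan(pixel_pitch / distance_inches)) * 60

print(round(pixel_angle_arcmin(326, 12), 2))  # ~0.88 arcmin at 12 inches
```

Any gap between pixels only makes the lit portion of each pixel subtend a smaller angle than this.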

>> Cory says: “there is no difference a consumer could easily discern.” <<

Consumers who’ve seen it have already discerned a difference. Anyone on this forum actually *seen* one of these yet? Or is this all typical internet blather?

Having read the several articles, I believe the main issue here is Jobs’s use of the term “limit of the human retina,” which does in fact imply perfect vision. At best, it’s an unfortunate choice of words, chosen no doubt to highlight the fact that this is indeed the best display on the market (do other manufacturers even care?). At worst, it provides an opening wide enough for Jobs-bashers to drive a truck through.

He might have been better off using phrases which referred to average viewers.

So, as Andy (25) and others have pointed out, it is entirely destroyed by the reality. Nobody will be able to tell two pixels apart on the iPhone 4 because no two adjacent pixels will ever be 100% black and white except in a specific display checking app.

@DataJack (39): you realize that most people live OUTSIDE the US, do you?

1) The resolution along the display’s diagonal is smaller by a factor of sqrt(2) (assuming square pixels), so it should be included as the worst case for point separation.

2) While the angular resolution is the most important technical property when it comes to display resolution there’s another property that relates to how images are created on a screen and how pleasant they look to us. I’m speaking of aliasing and nearly bandlimited image reconstruction.
Two displays with relatively high and identical resolution but different pixel shapes can create very different subjective impressions of being “pixelated”. If the blurriness of the pixels matches the pixel spacing nicely you can’t identify single pixels, no matter how much you zoom in. The limited resolution will only result in a lack of sharpness. This is generally a lot more pleasing than pixels of the same size but with sharp boundaries. That’s also the reason image processing puts so much effort into finding good scaling methods for raster images.
Liquid crystal displays greatly limit the smoothness of single pixels due to their microscopic structure, so you typically see the pixels (and black grid lines) when you look closely (and software antialiasing can’t change that). Organic LEDs could possibly be better in this aspect, because the pixels can have fuzzier edges.
This all doesn’t matter if you’re well beyond the resolvability limit however. But if you want to know how pleasant a display looks, you need to consider it in general.

It seems I actually have better than ‘perfect’ eyesight. Yeah, most people don’t, but variance between people sometimes does this. Sorry, just had to get that out. I’m tired of this ‘perfect’ vision thing.

And, there’s that effect of things not looking perfect just yet right at the resolution limit. You can’t really tell pixels apart there, but it isn’t the sharp continuous image the real world would give. Yeah, there are laws about resolution, but those assume idealized filters that just don’t exist in the real world, or are practical in rendering for that matter. That ‘ideal’ sinc filter also isn’t ideal. If you try it, the signal clips a lot (you can force brightness down to fix positive peaks, if that’s any better, but it actually causes negative values, which no display can emit), and the ringing is insane. What it comes down to is that in practice, details beyond the resolution limit still leak out as a form of aliasing.

@184 William Beaudot: Last time I looked up the number of photoreceptors on the retina it actually varied by an order of magnitude or so between people. Another factor besides the eye’s optics. And speaking of those optics, I think they play a role in the distance you’ll comfortably hold the phone at. Accommodative ability in particular. I can hold things far away or right up to my face, it doesn’t matter. If I hold it close it’s not because my eyes have no angular resolution. You could think of not ever holding it close as inability to focus nearby (perhaps doing it instinctively). Just saying in case some people think 25 cm/10 inches is awkward (although it would be awkward if you held it that close all the time).

But stop this already. I won’t complain about a 326 ppi display (and that’s 960×640 full RGB pixels, not 800×480 subpixels), whatever they call it. Allow me to complain about how it will only get here in July (not to mention the wait at some other places), and especially the carriers. It’s not just a high-res display, it’s a phone that’s meant to have connectivity. The exclusive carrier’s network over here has collapsed, and they even admitted it. There’s still rumors there will be others this time, which I hope are true. Oh, and most importantly, will it blend?

A pixel has a dimension; it’s not a point. Therefore the whole argument is incomplete, as you need to know the dimensions of the gaps between the pixels. If the gaps that separate the pixels are 0.0020 inches or less, then they cannot be resolved and the display will look continuous. Jobs was right.

I think you should take a more evidence-based, empirical approach:
a) no iFan complained about the display, hence quadrupling the pixel count was ludicrous;
b) there’s far more to take into account than sharpness.

Alan is right that the need for anti-aliasing would be eliminated if this display really is of sufficient resolution that human eyes at a normal distance cannot distinguish “jaggies.” But few people realize just what it would mean if we could dispense with anti-aliasing (and color-positioned sub-pixel rendering, e.g. ClearType™) once and for all.

Anti-aliasing blurs the edges. Sub-pixel anti-aliasing also causes color fringing of text. We use them because the loss of legibility that results from this is less than the loss of legibility that would result from the jaggies of unprocessed text at low display resolutions such as 72–96ppi.

For the same reason, we use anti-aliasing in games, both 2D and 3D. 2D anti-aliasing is used in Flash and other vector animations, CAD and 3D modeling displays, etc. to smooth out the displays at the cost of added blurriness.

Anti-aliasing is not computationally free. It requires memory and CPU horsepower to accomplish. Most modern video cards have special circuitry specifically dedicated to anti-aliasing, both for 2D and 3D graphics. This extra circuitry uses additional electricity and generates additional heat.

Without hardware assist or special trickery algorithms, anti-aliasing requires rasterizing the text, 2D vector object, 3D scene, or whatever to a raster much larger than the final display size. For instance, if you want 16 total shades from pure background to pure foreground with 14 intermediate shades in-between to anti-alias with, you must rasterize to an internal memory buffer that is four times larger in both width and height, for a total of 16× larger in area (and thus RAM usage and potentially rasterizing duration [there are other ramifications at play here]), then “shrink” the larger bitmap down to its final size with interpolation of the shades of color, and finally “blit” the final bitmap onto the video RAM for actual display.

Anyone who’s played a 3D game and tweaked the video card settings for it knows that you have a choice between enabling anti-aliasing (at various levels, with optional anisotropic filtering in modern video cards) and leaving it disabled. If you enable it at, say, 2×, your game will run considerably slower — slightly slower than it would if you had anti-aliasing disabled but were using a display resolution twice as high and wide (e.g. if you were playing at 800×600, the game is actually rendered at 1600×1200, and then down-sampled to the final 800×600, so in this case it would play somewhat slower than it would without anti-aliasing if you were playing at 1600×1200! 4× anti-aliasing at 800×600 final display size must render to a buffer at 3200×2400, and 8× anti-aliasing would render to a buffer at 6400×4800 to display at 800×600!).
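The buffer arithmetic in that example can be sketched like this. Note this is a simplified model of brute-force supersampling; real GPUs mostly use multisampling and other tricks that are cheaper than rendering the full oversized buffer:

```python
# Naive supersampling cost model: rendering with NxN supersampling means an
# off-screen buffer N times larger in each axis, i.e. N*N times the pixels.
def supersample_buffer(width, height, factor):
    return width * factor, height * factor

def extra_pixels(width, height, factor):
    # Pixels rendered beyond what the final frame actually displays.
    return (factor ** 2 - 1) * width * height

print(supersample_buffer(800, 600, 2))  # (1600, 1200)
print(supersample_buffer(800, 600, 8))  # (6400, 4800)
print(extra_pixels(800, 600, 4))        # 7200000 extra pixels per frame
```

The quadratic growth in `factor` is why anti-aliasing levels above 4× or 8× are rarely offered.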

Even with hardware assist, this generally means much slower frame rates, so you have to make the choice between “jaggies” along the edges of objects (which partly ruins the psychology of the immersive effect of the game’s realistic 3D rendering — real life doesn’t have “jaggies” around the edges of things, that stay put horizontally and vertically while the object and/or “camera” moves) or slower frame rates (which mean getting nailed by opponents who have faster frame rates because they can respond faster than you can — not to mention that if the frame rates get too slow, you lose immersion from THAT because now you’re having the effect of a slide show instead of a movie!).

Imagine not having to make any of those trade-offs anymore. No more blurring and/or color-fringing text to make it more legible. Text is razor-sharp, just like on paper. No more slower frame rates in 2D and 3D animations just to make the edges smooth instead of jaggy. No more battery-wasting and heat-generating circuitry dedicated to reducing the memory and/or CPU burden of anti-aliasing. No more slower response to text and graphics display, screen scrolling, etc. because of the CPU time used for anti-aliasing. The CPU and video circuitry could even be clocked slower and still have a smooth user experience (since they wouldn’t be wasting time on anti-aliasing), further extending battery life. Less memory would be needed for the OS, since less would be needed for rendering displays (no need for an off-screen buffer to render objects at larger sizes to), leaving more for the applications. And so on, and so on, and so on.

This is not a mere gimmick. This is a really big deal. I understand that the iPhone 4 still uses anti-aliasing, but perhaps soon they’ll realize that they no longer have to (if the resolution really is high enough for this), which would greatly increase the quality of the overall user experience in many respects.


I loved this article. Very interesting. I never thought of resolution as an angle, and never realized the word resolution itself draws the line between what your eye can resolve and what it can’t. Like many people I was just thinking about pixels and pixels per inch, forgetting that most of the time (except for geeks maybe) you’re not looking at pixels but at real living things.
Thanks!

I’m unable to resolve the pixels from the iPhone 4 screen, even with my glasses. For me it’s just as if it was perfect.

Another important thing to add is that pixels are composed of three subpixel elements of different colors: red, green and blue. These are definitely smaller than the eye can resolve. So even if pixels themselves are barely resolved by people with perfect vision, their subpixel components aren’t.

Very nice calculations; I really enjoyed them. Thanks for this deep information (not just about the iPhone but in general).

I can surely say that even Jobs wouldn’t have had as firm a grasp of the figures as you have shown here, and he must have been upset to hear of Soneira’s dispute.

But sincerely, IMO, this should be judged on logic and technical accuracy rather than on averages. If Steve was stating technical specifications, he should have been exactly accurate. And since Soneira disputed Steve’s statement on technical grounds, I guess he is right about the figures too. That really is a fact people should be aware of.

Even you have tried to work out the figures more accurately. Why? Because, of course, a technical claim should actually be accurate. You may have supported Jobs based on emotion, because if I compare technically (not emotionally), you actually supported Soneira. So I see no room to support Steve’s statement based on averages. I could have accepted your support of Jobs only if he had been less specific with the figures, or had presented them as averages or approximations. But he didn’t. He’s always been more than confident, so he fails at it.

A person with a technical mindset would always speak in accurately defined figures, as Soneira did, and as you did.

Thanks again… for the article… I really have got what never had thought about..

An excellent combination of physics and common sense to debunk a marketing claim, very interesting stuff!

I’m about to get myself a new iPhone 4 (an upgrade from my expired-contract 3GS) and I’ve just come across this article (granted, a bit late), but it’s great to see that everything published in advertisements isn’t just taken as a given nowadays!

I notice that my angular size calculator is being referenced here.
Well, the correct address is http://www.1728.org/angsize.htm
Why was this address change necessary?
In mid-May the domain name 1728.com was stolen from me.


It’s complicated, very complicated, and it depends. It depends on whether what we are trying to resolve is a dot or a line (the eye sees a single line much better than a dot, and a pair of lines even better), and on whether it’s a white line on black (best) or a black line on white (not as good). Even more, it depends on the luminance of the object (how much light is illuminating it and how much it reflects), but this is less relevant here because the device is pushing out its own light, which is much, much brighter than the reflectance of paper. The 300 dpi figure comes from the assumption that from a viewing distance of 25 cm (10″), the eye can under almost all circumstances not resolve any better (it’s where the 0.030 mm circle of confusion is derived). I would think that 260 dpi on a backlit LCD gives better effective resolution than 300 dpi on paper because of the brightness, but again, at this level it becomes a Coke-vs-Pepsi subjective taste test. To accuse Jobs of lying at this level of nit-picking… well, that’s an accusation that need not be dignified with a response.

Interesting to note: my understanding of the eye’s resolution is that it is actually far higher than discussed in this article. ClarkVision (http://www.clarkvision.com/articles/eye-resolution.html) and others say the eye’s resolution is 0.3 arcminutes, not 0.6 or 1. My practical tests (including one described below) seem to agree with that. At 12″ you need a resolution of about 954 ppi (not 477) to have contrasting pixels unresolved.
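The 954-ppi figure follows directly from the geometry. Here’s a small sketch (the helper name is mine, for illustration):

```python
import math

# Pixel density needed so that one pixel subtends less than the given
# visual acuity (in arcminutes) at the given viewing distance.
def required_ppi(acuity_arcmin, distance_inches):
    pitch = distance_inches * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pitch  # pixels per inch

print(round(required_ppi(1.0, 12)))  # ~286 ppi for the common 1-arcmin figure
print(round(required_ppi(0.3, 12)))  # ~955 ppi for the 0.3-arcmin figure
```

Which acuity value you plug in is exactly what this whole thread is arguing about; the geometry itself is not in dispute.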

But the key word there is “contrasting”. Contrast is a key factor nearly everyone seems to be ignoring here.

Even for someone with PERFECT vision, the eye’s ability to distinguish between two adjacent pixels of high contrast (e.g. black and white) is much greater than its ability to distinguish between two adjacent pixels of low contrast (e.g. two shades of grey, only slightly different from each other). Since almost everything in computer software these days is anti-aliased, we really can’t see the difference between two adjacent dots most of the time, and therefore Steve’s claim is correct.

One way to test this: using any image editing program, create a graphic made up of alternating black and white lines or dots, each one pixel wide. View this on your 326 ppi iPhone 4/4S. At 12 inches from your eyes, can you see the individual dots? I can. Retina display my foot.

Now move your eyes slowly away from the iPhone. At some point, the alternating black and white dots/lines will merge into one single grey image and you won’t be able to see the individual dots/lines any more. For me that was about 3 feet away. And if you examine the math correctly, it supports that.
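Assuming the simple one-pixel-per-line geometry, that merge distance can be estimated like this (a sketch, not a model of real vision; the acuity values are the ones debated in this thread, and the function name is mine):

```python
import math

# Distance at which a single pixel falls below a given visual acuity,
# so alternating one-pixel lines blur into uniform gray.
def merge_distance_inches(ppi, acuity_arcmin):
    pixel_pitch = 1.0 / ppi  # inches per pixel
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60.0))

print(round(merge_distance_inches(326, 1.0), 1))  # ~10.5 in at 1-arcmin acuity
print(round(merge_distance_inches(326, 0.3), 1))  # ~35.2 in at 0.3 arcmin
```

The 0.3-arcmin result lands at roughly three feet, which matches the observation in this comment better than the 1-arcmin figure does.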

Now try the same experiment with dots of much lower contrasting colours. Two similar shades of grey, or two similar shades of anything really. If the colours are close enough then I can’t see the difference even two inches from my face. I tried the same thing with the new Retina MacBook Pro and the new iPad. Similar results. This is essentially the effect anti-aliasing has on anything displayed on a computer or mobile device display.

So really, what part of anything we see on our displays these days is monochrome, i.e. not anti-aliased in any way? I think the answer is none. Even high-resolution photos are anti-aliased when scaled down to the lower resolution of the display, to compensate. So on that basis, these displays really are “retina” displays, and Steve’s claim, for nearly all practical purposes, is in fact true.