Let’s be honest here: the iPhone 4’s Retina Display is stunning, with a resolution (326 ppi) that absolutely demolishes the competition on the market today. The new Retina Display surpasses the density at which the human eye can resolve individual pixels, which is simply astounding (although debatable in its implementation — more below). But that’s still not keeping a select few from speaking out, calling it nothing more than “spec exaggeration.” Samsung had its say today, and it believes AMOLED is still the clear winner in the screen arena.

A Samsung representative told The Korea Herald that quadrupling a screen’s resolution increases perceived clarity by at most three to five percent. The representative also said this type of display is too power-hungry, drawing almost 30 percent more power than Samsung’s Super AMOLED technology. Luckily, the new iPhone’s battery should be able to take it.

An AMOLED display does not need a backlight, which makes it much more energy-efficient. The use of OLED pixels also lets a Super AMOLED display show colors more accurately while delivering higher contrast. Blacks are deeper, colors are more vibrant, and viewing angles are essentially unrestricted. Just don’t take one out in direct sunlight.

The main drawback of traditional AMOLED screens is that they suffer greatly in the sun, but that’s not the case with Samsung’s Super AMOLED screens, which will be available on the company’s Wave and upcoming Galaxy S smartphones. Building on the benefits of existing AMOLED screens, Samsung’s Super AMOLED displays are much less reflective in sunlight, brighter, thinner, and even more energy-efficient. But it will take sitting the two types of displays right next to each other to form a real opinion. We wouldn’t want to go solely on Samsung’s word – they’ve obviously been drinking some haterade.

Apple has said that the limit of the human retina is around 300 pixels per inch (ppi), and the new iPhone 4 Retina Display surpasses this limit. Raymond Soneira, founder, president, and CEO of DisplayMate Technologies, said that a display truly invisible to the human eye would need somewhere near 477 ppi, and that Apple’s claim only holds if you’re holding the device about 1.5 feet from your eyes. So yes, Apple has exaggerated the clarity of its display, but that doesn’t make it a bad display. Plus, this isn’t the first time Apple has told a slight fib about its achievements.
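The arithmetic behind both figures is straightforward, and both fall out of the same small-angle formula — they just assume different limits of visual acuity (about 1 arcminute per pixel for ordinary 20/20 vision, versus the roughly 0.6-arcminute acuity limit Soneira cites). A quick sketch, with those acuity values as the assumptions:

```python
import math

def resolvable_ppi(distance_in, acuity_arcmin):
    """Pixel density at which one pixel subtends the eye's smallest
    resolvable angle at the given viewing distance (inches)."""
    theta = math.radians(acuity_arcmin / 60.0)  # arcminutes -> radians
    pixel_size_in = distance_in * math.tan(theta)
    return 1.0 / pixel_size_in

# At 12 inches with 20/20 acuity (~1 arcminute per pixel):
print(round(resolvable_ppi(12, 1.0)))  # ~286 ppi -- roughly Apple's "300"
# With the ~0.6 arcminute acuity limit Soneira cites:
print(round(resolvable_ppi(12, 0.6)))  # ~477 ppi
```

So Apple’s figure is about right for average eyesight at a foot away, while Soneira’s 477 ppi assumes the sharpest vision the retina can manage — which is exactly why the two camps talk past each other.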

All in all, we’re going to have to give the iPhone 4’s Retina Display and the Samsung Galaxy S’ Super AMOLED screen a proper showdown once we can actually get our hands on them. With this kind of war brewing, the consumer wins, and options are always great.

Apple didn't exaggerate any claims. Listen to the keynote: they clearly say that within 10-12 inches, the human eye cannot see pixels at resolutions higher than 300 ppi. Again, within 10-12 inches.

kdarling

Blogs need to stop this "demolishes competition" junk. And learn how the eye sees, so you don't repeat Apple marketing hype.

Do you remember the 2007 Toshiba G900 WVGA phone? It had a PPI around 313, which also pushed it past the so-called retina boundary.

As for distance, yes it makes all the difference. Hold any Android phone a few inches further away than the iPhone 4, and they _also_ could be called "retina" displays.
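kdarling’s distance point can be made concrete by inverting the acuity formula: for any pixel density there is a viewing distance beyond which pixels become unresolvable. A rough sketch, assuming 1-arcminute (20/20) acuity; the 313 ppi figure for the G900 is from the comment above, and 326 ppi is the iPhone 4’s published density:

```python
import math

ARCMIN = math.radians(1.0 / 60.0)  # one arcminute, typical 20/20 acuity

def retina_distance_in(ppi):
    """Viewing distance (inches) beyond which pixels at this density
    subtend less than one arcminute, i.e. become unresolvable."""
    return 1.0 / (ppi * math.tan(ARCMIN))

print(round(retina_distance_in(326), 1))  # iPhone 4: ~10.5 in
print(round(retina_distance_in(313), 1))  # Toshiba G900: ~11.0 in
```

Any lower-density screen clears the same bar too — it just has to be held a few inches further away, which is exactly the commenter’s point.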

Coder

Apple always makes its products seem groundbreaking. iPhone multitouch – Microsoft had introduced it before; the MacBook keyboard – Sony; and many other features. Apple sells its products at ridiculous prices, and people buy them because they are ignorant and fall for the eye candy. Apple's products are good, but certainly not as good as many people think. Other companies such as Microsoft are wrongfully bashed because of these exaggerations as Apple markets its products.

Bill

It's about time something has come along to rival the iPhone and put an end to its dominance!

C Bastant

You can say what you want; you're an Apple hater yourself. I was in that minority a few years back… I was one of the many iPhone haters. But after getting some hands-on experience with the iPhone, I have to say I was wrong. The feel of the iPhone is like no other. It's solid, easy to use, and has a lot of great apps to offer. The iPhone 4 is everything that people say it isn't. It is the talk of the town, and for good reason. It's so solid-feeling in your hand, yet it's compact and rugged as well. The screen is dazzling, the form factor is brilliant, and it fits in your pocket to boot! All of you people should really get some hands-on experience with the iPhone before poo-pooing it. I have a good idea that maybe you just don't have the means of actually buying one, so you have to put it down.

COMALite J

Don’t underestimate the importance of pixel density. The iPhone 4 (and the Toshiba G900 before it) is certainly dense enough that an average person could not see pixels at typical eye-to-display distances. Remember, first-generation laser printers were only 300 dpi, and that was when any individual pixel could only be 100% black or 100% white — yet the average person still could not see individual pixels at normal paper-viewing distance. (They could close up, or with a magnifying glass; and since those pixels also had to build the halftone screens needed for grayscale, increased resolution helped mightily there, which is why 600 dpi is considered the bare minimum today.)

With a full color display such as either IPS LCD or OLED (including Super AMOLED), there are no halftone screens (well, except maybe for special effects, or if viewing a photo scanned at high res from a newspaper or magazine, etc.). Since any pixel can be any of millions of colors (16,777,216 shades if the technology faithfully replicates every color that 8-bit-per-primary 24-bit-per-pixel color depth can supply), then the difference between any two adjacent pixels is not likely to be high-contrast except around the edges of characters or line art or some user interface elements — certainly not within photographs and the like.

Right now, much effort is spent on making relatively low-resolution desktop, laptop, and handheld screens seem less low than they actually are. One well-known example is anti-aliasing, a technique for blending the shades of adjacent pixels to blur “jaggies” and make them less noticeable. Whether in text, line art, or 3D objects (e.g. in games), pixels that are large enough to be easily visible are also large enough to be ugly and unpleasant to look at, ruining the nicely crafted shapes of glyphs in fonts, the smooth Bézier curves of line art, and the edges of objects and the textures that add detail to them in 3D games.

The brute-force method of anti-aliasing is to render the object or image at some multiple of its final displayed resolution, then shrink it using an averaging algorithm that produces the smoothing. For instance, if the final displayed resolution of your monitor is 1600×1200 and you’re playing a game, rendering at that resolution will produce jaggies severe enough to shake your suspension of disbelief. Turning on 2× anti-aliasing renders at 3200×2400 instead, requiring quadruple the memory on the video card, quadruple the memory bandwidth, etc.; in return, each final pixel averages a 2×2 block of samples, so any given pixel around the edge of an object can be 100% background, 75% background / 25% foreground, 50% / 50%, 25% / 75%, or 100% foreground — three intermediate shades plus the two pure ones, five in all. That still means visible jaggies, albeit softer-looking. It also means lower frame rates, and a need for a more powerful GPU.

If you up the anti-aliasing to 4×, the rendering is done at 6400×4800 — far higher, of course, than any monitor currently on the market. It also means 16× the usage of graphics RAM, bandwidth, etc. When it’s shrunk back down to 1600×1200, though, each final pixel averages a 4×4 block of samples, giving fifteen intermediate shades (seventeen in all, counting the 100% foreground and background colors).
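The brute-force supersampling described above can be sketched in a few lines. This is a minimal, illustrative Python version only — a box filter over binary coverage samples; real GPUs use rotated sample grids and fancier filters:

```python
def downsample(hi_res, factor):
    """Box-filter a square grid of 0/1 coverage samples down by `factor`,
    averaging each factor x factor block into one output pixel.
    Each output pixel can take factor**2 + 1 distinct shades."""
    size = len(hi_res)
    out = []
    for y in range(0, size, factor):
        row = []
        for x in range(0, size, factor):
            block = [hi_res[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / (factor * factor))
        out.append(row)
    return out

# A hard-edged shape "rendered" at 2x the final resolution (binary coverage):
hi = [[1 if 2 * x >= y else 0 for x in range(8)] for y in range(8)]
lo = downsample(hi, 2)  # final 4x4 image with a smoothed edge
shades = sorted({v for row in lo for v in row})
print(shades)  # pure 0 and 1 plus intermediate shades in steps of 1/4
```

Doubling the factor quadruples the samples per output pixel, which is exactly where the quadrupled memory and bandwidth costs in the comment come from.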

While that helps the realism considerably, it may still not be enough, especially on big screens or at lower resolutions. Note that many graphics cards offer 8× anti-aliasing modes, or even higher!

Similar techniques are used by your OS to render text, user interface elements, and other graphics images, and the costs are similar as well.

With a true retina display, none of that would be needed! Simply render at the physical resolution of the screen (which, admittedly, would be pretty darn high itself once such high densities make it to larger laptop and desktop screens — but even so, the need to then shrink the resulting huge image would be removed, saving many CPU / GPU and memory cycles).

At present, I do believe that iPhone OS still does anti-aliasing, so they’re not taking full advantage of this, if indeed their display qualifies as a true retina display. If it does, they could improve battery life considerably by removing much of that load from the CPU and GPU for basic display purposes.