An in-depth analysis of the Retina Display on the new iPad claims Apple has addressed two of the "major weak points" of the second-generation iPad, sharpness and color saturation, and has upgraded them to be "state-of-the-art."

Dr. Raymond Soneira, display expert and president of DisplayMate, put the new iPad's display through its paces and agreed with Apple that it is the "best display ever on a mobile device." He also noted that the new iPad's picture quality, color accuracy and gray scale are even better than most HDTVs, laptops and monitors.

Soneira found Apple's own definition of a "Retina Display" to apply to the new iPad, assuming the device is held 15-18 inches away from the eyes. He did, however, take issue with Apple's use of the term "Retina," as he has in the past, because matching the "true acuity" of the retina would require at least 458 pixels per inch for individual pixels to be indistinguishable at that distance.
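The arithmetic behind both thresholds is straightforward: pixels stop being distinguishable once adjacent pixels subtend less than the eye's acuity angle. A quick illustrative sketch (the 1 arc-minute and 0.5 arc-minute acuity figures and the 15-inch distance come from the claims above; the code itself is not from the report):

```python
import math

def required_ppi(distance_inches, acuity_arcmin=1.0):
    """PPI at which adjacent pixels subtend `acuity_arcmin` of visual
    angle at the given viewing distance -- the distinguishability limit."""
    angle = math.radians(acuity_arcmin / 60.0)
    return 1.0 / (distance_inches * math.tan(angle))

# Apple's criterion: 20/20 vision (1 arc-minute) at 15 inches
print(round(required_ppi(15)))        # 229 -- the new iPad's 264 PPI clears it
# Soneira's criterion: 20/10 vision (0.5 arc-minute)
print(round(required_ppi(15, 0.5)))   # 458 -- the figure he cites
```

At 18 inches the 20/20 threshold drops to roughly 191 PPI, which is why the "15-18 inches" qualifier matters to Apple's definition.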

According to the report, the new iPad "decisively beats (blows away)" all of the other tablets DisplayMate has tested.

"As expected, all of the images, especially the text and graphics, were incredibly and impressively razor sharp. In some photographs, that extra sharpness made a significant difference, especially in close-ups and when fine detail like text was photographed," Soneira said.

The analysis discovered that the new iPad has "a virtually perfect 99 percent of the Standard Color Gamut." By comparison, the iPad 2 has just 61 percent of the gamut.
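For readers wondering what a "percent of gamut" figure means mechanically: a common shorthand compares the area of the triangle formed by the display's red, green, and blue primaries against the reference triangle in CIE 1931 xy chromaticity space. The sketch below uses the real sRGB primaries as the reference, but the narrower "panel" primaries are made up for illustration, and a simple area ratio ignores the shape of the overlap that rigorous coverage figures account for:

```python
def triangle_area(p1, p2, p3):
    # Shoelace formula for the area of a triangle in CIE xy chromaticity space
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# sRGB primaries (the usual "standard gamut" reference), CIE 1931 xy coordinates
srgb = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]

# Hypothetical measured primaries for a narrower panel (illustrative values)
panel = [(0.61, 0.34), (0.32, 0.56), (0.155, 0.08)]

coverage = triangle_area(*panel) / triangle_area(*srgb)
print(f"{coverage:.0%}")   # ~78% with these illustrative primaries
```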

"The colors are beautiful and accurate due to very good factory calibration – they are also "more vibrant" but not excessively so or gaudy like some existing OLED displays," he said.

The new iPad is so accurate that Soneira believes that it could function as a studio reference monitor with some "minor calibration tweaks." The device's very accurate colors and picture quality make it "really shine," he noted.
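Accuracy of this kind is normally quantified as a delta-E difference between what the display shows and what the reference demands. A minimal sketch using the simple CIE76 metric (the Lab values are made up for illustration; this is not how Soneira reported his measurements):

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIE L*a*b* space.
    A delta-E around 1-2 is roughly the just-noticeable-difference threshold."""
    return math.dist(lab1, lab2)

reference = (53.2, 80.1, 67.2)   # target Lab value of a red test patch (illustrative)
measured  = (52.8, 79.0, 66.5)   # what a measurement of the panel might report

print(round(delta_e76(reference, measured), 2))   # 1.36 -- near the visibility threshold
```

Studio-grade calibration typically chases an average delta-E well below 2, which is the sense in which "minor calibration tweaks" would be needed.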

One area the new iPad was not as strong in was screen reflectance. Reflecting 7.7 percent of the light from all directions on average, the iPad was listed as in "the middle of the range" seen for tablets and smartphones.
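Why reflectance matters: ambient light bouncing off the glass is added to both the black and the white levels, so in bright surroundings it is the reflected light, not the panel's native contrast, that limits readability. A rough model (the 7.7 percent figure is from the measurement above; the luminance values and the Lambertian assumption are mine):

```python
import math

def effective_contrast(white_nits, black_nits, reflectance, ambient_lux):
    """Contrast ratio after reflected ambient light is added to both levels.
    Assumes diffuse (Lambertian) reflection: L = R * E / pi (cd/m^2)."""
    reflected = reflectance * ambient_lux / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

# Illustrative panel: 400 cd/m^2 white, 0.45 cd/m^2 black, 7.7% reflectance
print(round(effective_contrast(400, 0.45, 0.077, 0)))      # dark room: ~889:1
print(round(effective_contrast(400, 0.45, 0.077, 500)))    # office light: ~32:1
print(round(effective_contrast(400, 0.45, 0.077, 20000)))  # outdoor shade: ~2:1
```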

The tablet is also not as efficient as its predecessor. According to Soneira, the new iPad "uses 2.5 times the Backlight power of the iPad 2 for the same screen Brightness."

Soneira was, however, impressed by how Apple managed to preserve the 10-hour battery life of the first- and second-generation iPads without significantly adding to the device's weight and thickness. The new iPad's battery has a 42.5 watt-hour capacity, 70 percent more than the iPad 2. At full brightness, the third-generation iPad had a running time of 5.8 hours, compared to the iPad 2's 7.2 hours, but at medium brightness, the new iPad lasted for 11.6 hours, nearly identical to the iPad 2.
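The figures in that paragraph can be cross-checked with simple arithmetic. All inputs are the numbers quoted above; note the implied whole-device power ratio comes out near 2.1x, lower than the 2.5x backlight figure, which is consistent because the backlight is only part of total draw:

```python
new_ipad_wh = 42.5
ipad2_wh = new_ipad_wh / 1.7          # 70% more capacity implies ~25 Wh for the iPad 2

# Average whole-device power draw implied by each full-brightness runtime
new_ipad_w = new_ipad_wh / 5.8        # ~7.3 W
ipad2_w = ipad2_wh / 7.2              # ~3.5 W

print(round(ipad2_wh, 1))             # 25.0
print(round(new_ipad_w / ipad2_w, 1)) # ~2.1x total draw at max brightness
```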

"Apple has taken the very good display on the iPad 2 and dramatically improved two of its major weak points: sharpness and color saturation – they are now state-of-the-art," Soneira concluded.

He awarded the new iPad the Best Mobile Display award for his company's video hardware guide and also gave the device the Best Mobile Picture Quality award. Soneira said the new iPad is now "qualified" for professional-level applications, such as professional photography, medical imaging and field service.

Alongside the high praise heaped upon the iPad, the report listed some areas where Apple and other manufacturers could see further improvement. Screen reflectance, ambient light sensor, automatic brightness, display user interface, RGB LED backlights, OLED displays and size were all mentioned.


This sounds like a lame definition to me.

Quote:

However, Apple’s definition of a “Retina Display” is actually for 20/20 Vision (defined as 1 arc-minute visual acuity). 20/20 Vision is just the legal definition of “Normal Vision,” which is at the lower end of true normal vision. There are in fact lots of people with much better than 20/20 Vision, and for almost everyone visual acuity is actually limited by blurring due to imperfections of the lens in the eye. The best human vision is about 20/10 Vision, twice as good as 20/20 Vision, and that is what corresponds to the true acuity of the Retina.

Do we measure the normal gait based on the longest recorded gait? Do we measure the normal height of a person based on the tallest recorded height? Do we measure the average IQ by the highest recorded IQ? No, no and no. So why would Apple take the maximum presumed acuity on the Snellen test in order to define a marketing term for normal vision? The idiom isn't "hindsight is 20/10," it's "20/20," so why market a term to a scale that means nothing to your consumer base? Is the point to show vision can be better than 20/20, or just to be pedantic for pedantic's sake?

PS: When I read something that seems mathematical and scientific I get turned off by the use of "lots of people" as the foundation for a definition.

This bot has been removed from circulation due to a malfunctioning morality chip.

I am totally loving the new iPad - the screen is incredible to say the least. But one thing I'd like to mention in the hope of saving anybody else the hassle - the high resolution is not very friendly to anti-glare screen protectors at all, pretty much negating the upgrade to retina. I assume it's because of the way that anti-glare protectors use a microscopic surface of raised dots to bounce light off in all directions. I'm guessing on the older iPads the screen pixels could each fit a whole bunch of these micro dots, so it averaged out and still made a white pixel look white, for example. The new screen has such tiny pixels it looks like each micro dot is now magnifying a whole pixel - all the white areas on the screen are now a "red, green and blue snow" to continue my example. So much so that I'd say if you need to use an anti glare protector for whatever your personal iPad use case is, don't bother getting the new iPad, stick with the old iPad 2. That all said, I would love to hear if anybody finds an anti glare screen protector that works on the new iPad and proves me totally wrong.

I am totally loving the new iPad - the screen is incredible to say the least. But one thing I'd like to mention in the hope of saving anybody else the hassle - the high resolution is not very friendly to anti-glare screen protectors at all, pretty much negating the upgrade to retina. I assume it's because of the way that anti-glare protectors use a microscopic surface of raised dots to bounce light off in all directions. I'm guessing on the older iPads the screen pixels could each fit a whole bunch of these micro dots, so it averaged out and still made a white pixel look white, for example. The new screen has such tiny pixels it looks like each micro dot is now magnifying a whole pixel - all the white areas on the screen are now a "red, green and blue snow" to continue my example. So much so that I'd say if you need to use an anti glare protector for whatever your personal iPad use case is, don't bother getting the new iPad, stick with the old iPad 2. That all said, I would love to hear if anybody finds an anti glare screen protector that works on the new iPad and proves me totally wrong.

I can see how that would be the case. You should try other brands though, or look up recommendations for other brands as they are added.

This sounds like a lame definition to me.
Do we measure the normal gait based on the longest recorded gait? Do we measure the normal height of a person based on the tallest recorded height? Do we measure the average IQ by the highest recorded IQ? No, no and no. So why would Apple take the maximum presumed acuity on the Snellen test in order to define a marketing term for normal vision? The idiom isn't "hindsight is 20/10," it's "20/20," so why market a term to a scale that means nothing to your consumer base? Is the point to show vision can be better than 20/20, or just to be pedantic for pedantic's sake?

PS: When I read something that seems mathematical and scientific I get turned off by the use of "lots of people" as the foundation for a definition.

I think the point being made is clear. If you are going to define/market a device's capabilities relative to the capabilities of the user based on the overall limits of human abilities, then the retina display is simply "normal" in its capabilities.

It would be like Nike marketing a shoe for marathon running, realizing that "normal" people usually just run 5Ks, and therefore making the shoe capable of just simple 5K races comfortably.

So yes, Apple defined their screens to match the capabilities of the human retina, but the true capabilities of the screen fail to match the highest extreme. Is it nit-picking, considering how many people have 20/10 and would notice? Yes, but it doesn't invalidate the point either. It just sets a higher benchmark for the Apple TV to hit.

I think the point being made is clear. If you are going to define/market a device's capabilities relative to the capabilities of the user based on the overall limits of human abilities, then the retina display is simply "normal" in its capabilities.

It would be like Nike marketing a shoe for marathon running, realizing that "normal" people usually just run 5Ks, and therefore making the shoe capable of just simple 5K races comfortably.

So yes, Apple defined their screens to match the capabilities of the human retina, but the true capabilities of the screen fail to match the highest extreme. Is it nit-picking, considering how many people have 20/10 and would notice? Yes, but it doesn't invalidate the point either. It just sets a higher benchmark for the Apple TV to hit.

A more accurate marketing situation would be advertising a router as a "whole house" router when it only had a 1000' range. Many mansions are larger than 1000' long, so it would not be an accurate term.

If you think we will ever see a TV with over 450 ppi, then you have very high expectations for a company that doesn't provide consumers with technology that isn't useful or cost effective.

I wish Apple had just picked another word to call their higher-resolution displays... or at least not brought all this math and science into it.

Then again... it's only a handful of people on internet forums who even discuss this topic... compared to the millions of Apple's other customers.

Dell calls some of their monitors UltraSharp and it's simply a brand name. Dell didn't try to explain what "ultra" means.

Apple uses the term "iSight" for some of their cameras. That seems simple enough.

Why not "iDisplay" for these high-resolution screens?

Some good news... in a few years Apple won't be making low-resolution screens anymore. So the "Retina display" can simply be called "the display"

The reason is that Apple chose something that actually MEANS something. Dell is a great example. What does 'Ultrasharp' mean? Absolutely nothing. It does not tell the consumer anything.

"Retina Display" has a real meaning - it says that the pixels are small enough that the average person can not distinguish them. That's a real, measurable thing, not simply a silly marketing term. I'd much rather have Apple using terms that mean something rather than simply make things up and expect the brand to add value.

Quote:

Originally Posted by bmason1270

I think the point being made is clear. If you are going to define/market a device's capabilities relative to the capabilities of the user based on the overall limits of human abilities, then the retina display is simply "normal" in its capabilities.

It would be like Nike marketing a shoe for marathon running, realizing that "normal" people usually just run 5Ks, and therefore making the shoe capable of just simple 5K races comfortably.

So yes, Apple defined their screens to match the capabilities of the human retina, but the true capabilities of the screen fail to match the highest extreme. Is it nit-picking, considering how many people have 20/10 and would notice? Yes, but it doesn't invalidate the point either. It just sets a higher benchmark for the Apple TV to hit.

Yes, it's nit picking. 'Normal' vision is defined as 20/20 (that is, in fact, the definition of '20/20'). Apple's Retina Display definition clearly applies to the average or normal vision. It makes absolutely no sense to claim that they can't use the term just because some people have abnormal vision. Heck, if you go down that route, why not say that some people wear magnifying lenses for reading, so they need to consider that?

Using your definition would be akin to not allowing speaker manufacturers who have 20-20,000 Hz output to claim 'full range sound' because some rare individuals can hear sounds up to 24,000 Hz.

It is not reasonable to expect someone defining a normal characteristic to cover 100% of every single bizarre situation with their claims (unless they specifically claim that it applies to everyone).

"I'm way over my head when it comes to technical issues like this" - Gatorguy 5/31/13

I think the point being made is clear. If you are going to define/market a device's capabilities relative to the capabilities of the user based on the overall limits of human abilities, then the retina display is simply "normal" in its capabilities.

It would be like Nike marketing a shoe for marathon running, realizing that "normal" people usually just run 5Ks, and therefore making the shoe capable of just simple 5K races comfortably.
........

Nonsense.
It would be like Nike making a shoe that fit the majority of people and allowed them to run their best time, knowing that however shoes progressed, they would never find a better shoe that would permit them to run faster.

"Retina" is a good term, better than ultra, or mega, or hyper, because it DOES have a meaning, and is not a superlative. It means that whatever Dell et al., or even Apple, do in the future, the average person will not, under normal use, be able to perceive the difference in the pixels.

A more accurate marketing situation would be advertising a router as a "whole house" router when it only had a 1000' range. Many mansions are larger than 1000' long, so it would not be an accurate term.

If you think we will ever see a TV with over 450 ppi, then you have very high expectations for a company that doesn't provide consumers with technology that isn't useful or cost effective.

We agree it is nit-picking. And a smiley implies that the previous statement is made somewhat in jest.

That said, 4K Super OLED is being made now. Necessary? Nope. Is there anything to watch at 4K? Nope. But it will designate the floor of what resolution numbers people will hear and expect. The iPad 2's screen is useful. The new iPad screen is a luxury in relation. But considering that most of your activity with the device is staring at it, the resolution becomes pretty important.

Apple simply tries to market their specs relative mostly to just their own products. And for the most part, considering they control the OS and the hardware, that is fine to do, but they have always appealed to people's senses: design, touch and feel, screens, etc.

If, and it is still an "if," they make a TV, they will most likely not market it at the floor benchmark in specs, i.e., 1080p. If they can reasonably produce the best screen and sell it at a profitable price, they will. Quite frankly, the most "useful" part of a TV is its screen. The navigation scheme they come up with will be super, and the icing on the cake, but the TV is still, and always will be, about the picture. The question is: can Apple make it the best TV, and at a reasonable price?

Nonsense.
It would be like Nike making a shoe that fit the majority of people and allowed them to run their best time, knowing that however shoes progressed, they would never find a better shoe that would permit them to run faster.

"Retina" is a good term, better than ultra, or mega, or hyper, because it DOES have a meaning, and is not a superlative. It means that whatever Dell et al., or even Apple, do in the future, the average person will not, under normal use, be able to perceive the difference in the pixels.

I am not defending the article. But the point that the article is making is that the screen is built for the limits of the average normal viewer's capabilities, yet by its definition it is marketed as theoretically designed for the "limits" of the human eye.

It makes it a nit-picking argument, to say the least, but not necessarily an invalid one. I doubt that 4 out of 1000 people with 20/10 would complain if they noticed in the first place. The article's point is both accurate and irrelevant.

This sounds like a lame definition to me.
Do we measure the normal gait based on the longest recorded gait? Do we measure the normal height of a person based on the tallest recorded height? Do we measure the average IQ by the highest recorded IQ? No, no and no. So why would Apple take the maximum presumed acuity on the Snellen test in order to define a marketing term for normal vision? The idiom isn't "hindsight is 20/10," it's "20/20," so why market a term to a scale that means nothing to your consumer base? Is the point to show vision can be better than 20/20, or just to be pedantic for pedantic's sake?

It sounds like you missed the point of his complaint. "Retina" refers to a particular part of the eye, which has basically the same "resolution" for everyone. His point is that using the adjective "retina" implies that the pixels are indistinguishable by the retina. What we don't all have is a perfect lens. He, presumably, would have preferred the name "20/20 display" or something.

It sounds like you missed the point of his complaint. "Retina" refers to a particular part of the eye, which has basically the same "resolution" for everyone.

That's like saying Foot Locker is a scam unless it has shoes for every possible foot size.

Quote:

His point is that using the adjective "retina" implies that the pixels are indistinguishable by the retina.

And they are for those with normal vision and when holding the display x many inches from your face. For a marketing term, Apple has qualified it with both required factors.

Quote:

What we don't all have is a perfect lens. He, presumably, would have preferred the name "20/20 display" or something.

There is nothing sketchy or incorrect about the term. Apple has never once said it is impossible for someone on Earth to have vision so sharp that they can discern pixels from x inches away.

You and Soneira are getting hung up on the word retina, as if using a medical term means it can no longer be a marketing term and you have to go with what is deemed the best possible vision ever recorded. Which, by the way, appears to be 20/8, as I found while trying, unsuccessfully, to locate the percentages of the population that have visual acuity at various levels.


I am totally loving the new iPad - the screen is incredible to say the least. But one thing I'd like to mention in the hope of saving anybody else the hassle - the high resolution is not very friendly to anti-glare screen protectors at all, pretty much negating the upgrade to retina. I assume it's because of the way that anti-glare protectors use a microscopic surface of raised dots to bounce light off in all directions. I'm guessing on the older iPads the screen pixels could each fit a whole bunch of these micro dots, so it averaged out and still made a white pixel look white, for example. The new screen has such tiny pixels it looks like each micro dot is now magnifying a whole pixel - all the white areas on the screen are now a "red, green and blue snow" to continue my example. So much so that I'd say if you need to use an anti glare protector for whatever your personal iPad use case is, don't bother getting the new iPad, stick with the old iPad 2. That all said, I would love to hear if anybody finds an anti glare screen protector that works on the new iPad and proves me totally wrong.

I'm not sure this is the screen protector. I don't use one and I've noticed the same effect.

Reading a page of black text on a white background on the new iPad where one word has a red underline (because of the lame un-editable dictionary), the whole area around the word appears pink in many lighting situations. I've noticed quite a lot of situations where white areas are actually blue, red, or green tinged as a result of adjacent coloured elements. I'm thinking it's more a trick of the eye than a physical flaw of the device.

That said, 4K Super OLED is being made now. Necessary? Nope. Is there anything to watch at 4K? Nope. But it will designate the floor of what resolution numbers people will hear and expect. The iPad 2's screen is useful. The new iPad screen is a luxury in relation. But considering that most of your activity with the device is staring at it, the resolution becomes pretty important.

You are countering with a different argument. 4K does not exist to make small devices higher resolution. 4K exists to make HUGE displays (theater sized) not have HUGE individually visible pixels. Different purposes, different cost strata.

Regular consumers didn't go out and buy 70mm cameras and projectors because they could get higher resolution images. They stuck with 8mm and 16mm because that gave roughly the same screen resolution as 70mm when projected on a small screen at home and watched from only a few feet away.

Also don't confuse major capability with luxury. Text sharpness is FAR better, which will have a significant effect in reducing eyestrain and reader comfort. Just because that made other graphics better too does not make it a luxury.

The resolution of the new screen is nice but I think the other things mentioned are equally significant. Accurate grayscale and color saturation are more important to me. I can't read really small text no matter how crisp it is. I used to have perfect vision. Not so much anymore.

.... His point is that using the adjective "retina" implies that the pixels are indistinguishable by the retina. ...

While we are being incredibly picky and nit-picky ...

"Retina" is not an adjective and taking it as such is the source of the problem.

"Retina Display" is a single, whole, marketing term.

This is not a display that is being defined as "having the quality of retina-ness" but simply a "Retina Display" defined in the exact way that Apple defines it.

Before Apple came up with the term there was no one talking about making displays "more retina" as using retina as an adjective is meaningless and stupid. Just eliminate the tiny space between the words with your mind and all the so-called problems with the definition disappear like magic.

You are countering with a different argument. 4K does not exist to make small devices higher resolution. 4K exists to make HUGE displays (theater sized) not have HUGE individually visible pixels. Different purposes, different cost strata.

Off-topic FYI: I tested a 4K video downloaded from YouTube (100.2 MB for 20 seconds) on the new iPad. It would load into iTunes but not sync over. Then I tried to load it into Dropbox and stream the video, but it wouldn't load.


The resolution of the new screen is nice but I think the other things mentioned are equally significant. Accurate grayscale and color saturation are more important to me. I can't read really small text no matter how crisp it is. I used to have perfect vision. Not so much anymore.

I can see why Apple wanted to go double resolution now instead of in 2013, but I wonder if these other factors were a strong focus for them or just a happy accident. I'm guessing that it was also important, while not the most important, to get the display to be visually accurate so that it can get more traction in medical imaging and other fields where accurate photo reproductions are important.


You and Soneira are getting hung up on the word retina, as if using a medical term means it can no longer be a marketing term and you have to go with what is deemed the best possible vision ever recorded. Which, by the way, appears to be 20/8, as I found while trying, unsuccessfully, to locate the percentages of the population that have visual acuity at various levels.

The whole notion of 20/x is based on a distance of 20 ft, which is not the normal reading distance for a mobile device. There is a certain point where the display is high enough resolution to satisfy the vast majority of users, even those with excellent vision. The current iPad screen should be close to that range.
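For what it's worth, the 20-foot Snellen convention converts directly to a feature size at reading distance, which is the comparison that actually matters for a handheld display. A sketch (264 PPI is the new iPad's pixel density; the 15-inch distance is an assumption):

```python
import math

def resolvable_mm(distance_inches, acuity_arcmin=1.0):
    # Smallest feature (in mm) resolvable at this distance and acuity
    return distance_inches * 25.4 * math.tan(math.radians(acuity_arcmin / 60))

pixel_pitch_mm = 25.4 / 264             # new iPad pixel pitch, ~0.096 mm

print(round(resolvable_mm(15), 3))      # 0.111 -- 20/20 limit at 15 inches
print(round(resolvable_mm(15, 0.5), 3)) # 0.055 -- 20/10 limit
print(round(pixel_pitch_mm, 3))         # 0.096 -- between the two
```

The pixel pitch lands below the 20/20 threshold but above the 20/10 one, which is exactly the disagreement running through this thread.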

The analysis discovered that the new iPad has "a virtually perfect 99 percent of the Standard Color Gamut." By comparison, the iPad 2 has just 61 percent of the gamut.

"The colors are beautiful and accurate due to very good factory calibration – they are also "more vibrant" but not excessively so or gaudy like some existing OLED displays," he said.

The reviewer delivered his results in a very odd way here. He didn't mention his testing methods. Was it a colorimeter? A spectrophotometer? A radiometer? Even assuming each device is within spec, the tolerance levels between such devices can be massive. How is he defining "standard color gamut"? sRGB and Adobe RGB (1998) are the most common reference points for display gamuts. Displays that approximate Adobe RGB (1998) are often referred to as wide gamut. They're harder to control than sRGB; it's easy to end up with overly saturated colors in the UI. sRGB is closer to that of the Thunderbolt Display, but I think the Thunderbolt Display uses the hardware's native color temperature rather than a D65 target.

The historic problem with LED color related to stability and, once again, a shift in color temperature. I don't know if that's the issue with OLED, but calling it gaudy is just massively dumbing down the explanation. I don't blame AppleInsider on this one. I think it's just that the reviewer is overly simplifying his explanation, not that I was expecting a white paper on it. I just find the look-and-feel explanation kind of odd on a highly technical subject, especially given that unrealistic saturation can make other things look really weird. The visual comparison doesn't take into account the exposure of the image or the real color of the flower.

The reviewer delivered his results in a very odd way here. He didn't mention his testing methods. Was it a colorimeter? A spectrophotometer? A radiometer? Even assuming each device is within spec, the tolerance levels between such devices can be massive. How is he defining "standard color gamut"? sRGB and Adobe RGB (1998) are the most common reference points for display gamuts. Displays that approximate Adobe RGB (1998) are often referred to as wide gamut. They're harder to control than sRGB; it's easy to end up with overly saturated colors in the UI. sRGB is closer to that of the Thunderbolt Display, but I think the Thunderbolt Display uses the hardware's native color temperature rather than a D65 target.

The historic problem with LED color related to stability and, once again, a shift in color temperature. I don't know if that's the issue with OLED, but calling it gaudy is just massively dumbing down the explanation. I don't blame AppleInsider on this one. I think it's just that the reviewer is overly simplifying his explanation, not that I was expecting a white paper on it. I just find the look-and-feel explanation kind of odd on a highly technical subject, especially given that unrealistic saturation can make other things look really weird. The visual comparison doesn't take into account the exposure of the image or the real color of the flower.

AnandTech's preliminary results of the display might have a little more meat for you.

You are countering with a different argument. 4K does not exist to make small devices higher resolution. 4K exists to make HUGE displays (theater sized) not have HUGE individually visible pixels. Different purposes, different cost strata.

Regular consumers didn't go out and buy 70mm cameras and projectors because they could get higher resolution images. They stuck with 8mm and 16mm because that gave roughly the same screen resolution as 70mm when projected on a small screen at home and watched from only a few feet away.

Also don't confuse major capability with luxury. Text sharpness is FAR better, which will have a significant effect in reducing eyestrain and reader comfort. Just because that made other graphics better too does not make it a luxury.

Nope, sorry, you missed my point entirely. I argued that Apple's benchmark for a TV, if they make one, would not simply be 1080p at current resolution, based on the previous poster's statement that "Apple only does things that are 'useful'." (I paraphrase.)

My argument is simple: with regard to a TV, there is no other real purpose or utility than the screen and its resolution. You look at a TV; that is really all you do with it. And the best achievable screen that is suitable for the typical demands of a TV had better be more than just "useful" if Apple is going to claim it is better than what is out currently.

But I'm glad I gave you an opportunity to show that you are clever, even though you over-analyzed my comment. Especially when I had admitted that the benchmark of a 55-inch Retina display was mostly a joke.

I'm guessing that it was also important, while not the most important, to get the display to be visually accurate so that it can get more traction in medical imaging and other fields where accurate photo reproductions are important.

Accurate grayscale, surprisingly, is not that important in medical imaging, because with digital X-rays it is all about being able to adjust the histogram to accentuate the contrast in the particular region of interest. Accurate color is a nice thing to have for medical photography, though, for publishing papers or CE documentation, but there is no governing standards-based criterion for that.

I think the point being made is clear. If you are going to define/market a device's capabilities relative to the capabilities of the user based on the overall limits of human abilities, then the retina display is simply "normal" in its capabilities.

It would be like Nike marketing a shoe for marathon running, realizing that "normal" people usually just run 5Ks, and therefore making the shoe capable of just simple 5K races comfortably.

So yes, Apple defined their screens to match the capabilities of the human retina, but the true capabilities of the screen fail to match the highest extreme. Is it nit-picking, considering how many people have 20/10 and would notice? Yes, but it doesn't invalidate the point either. It just sets a higher benchmark for the Apple TV to hit.

Quote:

Originally Posted by Cloud30000

A more accurate marketing situation would be advertising a router as a "whole house" router when it only had a 1000' range. Many mansions are larger than 1000' long, so it would not be an accurate term.

If you think we will ever see a TV with over 450 ppi, then you have very high expectations for a company that doesn't provide consumers with technology that isn't useful or cost effective.

Actually, all products are marketed based on some recognized public figure of normal. The range advertised on a router is in fact based on open-air scenarios most of the time; 5GHz actually has lower signal strength inside a house even though it is often advertised as longer range. Higher frequencies have more trouble penetrating walls, so the advertised range of 5GHz is usually based on open-air statistics.

There is nothing wrong with Apple's claim; it's far less controversial than the vast majority of claims out there. It's advertising. How would it sound if Apple, instead of calling it a Retina display, called it "almost retina" or "so close to retina most people won't notice"?

At least Apple's advertisements are all about showing how the device actually works & looks, unlike most Android products that use flashy graphics & overhyped claims to try & fool consumers into buying their products.

Nope, sorry, you missed my point entirely. I argued that Apple's benchmark for a TV, if they make one, would not simply be 1080p at current resolution, based on the previous poster's statement that "Apple only does things that are 'useful.'" (I paraphrase.)

My argument is simple: with regard to a TV there is no real purpose or utility other than the screen and its resolution. You look at a TV; that is really all you do with it. And the best achievable screen that is suitable for the typical demands of a TV had better be more than just "useful" if Apple is going to claim it is better than what is out there currently.

But I'm glad I gave you an opportunity to show that you are clever, even if you over-analyzed my comment, especially when I had admitted that my benchmark of a 55-inch Retina display was mostly a joke.

A Ferrari has a "major capability" too, and yet it is a luxury.

In trying to say I missed your point you seem to have ignored mine, and forgot you were talking about the iPad at the time:

Quote:

That said, 4K Super OLED is being made now. Necessary? Nope. Is there anything to watch in 4K? Nope. But it will designate the floor of what resolution numbers people will hear and expect. The iPad 2's screen is useful. The new iPad's screen is a luxury in relation. But considering that most of your activity with the device is staring at it, the resolution becomes pretty important.

Even if you really munged your sentences, 4K is not meant for TV. I was pretty specific there, too. The fact that you want to make it something it is not is your issue to deal with; the rest of the world will get along just fine for the next decade or three. And I wasn't answering your joke, which was trite and not worth repeating; I was VERY specific in what I quoted and responded to. Maybe you should be paying more attention.

Now, have you ever dealt with realtime bandwidth for 4K video? I have worked with projects that have. We had a TEAM of researchers and tech-support personnel setting up access and configurations on a custom internet trunk to push 4K video in realtime. TV is realtime video; it is synchronous and hard time-dependent to all destinations. YouTube/Hulu/streams are not realtime video; they are asynchronous streams: very useful, but not TV.
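The bandwidth problem is easy to see with back-of-the-envelope arithmetic. A sketch assuming UHD 3840×2160 at 24 bits per pixel, uncompressed (realtime research links of the kind described often carry lightly compressed or raw video):

```python
def uncompressed_gbps(width, height, bits_per_pixel, fps):
    """Raw video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

print(round(uncompressed_gbps(3840, 2160, 24, 30), 1))  # ~6.0 Gbps at 30 fps
print(round(uncompressed_gbps(3840, 2160, 24, 60), 1))  # ~11.9 Gbps at 60 fps
```

Even with heavy compression, sustaining a hard-deadline synchronous feed to every destination is a very different engineering problem from a buffered, asynchronous stream that can stall and catch up.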

As for your retort on major capability, you are the one with context and comprehension issues. Since when is an improvement from coarse resolution to just-fine-enough resolution (equivalent to coarse magazine print) considered luxury? You are just full of misguided hyperbole and poorly mixed metaphors.

Coarse to "retina" quality video is more like the difference between writing with a fat crayon and 0.5 mm lead, and it has absolutely nothing to do with cars, since they all have the same capability to move things from A to B with exactly the same resolution. I don't see anyone calling my Pentel mechanical pencil a luxury, despite it being far better matched to everyday life than the crayon. (Insert your own meaningless high-end writing-implement luxury argument here; it will match your poorly chosen Ferrari problem.)



Every time I picked up an iPad 2 at the Apple Store it felt less solid than our original iPad. I'm one who is glad the new iPad is a bit thicker. Same battery life is great, I love the screen, and it feels like I'm holding something very solid. I've never understood the cries about it weighing too much. Then again, I walk around reading hardback books frequently. I still need to load up a D&D PDF and check it out.

Quote:

Every time I picked up an iPad 2 at the Apple Store it felt less solid than our original iPad. I'm one who is glad the new iPad is a bit thicker. Same battery life is great, I love the screen, and it feels like I'm holding something very solid. I've never understood the cries about it weighing too much. Then again, I walk around reading hardback books frequently. I still need to load up a D&D PDF and check it out.

That's my view, as well. When I travel, I often have my 17" MacBook Pro and 3 or 4 hardback books, which often weigh at least a pound each. Even the "heavy" iPad 3 weighs a lot less than the books it replaces, and it saves even more weight on the trips where I don't carry the MBP.


The iPad has QXGA resolution (2048×1536). That approaches the horizontal resolution of the old 30" Cinema Display and of the Thunderbolt Display (both 2560 pixels wide).

The iPhone has DVGA resolution (960×640), far less, but on a much smaller screen.

The PPI of these displays is also quite different, given the different screen sizes.

What links these is the perceived resolution at a nominal distance.

"Retina display" may not be the best choice of anatomical descriptor, but it describes the required specification far better than xxGA names, PPI figures, dimensions, etc. It's not a marketing term; it's an achievement.
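The "perceived resolution at a nominal distance" point can be made concrete. A sketch assuming the iPad's 2048×1536 panel on its 9.7" diagonal, viewed at 15 inches, measured against the common 60-pixels-per-degree (one-arcminute) threshold for 20/20 vision:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density from pixel dimensions and diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(ppi_val, distance_in):
    """How many pixels fit in one degree of visual angle at this distance."""
    return ppi_val * distance_in * math.tan(math.radians(1))

ipad_ppi = ppi(2048, 1536, 9.7)
print(round(ipad_ppi))                           # ~264 ppi
print(round(pixels_per_degree(ipad_ppi, 15.0)))  # ~69 ppd, above the 60 ppd 20/20 line
```

That is what links an iPad, an iPhone, and a desktop display with very different PPI numbers: at each device's nominal viewing distance, the pixels-per-degree figure lands in the same territory.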

You know what, I made a couple of points. You don't agree with them. So here goes.

You win.

I don't really care what you think or say. Ok?

Do me a favor and add me to your ignore list. We'll both be happier.

Wow, somebody needs a hug.

If you want to debate, don't get needlessly antagonistic as you did in post 27, and be ready to back up what you say with facts. Opinion is fine, but when it is directly contradicted by the evidence at hand, don't be surprised when someone points out that contradiction.

Now either stop acting like a petulant, 3-year-old asshat, and we can welcome you to the boards, or sulk away. I'm fine either way.