Apple didn't just take 'what was currently available', and you can tell because it was over a year before anybody released a device with equal PPI (LG, and so far only in Korea).

Apple pushed display makers to achieve far more PPI than anybody else required, because Apple specifically wanted 960x640. I doubt we'll see these 460 PPI displays in physical products until mid-2012 at the earliest, so Apple will likely have the highest PPI (shared with LG) for over 2 years.
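For reference, PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch, assuming the iPhone 4's nominal 3.5-inch diagonal (Apple's quoted figure is 326 PPI, presumably from the exact panel dimensions):

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch: diagonal pixel count divided by diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    # iPhone 4 'Retina' panel: 960x640 on a nominal 3.5" diagonal
    print(round(ppi(960, 640, 3.5)))  # -> 330 (Apple quotes 326)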

27 inches

The Dell UltraSharp U2711 has 27 inches of glorious goodness. It has a resolution of 2560 x 1440 (WQHD) and covers the Adobe RGB colour space.

IMHO it is the best monitor anywhere near its price.

I believe that the new £1m Apple Cinema Display uses the same panel, but they use cheapo backlighting so it only manages sRGB, and put a mirror on the front of it; such a shame :( MBPs drive the Dells very nicely though, as they have DisplayPort input :)

However, I agree that it would be VERY VERY nice to get some high-res panels into decent monitors.

... the eye can't detect individual pixels ...

Most people over 20 couldn't detect individual pixels at far lower resolutions, and for oldies like me it's the size of the display that matters, so I don't have to keep fishing out my reading glasses to check texts and emails ...

Yes, but, the thing is...

Higher resolutions = fewer artificial features in the resulting image. The test image shown in the article demonstrates why these artefacts are something we should avoid: intersecting lines and fine detail will look bad on low-resolution displays unless you blur the image.

Sadly, that is what we have had to put up with for many, many years. To make images look more 'real', image-processing algorithms like anti-aliasing and bicubic interpolation are used so often that virtually every image on every screen you encounter has been through one or the other.
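To make that concrete, here's a toy sketch of supersampled anti-aliasing (my own illustration, not any particular renderer's algorithm): instead of each pixel being all-on or all-off, its brightness is set to the fraction of the pixel the ideal line actually covers.

    # Estimate per-pixel coverage of an ideal line by sampling a 4x4 grid
    # of points inside each pixel, then shade proportionally.
    def coverage(px, py, slope, half_width=0.5, n=4):
        """Fraction of pixel (px, py) lying within half_width of y = slope*x."""
        hits = 0
        for i in range(n):
            for j in range(n):
                x = px + (i + 0.5) / n
                y = py + (j + 0.5) / n
                if abs(y - slope * x) <= half_width:
                    hits += 1
        return hits / (n * n)

    SHADES = " .:-=+*#%@"  # left = empty pixel, right = fully covered
    for py in range(6):
        print("".join(SHADES[min(int(coverage(px, py, 0.2) * 10), 9)]
                      for px in range(24)))

Drop the sampling grid to 1x1 and you get the hard all-on/all-off staircase back instead.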

Both techniques introduce blurriness that fools our eyes/brain into thinking there is far more detail in the image than there really is. The unfortunate side effect is that they do *not* fool our very clever auto-focus ability, which eventually results in eye strain from the constant attempts our eyes make to focus on an image that is essentially impossible to focus on.

Apple's Retina display goes a long way toward display perfection, but it's probably still an order of magnitude away from showing images in enough detail that anti-aliasing and interpolation are no longer required.

The need for anti-aliasing probably arises because the image processing the brain does tries to pick out "interesting" features, such as corners and line intersections, that have high contrast in two directions.

If you focus closely on a pixel-based display showing a line at an angle, you see these "interesting" high-contrast points each time the line "jumps" from one row/column of pixels to the next, or at the jagged edges of letters (the x as I type this is particularly bad).
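You can see exactly where those jumps land with a trivial sketch (just rounding each column to the nearest pixel row; the numbers are illustrative, not from any real rasteriser):

    # Rasterise a shallow line by rounding each column to the nearest pixel
    # row, then report the x positions where it "jumps" to the next row:
    # precisely the high-contrast corners the eye locks onto.
    slope = 0.2  # shallow angles show the worst jaggies
    rows = [round(slope * x) for x in range(30)]
    jumps = [x for x in range(1, 30) if rows[x] != rows[x - 1]]
    print(jumps)  # [3, 8, 13, 18, 23, 28]: a new corner every 1/slope = 5 px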

I can't wait for higher-resolution displays. In some ways the substrate price could stay the same (same size, possibly higher quality) with similar processing steps, and the failure of individual pixels/sub-pixels should have a lower impact, so manufacturing yields may not have to suffer. However, the image-processing requirements will increase quite a bit, and the interconnect requirements mean the display will probably need some of the smarts on board, to save having to connect 10000 signals to it.
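To put rough numbers on that last point (illustrative assumptions, not any specific product's spec: a WQHD-class panel with RGB stripe sub-pixels, 24-bit colour at 60 Hz):

    # Back-of-envelope: why the driver electronics end up on the panel.
    width, height = 2560, 1440
    source_lines = width * 3           # one column driver line per sub-pixel
    gate_lines = height                # one row-select line per pixel row
    print(source_lines + gate_lines)   # 9120: the "10000 signals" ballpark

    # Raw pixel data rate if you fed it straight in, before blanking overhead
    bits_per_sec = width * height * 24 * 60
    print(f"{bits_per_sec / 1e9:.1f} Gbit/s")  # 5.3 Gbit/s

Double the resolution in each direction and the signal count doubles while the raw data rate quadruples, which is why the smarts have to move onto the glass.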