I think it would be neat if one could configure one's eyesight parameters (astigmatism and myopia, in my case), viewing distance, and perhaps age into a special display driver. The computer would then present its user interface distorted (from a normal-sighted person's point of view) in such a way that the bad eye, by applying its own "distortion", would essentially un-distort it, so that the brain would receive a sharp view without the need for physical glasses or lenses.

I suspect, though, that this is not possible, partly on grounds of limited information. If the error in eyesight were linear (e.g. a shift to the left or an enlargement), an appropriate distortion would be trivial. But a realistic error (and hence a realistic corrective distortion) leads to a situation where several image points get mixed into the same "pixel" (in the physical eye this would mean: hit the same receptor), and disentangling them would require a priori knowledge of the original image, which is not always available (it could be available for regular window shapes or moving images; this may lead to headaches :)). It's strange, though, because my glasses are simple physical devices that a computer should be able to simulate, yet perhaps not within the limited confines of the 2D surface of its display.
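The mixing described above can be made concrete in a toy 1-D model (a sketch with assumed numbers, not a model of a real eye): treat the blur as a linear map from display pixels to retina "pixels" and check whether that map is invertible.

```python
import numpy as np

# Toy 1-D model of the mixing: each retina "pixel" sees the average of
# 3 neighboring display pixels (circular boundary for simplicity).
# The retina image is y = B @ x; pre-distorting the display means
# solving B @ x = y_sharp for the display image x.
n = 9
B = np.zeros((n, n))
for i in range(n):
    for j in (-1, 0, 1):
        B[i, (i + j) % n] = 1.0 / 3.0

# The mixing destroys information: B is rank-deficient, so for most
# sharp target images y_sharp there is no display image x at all that
# the eye's blur would turn into y_sharp.
print(np.linalg.matrix_rank(B), "of", n)   # prints: 7 of 9
```

Because the averaging matrix is rank-deficient, most sharp target images have no pre-distorted display image that maps onto them; at best one can solve the system in a least-squares sense.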

Now all of this is just a hunch, and my question is this: is there a straightforward characterization of the kinds of "invertible" distortions (in a mathematical sense) for which the imagined apparatus could work, and has the problem ever been formalized in a better way?

1 Answer

Unless you deal with holograms, all the displays we observe follow the laws of ray optics. The display has some specific location – for example, a rectangle between four points with certain coordinates – and all the light rays leaving a point of the display carry information about their origin because they are diverging from that point: if one extrapolates several rays associated with the same "pixel" backwards, they actually intersect at the pixel that emitted the light.

The human eye, when properly focused, changes the direction of these divergent light rays by refraction so that they converge again and intersect on the retina. So a sharp pixel on the display produces a sharp pixel on the retina.

When it's not so – because of myopia, astigmatism, or any condition that prevents the eye from focusing accurately, and believe me, I also know something about it – it just means that the light rays emitted by one pixel on the display attempt to "reconverge" but don't intersect at one point on the retina.

Myopia means that the eye works a little too hard at "reconverging" the light rays. As a consequence, the intersection of the light rays from one pixel occurs inside the fluid of the eye, before the rays reach the retina. The signal on the retina is then inevitably fuzzy – a disk of a sort – even though the source of the light is a single pixel.
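As a sketch of the numbers involved (all values below are illustrative assumptions, not anatomical data), the thin-lens equation 1/f = 1/d_o + 1/d_i locates the depth where the rays from a display pixel cross, and similar triangles give the diameter of the resulting blur disk on the retina:

```python
# Thin-lens sketch of the myopic blur disk. All numbers are illustrative
# assumptions, not anatomical data.
f = 0.016       # focal length of an over-strong (myopic) eye, meters
L = 0.017       # lens-to-retina distance, meters
pupil = 0.004   # pupil diameter, meters
d_o = 0.5       # display at 50 cm

# 1/f = 1/d_o + 1/d_i gives the depth d_i where the rays actually cross.
d_i = 1.0 / (1.0 / f - 1.0 / d_o)

# The rays cross in front of the retina (d_i < L) and then spread again;
# similar triangles give the diameter of the blur disk on the retina.
blur = pupil * abs(L - d_i) / d_i
print(d_i, blur)
```

With these assumed numbers the rays cross about half a millimeter in front of the retina and the blur disk comes out to roughly 0.1 mm across, rather than a sharp point.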

Hyperopia works the other way around: the light rays try to "reconverge", but the refraction is too weak, and the intersection would only occur "behind" the retina (in the brain), which doesn't physically happen because the photons are absorbed before they reach the would-be intersection.

Astigmatism means that light rays fanned out in the horizontal direction and light rays fanned out in the vertical direction experience different amounts of refraction. So there is no common intersection of all the light rays coming from one display pixel at all. If you only consider rays differing in the "vertical" direction, they intersect at one point; the rays differing in the "horizontal" direction intersect at a different distance from the retina (positive or negative). So one may be more myopic for horizontal lines than for vertical ones, or more farsighted for one of them, and so on.
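Under the same thin-lens sketch (the focal lengths below are assumed, purely for illustration), astigmatism amounts to two different effective focal lengths, hence two distinct crossing depths for the ray bundles leaving a single pixel:

```python
# Thin-lens sketch of astigmatism: two different effective focal lengths
# for rays fanned out horizontally vs vertically (values are assumed,
# purely for illustration).
f_h, f_v = 0.0160, 0.0168   # horizontal / vertical focal lengths, meters
d_o = 0.5                   # display distance, meters

def cross(f):
    """Depth where the refracted bundle crosses, from 1/f = 1/d_o + 1/d_i."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

# The two bundles from one pixel cross at two distinct depths, so no
# single retina position can be sharp for both orientations.
print(cross(f_h), cross(f_v))
```

Whatever distance the retina sits at, at least one of the two orientations is out of focus, which is exactly the "no common intersection" statement above.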

If a display pixel emits light rays according to the laws of geometric optics – and that's true for CRT monitors, LCD panels, LED, OLED, and plasma displays, and anything else of this sort – the light rays simply carry the information about their actual distance, and this information can't be fooled. The image on the retina is blurred, and once it's blurred, it can never be unblurred (without a modification of the eye's refraction by glasses, lenses, etc.). This is a completely general claim for generic images.

Blurring is "irreversible" in the same sense as the entropy-increasing phenomena of thermodynamics, such as heat diffusion, because the detailed sharp information is lost. Even if one tried to apply some "sharpening" tools, they would have to act within or near the eye, not on the display's side. What the unfocused eye sees is always "more blurry" (analogous to "higher entropy"), and there is no way to arrange sources of light across many pixels so that they end up as a single sharp point on the retina.
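The information loss can be demonstrated directly in a toy 1-D model (an assumed 3-pixel averaging blur with circular boundaries, not a model of a real eye): two visibly different display images can produce exactly the same blurred retina image, so no display-side trick could recover the difference.

```python
import numpy as np

# Toy 1-D model (assumed 3-pixel averaging blur, circular boundaries):
# an out-of-focus "eye" maps a display image x to a retina image B @ x.
n = 9
B = np.array([[1.0 / 3.0 if (j - i) % n in (0, 1, n - 1) else 0.0
               for j in range(n)] for i in range(n)])

x1 = np.linspace(0.0, 1.0, n)                        # one display image
x2 = x1 + np.cos(2 * np.pi * np.arange(n) / 3.0)     # a visibly different one

# The added pattern has period 3, which the 3-pixel average sums to zero,
# so both display images land on the retina as the same blurred image:
# the sharp detail is simply gone, like heat that has already diffused.
print(np.allclose(B @ x1, B @ x2))   # prints: True
```

Since two distinct inputs give the same output, the blur map has no inverse, which is the "irreversibility" claimed above in miniature.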

A normal display can't "fake" the actual point where the photons originate – in this respect, it's completely analogous to all the other objects we routinely observe with our eyes. However, holograms can do it. One could construct holographic TVs, based on wave optics, which reconstruct a configuration of photons equivalent to an arbitrary arrangement of objects in 3D between the plate and the human eye.

Note that holography isn't some cheap "3D effect" that depends on our observing things with two eyes from two different points. Instead, holography creates genuine 3D images: the eyes must individually focus at various distances, depending on the distances of the parts of the objects seen in the hologram, and one may actually see what the 3D object looks like from all directions (something certainly impossible for stereoscopic "3D" technologies that rely on just two images).

In this way, you could create light rays that apparently originate from points closer to the human eye than the plate. This is still insufficient for astigmatism, because astigmatism is an "inconsistency" in the distance at which the eye sees objects sharply: one distance for horizontal detail, another for vertical detail.

However, in principle, it should be possible to produce holograms that may be seen sharply by an astigmatic eye, at least if the astigmatism is sufficiently "weak" or "perturbative". Some complicated calculation would have to be done to produce the right hologram that looks sharp to a general enough astigmatic eye.
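Under idealized paraxial scalar-wave assumptions (a sketch, not a hologram design), the target of such a calculation would be a "toric" quadratic phase profile across the plate, with different curvature along the two meridians, matched to the astigmatic eye's two sharp-focus distances:

```python
import numpy as np

# Idealized paraxial sketch (not a hologram design): the phase profile,
# across the hologram plate, of a wave that an astigmatic eye could see
# as a single sharp point. It is a "toric" quadratic phase whose
# curvature differs along x and y, matched to the eye's two sharp-focus
# distances z_h and z_v (both values assumed for illustration).
wavelength = 550e-9                 # green light, meters
k = 2.0 * np.pi / wavelength        # wavenumber
z_h, z_v = 0.45, 0.55               # assumed sharp-focus distances, meters

coords = np.linspace(-5e-3, 5e-3, 201)   # 1 cm plate aperture
X, Y = np.meshgrid(coords, coords)
phase = -k * (X**2 / (2.0 * z_h) + Y**2 / (2.0 * z_v))
print(phase.shape)
```

The anisotropic curvature is the whole point: an ordinary point source would have z_h equal to z_v, and it is precisely the difference between them that a plain display cannot emit but a computed hologram could.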