What Are Screens Doing to Our Eyes—and Our Ability to See?

Our eyes are hardening; we can barely see our phones anymore. We must learn to look at the wider world.

The eyes are unwell. Their childhood suppleness is lost. The lenses, as we log hours on this earth, thicken, stiffen, even calcify. The eyes are no longer windows on souls. They’re closer to teeth.

To see if your own eyes are hardening, look no further than your phone, which should require no exertion; you’re probably already there. Keep peering at your screen, reading and staring, snubbing life’s third dimension and natural hues. The first sign of the eyes’ becoming teeth is the squinting at phones. Next comes the reflexive extending of the arm, the impulse to resize letters into the preschool range. And at last the buying of drugstore readers.

Modern medicine offers little apart from magnifying glasses to treat presbyopia (from the Greek presbus, meaning “old man”). But those $3.99 specs will get you on your feet just fine, which is to say, you can once again relish your phone without squinting or arm-stretching. A remedy for farsightedness evidently succeeds to the degree that it restores a woman or man to the comfortable consumption of texts, email, ecommerce, and social media on a glazed rectangle of aluminum alloys held at a standard reading distance of 16 inches. With reading glasses we live again.

Doesn’t this seem like an unwholesome loop? The eyes may be unwell, but the primary object of our eyesight seems corrosive. We measure our vision against the phone, all the while suspecting the phone itself is compromising our ability to see it.

Even if we don’t say out loud that failing vision has something to do with our vastly narrowed visual field, our bodies seem to know what’s up. How convenient, for example, that you can turn up a phone’s contrast and brightness with a few taps. If perception can’t be improved, objects can be made more perceivable, right? But then the brightness seems, like morphine, to produce a need for more brightness, and you find yourself topping out, hitting the button in vain for more light only to realize that’s it. You’ve blinded yourself to the light that was already there.

Having recently, in my forties, gotten reading glasses, I now find myself having to choose between reading and being, since I can’t read without them and I can’t see the world with them. The glasses date from a time when reading was a much rarer pastime than being; you’d grope for them to see a book, while relying on your naked eyes for driving, talking, walking.

But of course now so many of us read all day long. And I opt to flood my field of vision with the merry play of pixels and emoji rather than the less scintillating, brown-gray “real world.” This means wearing the reading glasses, even on the street, and affecting blindness to everything but my phone.

What might modern vision be today without the phone as its reason for being? If you were a nomadic goatherd in the Mongolian grasslands, you might not even consider presbyopia a pathology. Many nomads carry cell phones for calls and music, but, except to play games, they rarely gaze at them. Instead, they rest their eyes on the ever-moving flock, alert to vagaries in the animals’ collective configuration and inclinations; but simultaneously they soften the vision to wide angle, so as to detect peripheral anomalies and threats. On camelback in the wide-open grasslands, the eyes align easily with the horizon, taking in distance, proximity, an unpixelated spectrum, and unsimulated movement. A panoramic view of the horizon line roots the beholder in the geometer’s simplest concepts of perspective: foreshortening, a vanishing point, linearity, and the changeable shadows cast by the movement of the sun over and under the horizon line. That third dimension—depth—is never, ever forgotten by the nomads. The sun rises and sets on depth.

Depending on your after-hours curriculum in Mongolia (cooking, talking, playing the fiddle), you might rarely even need to do what digital moderns never stop doing: recruit the eye’s ciliary muscle and contract it, releasing tension in the ligaments that suspend the lens so as to curve it acutely and train it on a pixelated 1.4-millimeter letter x on, for instance, a mobile news app. If you explained to a nomad the failures of her aging eyes, she might shrug: Who needs anxious ciliary muscles?

Indeed. And the use of those muscles by digital moderns gets even more complicated when we encounter our x’s not on paper—carbon-black ink, like liquid soot, inscribed on bleached pulpwood—but on screens. That’s where we come across the quivering and uncertain symbols that play across the—surface, is it? Where are they exactly? Somewhere on or in our devices. No wonder the eyes are unwell.

There are at least two recorded cases of something called smartphone blindness. The New England Journal of Medicine notes that both patients had been reading their phones in bed, on their sides, faces half-hidden, in the dark. “We hypothesized that the symptoms were due to differential bleaching of photopigment, with the viewing eye becoming light-adapted.” Differential bleaching of the eyes! Fortunately, smartphone blindness of this kind is transient.

The blanket term for screen-borne eyesight problems is computer vision syndrome, an unsatisfactory name given to the blurring, dry eyes, and headaches suffered by the people of the screen. The name is unsatisfactory because, like many syndromes, it describes a set of phenomena without situating them in a coherent narrative—medical or otherwise. For contrast, arc eye is a burn: Welders get it from their exposure to bright ultraviolet light. Snow blindness occurs when corneas are sunburned by light reflecting off snow. Hallucinations afflict lookouts because, as Ishmael explains in Moby-Dick, they’re up at odd hours and alone, parsing the “blending cadence of waves with thoughts” for danger, whales, or other vessels; the brain and eyes are inclined to make meaning and mirages of undifferentiated land- and seascapes where none exist.

Computer vision syndrome is not nearly as romantic. The American Optometric Association uses it to describe the discomfort that people report feeling after looking at screens for a “prolonged” period of time. When screens pervade the field of vision all day, what counts as prolonged? (Moreover, reports of discomfort seem like not much to predicate a whole syndrome on.) But the AOA’s treatment of the syndrome is intriguing. This is the so-called 20-20-20 rule, which asks that screen people take a 20-second break to look at something 20 feet away every 20 minutes.

The remedy helps us reverse-engineer the syndrome. This suffering is thought to be a function not of blue light or intrusive ads or bullying and other scourges. It’s thought to be a function of unbroken concentration on a screen 8 inches to 2 feet from the eyes. The person suffering eyestrain is taught to look 20 feet away, but presumably she could look at a painting or a wall at that distance. Twenty feet, though, suggests it’s depth she may be thirsty for.

The naming of a syndrome discharges the latest anxiety about screens, which have always been a source of social suspicion. People who are glued to screens to the exclusion of other people are regarded with disdain: narcissistic, withholding, deceitful, sneaky. This was true even with the panels that prefigured electronic screens, including shoji, as well as mirrors and newspaper broadsheets. The mirror-gazer may have been the first selfie fanatic, and in the heyday of mirrors the truly vain had handheld mirrors they toted around the way we carry phones. And hand fans and shoji—forget it. The concealing and revealing of faces allowed by fans and translucent partitions suggest the masquerade and deceptions of social media. An infatuation with screens can easily slide into a moral failing.

Not long ago a science writer named Gabriel Popkin began leading tree-identification walks for city dwellers in Washington, DC, many of whom do work that takes place on screens. That’s right, tree blindness—and the broader concept of blindness to the natural world—might be the real danger screens pose to vision. In 2012, Popkin had learned about trees to cure this blindness in himself, going from a naif who could barely pick out an oak to an amateur arboriculturist who can distinguish dozens of species. The biggest living beings in his city suddenly seemed like friends to him, with features he could recognize and relish.

Once he could see trees, they became objects of intense interest to him—more exhilarating than apps, if you can believe it. “Take a moment to watch and listen to a flowering redbud tree full of pollen-drunk bumblebees,” he has written. “I promise you won’t be bored.”

If computer vision syndrome has been invented as a catch-all to express a whole range of fears, those fears may not be confined to what blue light or too much close-range texting are doing to the eyesight. Maybe the syndrome is a broader blindness—eyes that don’t know how to see and minds that increasingly don’t know how to recognize nondigital artifacts, especially nature.

Lately, when I pull away from the screen to stare into the middle distance for a spell, I take off my glasses. I try to find a tree. If I’m inside, I open a window; if I’m outside, I will even approach a tree. I don’t want mediation or glass. The trees are still strangers; I hardly know their names yet, but I’m testing myself on leaf shapes and shades of green. All I know so far is that trees are very unlike screens. They’re a prodigious interface. Very buggy. When my eyes settle after a minute or two, I—what’s that expression, “the scales fell from my eyes”? It’s almost, at times, like that.