On January 31, 1882, a partially paralyzed man living with his brother and sister-in-law in a row house in Camden, New Jersey, wrote to a friend to tell him of a recent visitor to that home. “He is a fine large handsome youngster,” the man wrote of that guest. And “he had the good sense to take a great fancy to me.”

Thus Walt Whitman described the day he spent with Oscar Wilde. This meeting between the self-described “old rough” who revolutionized American poetry with his masterpiece Leaves of Grass and the self-anointed “Professor of Aesthetics” who was touring America with a lecture praising sconces and embroidered pillows has been examined often in the intervening years, usually through the lens of what is now called queer history, or as an interesting, if not particularly consequential, moment in the history of literature.

But neither approach takes the true measure of the meeting’s importance. For Wilde didn’t travel to Camden to talk about gender roles or belles lettres. The writer was still years away from becoming the author whose peerlessly witty plays are still staged today. What drew him to Whitman’s home was the opportunity to discuss fame. He wanted to listen to the singer of “Song of Myself”—an older man (Whitman was 62, Wilde 27) who, despite his infirmity, had inexhaustible energy for self-promotion. Whitman was an international icon who exploited the fuzzy line between acclaim and notoriety and a media-savvy poet who understood the crucial role of image in the making of a literary career. Wilde didn’t travel to Camden to learn how to be a famous writer. That, he was certain, he would later teach himself. He went to learn how to be a famous person. It would be hard to imagine a more apt pairing of teacher and student.

Wilde had been sent to America by Richard D’Oyly Carte, the business manager for Gilbert & Sullivan, whose latest operetta, Patience, had recently opened in the English capital to rave reviews and huge ticket sales. Carte had dispatched his clients’ previous hits to America, where they were well received. He planned to do the same with Patience, but he was nervous. Patience was a satire of the British aesthetic movement, which was united behind the slogan “art for art’s sake.”1

“The most beautiful things in the world are the most useless: peacocks and lilies, for instance,” the Oxford professor John Ruskin had written.

Aesthetes championed the use of decorative ornamentation in the making of furniture, ceramics, textiles, and so on, and proclaimed the superiority of handmade goods to mass-produced ones. Its poetic credo was summed up by Keats: “Beauty is truth, truth beauty,—that is all ye know on earth, and all ye need to know.”

To W.S. Gilbert, however, the movement was a nirvana for useless narcissists and a way for self-adoring dandies to natter away in public about their exquisite taste, a conviction he verbalized to great comic effect in Patience. The principal male roles in the operetta, Bunthorne and Grosvenor, two poets competing for the hand of a lass named Patience, were composites modeled on several leading aesthetes, among them: the painters Dante Gabriel Rossetti and James McNeill Whistler, the poet Algernon Swinburne, and the recent Oxford graduate Oscar Wilde—who claimed, with no justification, to be the leader of the movement. Wilde had just self-published his first book of poetry to some withering reviews, sarcastic cartoons in humor magazines such as Punch, and negligible sales.

What had made Carte nervous was that the aesthete was not a native species in the United States. Would American audiences get the jokes? A solution was put forth by the manager of Carte’s New York office: send over a “real” aesthete (maybe Oscar Wilde?) and have him present a series of lectures in America (“On Beauty,” perhaps?), delivered in the same “aesthetic” costume (satin breeches, shiny patent-leather pumps, form-fitting velvet jacket, and so on), worn by Bunthorne in Patience. A telegram was sent from New York to Wilde in London that claimed (falsely) that fifty American lecture agents were ready to book him, if he were available to speak. Wilde was nearly broke, so he answered, “Yes, if offer good.” It was: fifty percent of the box-office take, less expenses.

Wilde arrived on January 3, 1882, and, six days later, presented his first lecture, titled “The English Renaissance of Art,” to a sold-out house at Chickering Hall (seating capacity: 1,250) on lower Fifth Avenue. That a man virtually unknown to most Americans could achieve that commercial triumph was, in large part, the result of the New York press’s nearly nonstop coverage of Wilde’s preening at parties given by leading figures in Manhattan society in the nights before his lecture. “I stand [in] the reception rooms when I go out, and for two hours they defile past for introduction,” Wilde wrote of his socializing in New York to a friend in London. “I bow graciously and sometimes honour them with a royal observation.”

A reporter from Philadelphia interviewed Wilde as they took a train to that city, the second stop on his tour. “What poet do you most admire in America?” the reporter asked Wilde, who had won the prestigious Newdigate Prize for poetry, at Oxford.2

Past winners included Matthew Arnold and John Ruskin.

“I think Walt Whitman and [Ralph Waldo] Emerson have given the world more than anyone else,” he answered. “I do so hope to meet Mr. Whitman.” (Perhaps Wilde’s press agent had informed him that the poet lived nearby.) “I admire him intensely,” Wilde continued. “Dante Rossetti, [Algernon] Swinburne, William Morris and I often discuss him.” In reality, Swinburne and Wilde were mere acquaintances and had not often discussed anything. But that didn’t stop Wilde from adding, as if he were repeating something from their “frequent” discussions: “There is something so Greek and sane about [Whitman’s] poetry; it is so universal, so comprehensive.” After these words were published in the Philadelphia Press, Wilde got the response he was likely hoping for. Whitman sent this note to his hotel: “Walt Whitman will be in from 2 till 3 ½ this afternoon & will be most happy to see Mr. Wilde.”

“I come as a poet to call upon a poet,” Wilde said, when Whitman opened his door. Whitman, who adored being adored as few others ever have, was delighted to hear this. He went to the cupboard and removed a bottle of his sister-in-law Louisa’s homemade elderberry wine. The two men began to empty it.

They were unlikely drinking companions. Wilde had a double “first” from one of the most prestigious universities in the world; Whitman had left school at age eleven. Wilde was a polished talker and epigrammatist; Whitman spoke in short, occasionally ungrammatical bursts. Wilde was a snob; Whitman (in his own words) “talk[ed] readily with niggers.” Despite these differences, the two men enjoyed each other’s company. “I will call you Oscar,” Whitman said. “I like that so much,” Wilde replied. He was thrilled to be in such close proximity to the man who, as Wilde had hoped to do for himself, had launched his career with a self-published book of poems.

So Wilde accepted Whitman’s invitation to accompany him to his den on the third floor, where, as Whitman said, they could be on “thee and thou terms.” Wilde was shocked by the tiny room where Whitman wrote his verse. Dust was everywhere, and the only place for Wilde to sit, a low stool near Whitman’s desk, was covered by a messy pile of newspapers Whitman had saved because he was mentioned in them.

The American told his guest he admired the work of Britain’s poet laureate, Alfred, Lord Tennyson, yet noted that it was often “perfumed … to an extreme of sweetness.” He then asked: “Are not you young fellows going to shove the established idols aside, Tennyson and the rest?”

“Tennyson’s rank is too well fixed,” Wilde said, “and we love him too much. But he has not allowed himself to be a part of the living world…. We, on the other hand, move in the very heart of today.” That “we” was the aesthetic movement. “You are young and ardent,” Whitman said, “and the field is wide, and if you want my advice, [I say] go ahead.”

The real subject of Whitman’s conversation wasn’t literary form; it was how to build a career in public, with all the display that self-glorifying achievement requires. We can deduce that with confidence because the first thing Whitman did when he reached his den was to give his guest a photograph of himself. Whitman had pioneered the idea that a writer in search of fame should fashion himself as a literary artifact. When Leaves of Grass was self-published in 1855, it did not have Whitman’s name on the title page; instead, it had his portrait on the preceding page, showing the author standing tall in workman’s garb, his collar open, his left hand in one pocket of his slacks, his right resting on his hip, his bearded head topped by a hat set at a cocky angle, and his eyes meeting the reader with a stare simultaneously casual and challenging. No writer had ever presented himself to the public this way, let alone so intentionally. (Or with a visible button fly.) This frontispiece is now considered, the scholars Ed Folsom and Kenneth M. Price write, “the most famous in literary history.”

The portrait Whitman gave Wilde in 1882 appeared on his next book, Specimen Days & Collect, an assemblage of travel diaries, nature writing, and Civil War reminiscences. (Whitman had spent the war years in Washington, working as a government clerk and volunteering as a hospital visitor.) He is in profile in the photograph, sitting in a wicker chair wearing a wide-brimmed hat, an open-necked shirt, and a cardigan. A butterfly is perched on his index finger, held in front of his face. “I’ve always had the knack of attracting birds and butterflies,” Whitman once told a friend. Years later Whitman’s “butterfly” was found in the Library of Congress. It was made of cardboard; it had been tied to his finger with string.

By handing Wilde that photo Whitman was teaching him that fame as a writer is only partly about literature. It is also about committing oneself to a performance. Such role-playing isn’t the act of a phony; in Whitman’s mind every pose he struck was authentic. This type of authenticity—the fashioning of an image one would be faithful to in public—Wilde had experienced on a small scale playing the aesthete on the campus of Oxford’s Magdalen College and at parties in London. It was instructive to have its truth verified by a literary star who had proved its efficacy on an international scale. Wilde had always believed there was nothing inglorious about seeking glory. By handing Wilde his portrait, Whitman was confirming that instinct.

Days before he met Whitman, Wilde sat for the photographer Napoleon Sarony in New York, posing himself as an aesthetic Adonis in satin breeches. Following Whitman’s lead, he used these portraits as his “logo” as he crossed America delivering his lectures. He would present more than 140 of them and remain in the States for a year, becoming the second-most-recognized Briton in America, behind only Queen Victoria. (Not bad for a writer who’d hardly written anything.)

“God bless you, Oscar,” Whitman said, when Wilde left. A Philadelphian joked that it must have been hard for Wilde to swallow the homemade wine Whitman had offered. For once Wilde rejected an invitation to snobbery. “If it had been vinegar, I should have drunk it all the same,” he said. “I have an admiration for that man which I can hardly express.”

Has the Internet killed thoughtful, prolonged engagement with a text—or are we nostalgic for a reading Eden that never existed?

It is becoming a cliché of conversations between twentysomethings (especially to the right of 25) that if you talk about books or articles or strung-together words long enough, someone will eventually wail plaintively: “I just can’t reeeeeaaad anymore.” The person will explain that the Internet has shot her attention span. She will tell you about how, when she was small, she could lose herself in a novel for hours, and now, all she can do is watch the tweets swim by like glittery fish in the river of time-she-will-never-get-back. You will begin to chafe at what sounds like a humblebrag—I was precocious and remain an intellectual at heart or I feel oppressed by my active participation in the cultural conversation—but then you will realize, with an ache of recognition, that you are in the same predicament. “Yes,” you will gush, overcome by possibly invented memories of afternoons whiled away under a tree with Robertson Davies. “What happened to me? How do I fight it? Where did my concentration—oooh, cheese.”

Reading insecurity. It is the subjective experience of thinking that you’re not getting as much from reading as you used to. It is setting aside an hour for that new book about mass hysteria in a high school and spending it instead on Facebook (scrolling dumbly through photos of people you barely remember from your high school). It is deploring your attention span and missing the flow, the trance, of entering a narrative world without bringing the real one along. It is realizing that if Virginia Woolf was correct to call heaven “one continuous unexhausted reading,” then goodbye, you have been kicked out of paradise.

And reading insecurity is everywhere, from the many colleagues who told me they have the condition (“My power to concentrate and absorb is atrophied. And that’s reading a short novel like Cat’s Cradle, which I’ve been reading for a year now”) to the desperate call-to-arms among twentysomething friends that rarely leads anywhere: “Let’s form a book club!” (Yeah, right.) An assortment of new reading apps advances the idea that we must reimagine reading if we’re going to salvage it, their fizzy positivism—Reading 1.0 is “inefficient” and “frustrating.” Reading 2.0 is great!—masking the same why-isn’t-this-working anxiety. As a curative we have the unplugging movement. Books and articles probe the Way We Read Now: Teachers deplore it, kids seem unfazed by it, and millennials/late Gen Y-ers wonder whether to embrace or resist it. It is that last group—the ambivalent ones, who came of age just as the Internet was beginning to envelop society and can faintly remember glimmers of a prelapsarian past—that seems most susceptible to reading insecurity. Our nostalgia for print shades into nostalgia for childhood itself. We’ve landed in a different world from the one we started out in, but unlike our parents, we can’t retreat from it; we have to inherit it. We worry we’re not up to the task.

Science inflames this self-doubt, or at least reinforces the sense that something big has changed. A long train of studies suggests that people read the Internet differently than they read print. We skim and scan for the information we want, rather than starting at the beginning and plowing through to the end. Our eyes jump around, magnetized to links—they imply authority and importance—and short lines cocooned in white space. We’ll scroll if we have to, but we’d prefer not to. (Does the weightless descent invite a momentary disorientation, a lightheadedness? Or are we just lazy?) We read faster. “People tend not to read online in the traditional sense but rather to skim read, hop from one source to another, and ‘power browse,’ ” wrote the psychologists Val Hooper and Channa Herath in 2014.

And it is not just the choreography of reading that changes when ink gives way to pixels. It is the way we experience, integrate, and remember the content. In her insightful (online) review of online and print-based reading styles for The New Yorker, Maria Konnikova describes a study by the Norwegian scientist Anne Mangen, who asked students to digest a short story either as a Kindle e-book or as a paperback. Despite the two texts’ physical resemblance—“Kindle e-ink is designed to mimic the printed page,” Konnikova notes—the students who read from the paperback volume could better reconstruct the story’s plot. Likewise, when volunteers were asked to write an essay on a narrative they’d consumed either online or on paper, those who had received tangible books crafted superior responses.

So maybe we’re right to be worried about our e-reading. Maybe we’ve sensed that we rely on physical cues to ground ourselves in complex arguments, and that we get more of those from books than from flickering screens. Online, the fugitive flow of pixels makes the ideas themselves seem airy and ephemeral. Are those wisps then less likely to lodge in memory?

The notion that language might absorb the evanescent or permanent properties of its medium was a big deal in the Middle Ages, as written records began to supplant spoken traditions. Chaucer linked oral expression to flux and deceit: In a poem that partially serves as a cautionary tale about rumor, he connects the transience of love to the vanishing sound of a lover’s voice professing it. (One character, for example, asks why guys lie so much when they pledge their faith out loud: “O have ye men swich goodlihede/ in speche, and never a deel of trouthe?”) Even earlier, the Roman poet Catullus sarcastically urged women to write their promises in wind and running water—media appropriate to the fickleness of their words.* Of St. Augustine, lifted to heaven by the concrete reality and inarguable verity of ink on codex (he converted after opening a Bible), Andrew Piper writes: “It was above all else the graspability of the book, its being ‘at hand,’ that allowed it to play such a pivotal role. … The book’s graspability, in a material as well as a spiritual sense, is what endowed it with such immense power to radically alter our lives.”

Maybe this all seems somewhat egg-headed and woolly as an explanation for why Internet reading freaks us out, but I can’t help thinking that the hoary debate around “orality and literacy”—the slippery nature of one versus the stable authority of the other—is back, sort of. This time we’ve cast the new technology as the unreliable flibbertigibbet and the relic-like printed book as the trusty source. And after centuries of vaunting the solidity of written language, there’s a kind of whiplash in signing on and watching our literary output swoosh by.

Plus, and more prosaically, it is just much harder to concentrate when you read online. Email, IM, social media, and spiral-arms of infinite, alluring content are a click away. Once you pick a page, ads and hyperlinks beckon. In their 2014 paper, Hooper and Herath suggested that people’s comprehension suffered when they read the Internet because the barrage of extraneous stimuli interrupted the transfer of information from sensory to working memory, and from working to long-term memory. Experts have posited the extinction of the “deep reading brain” if we do not learn to tune out the Web’s distractions. (This is the kind of pronouncement that will make you sick with reading insecurity.) Some of my friends and colleagues say that they can feel their deep reading brains rallying if they boycott the Internet for a while, which at least implies that the syndrome is reversible. Yet most of our jobs in the information economy require a daily mind-meld with Dr. Google. Reading insecurity has a way of reminding you just how e-dependent you are.

For me, floating behind all the talk of our frazzled attention is a veil of guilt and blame: It’s your fault! You could sit down and do this if you wanted to. You could savor stuff on a screen—didn’t you just binge-watch the entirety of High Maintenance last night? Yet the profusion of the Internet also changes the calculus of how long I’m willing to spend on a given story. I’m not alone: People report more impatience when they read from their computers. In reading as with everything else, we’re haunted by FOMO and the search for novelty: “We are sponges and we live in a world where the fire hose is always on,” wrote David Carr in the New York Times. Jakob Nielsen, who studies the mechanics of Internet perusal, put it more bluntly: “Users are selfish, lazy, and ruthless.”

So maybe the answer is just to close the laptop and read more books. Books! Hallelujah. Except that it sometimes feels as though we are Typhoid Marys, transferring our diseased Web habits back to print. A colleague around my age told me she now thinks of books as tabs: flitting distractedly between them, she is often forced to retrace her steps. I feel selfish, lazy, and ruthless even when met by the generosity of a sunny afternoon and a novel; sometimes I wonder whether a portal has permanently closed.

Yet the Web giveth, even as it taketh away. The good news is that, insecure or not, we are all reading more. Thanks to the Internet, words are everywhere; e-readers are light, slim, and cost-effective; our faster reading pace means we can range more widely. And yes, there are wonderful advantages to the onscreen reading experience, including searchable keywords, toolbars, the ability to look up anything. A different colleague, working on a historical project, raved to me about the obscure diaries he was able to unearth online—without the Web, he would have needed to travel to an archive in another state to find them, and would have had to scarf them down before the building closed at 5. The accessibility of the documents on the Internet, he explained, allowed for deep and prolonged engagement. And of course this was in addition to the breadth of knowledge afforded: You can’t overstate the vast contextualizing power of more than 1 billion websites.
I also realize, typing this confession of pathological distractibility, that I may be pining for an Eden of immersive focus that never existed. Did I ever really spend six hours with my face in a book? Was my imagination truly so unfettered from the concerns of everyday life—and, if so, isn’t that a childhood thing, not a technology thing? Twelve-year-old me never had a Google alert wrench her out of Francie’s Brooklyn so that she could write her roommate a check for the rent. She definitely wasn’t expected to know what was going on in Syria.

Still, the dissatisfaction lingers. In his 1988 study of ludic (or pleasure) reading, Victor Nell found that we read slower when we like a text. Our brains enter a state of arousal that resembles hypnosis. There is trance and transportation—which might explain why, 30 years later, adults prefer to encounter Darcy and Dracula offline, where they are less conditioned to skim, jump around, and be generally restless. In a recent survey of several hundred men and women over the age of 18, most respondents said they enjoyed print books more than e-books, though they were content to gather information from either format. The researchers suggested that pleasure reading requires a deeper engagement with the text, one facilitated by the kind of sustained, linear attention (and ability to annotate) that print books promote. In other words, when we bemoan that we don’t reeeeeaaad any more, we are mourning a specific kind of reading—and it is precisely this kind that seems to shimmer beyond our reach online.


Why Walking Helps Us Think

BY FERRIS JABR


In Vogue’s 1969 Christmas issue, Vladimir Nabokov offered some advice for teaching James Joyce’s “Ulysses”: “Instead of perpetuating the pretentious nonsense of Homeric, chromatic, and visceral chapter headings, instructors should prepare maps of Dublin with Bloom’s and Stephen’s intertwining itineraries clearly traced.” He drew a charming one himself. Several decades later, a Boston College English professor named Joseph Nugent and his colleagues put together an annotated Google map that shadows Stephen Dedalus and Leopold Bloom step by step. The Virginia Woolf Society of Great Britain, as well as students at the Georgia Institute of Technology, have similarly reconstructed the paths of the London amblers in “Mrs. Dalloway.”

Such maps clarify how much these novels depend on a curious link between mind and feet. Joyce and Woolf were writers who transformed the quicksilver of consciousness into paper and ink. To accomplish this, they sent characters on walks about town. As Mrs. Dalloway walks, she does not merely perceive the city around her. Rather, she dips in and out of her past, remolding London into a highly textured mental landscape, “making it up, building it round one, tumbling it, creating it every moment afresh.”

Since at least the time of peripatetic Greek philosophers, many other writers have discovered a deep, intuitive connection between walking, thinking, and writing. (In fact, Adam Gopnik wrote about walking in The New Yorker just two weeks ago.) “How vain it is to sit down to write when you have not stood up to live!” Henry David Thoreau penned in his journal. “Methinks that the moment my legs begin to move, my thoughts begin to flow.” Thomas De Quincey calculated that William Wordsworth—whose poetry is filled with tramps up mountains, through forests, and along public roads—walked as many as a hundred and eighty thousand miles in his lifetime, which comes to an average of six and a half miles a day starting from age five.

What is it about walking, in particular, that makes it so amenable to thinking and writing? The answer begins with changes to our chemistry. When we go for a walk, the heart pumps faster, circulating more blood and oxygen not just to the muscles but to all the organs—including the brain. Many experiments have shown that after or during exercise, even very mild exertion, people perform better on tests of memory and attention. Walking on a regular basis also promotes new connections between brain cells, staves off the usual withering of brain tissue that comes with age, increases the volume of the hippocampus (a brain region crucial for memory), and elevates levels of molecules that both stimulate the growth of new neurons and transmit messages between them.

The way we move our bodies further changes the nature of our thoughts, and vice versa. Psychologists who specialize in exercise music have quantified what many of us already know: listening to songs with high tempos motivates us to run faster, and the swifter we move, the quicker we prefer our music. Likewise, when drivers hear loud, fast music, they unconsciously step a bit harder on the gas pedal. Walking at our own pace creates an unadulterated feedback loop between the rhythm of our bodies and our mental state that we cannot experience as easily when we’re jogging at the gym, steering a car, biking, or during any other kind of locomotion. When we stroll, the pace of our feet naturally vacillates with our moods and the cadence of our inner speech; at the same time, we can actively change the pace of our thoughts by deliberately walking more briskly or by slowing down.

Because we don’t have to devote much conscious effort to the act of walking, our attention is free to wander—to overlay the world before us with a parade of images from the mind’s theatre. This is precisely the kind of mental state that studies have linked to innovative ideas and strokes of insight. Earlier this year, Marily Oppezzo and Daniel Schwartz of Stanford published what is likely the first set of studies that directly measure the way walking changes creativity in the moment. They got the idea for the studies while on a walk. “My doctoral advisor had the habit of going for walks with his students to brainstorm,” Oppezzo says of Schwartz. “One day we got kind of meta.”

In a series of four experiments, Oppezzo and Schwartz asked a hundred and seventy-six college students to complete different tests of creative thinking while either sitting, walking on a treadmill, or sauntering through Stanford’s campus. In one test, for example, volunteers had to come up with atypical uses for everyday objects, such as a button or a tire. On average, the students thought of between four and six more novel uses for the objects while they were walking than when they were seated. Another experiment required volunteers to contemplate a metaphor, such as “a budding cocoon,” and generate a unique but equivalent metaphor, such as “an egg hatching.” Ninety-five per cent of students who went for a walk were able to do so, compared to only fifty per cent of those who never stood up. But walking actually worsened people’s performance on a different type of test, in which students had to find the one word that united a set of three, like “cheese” for “cottage, cream, and cake.” Oppezzo speculates that, by setting the mind adrift on a frothing sea of thought, walking is counterproductive to such laser-focussed thinking: “If you’re looking for a single correct answer to a question, you probably don’t want all of these different ideas bubbling up.”

Where we walk matters as well. In a study led by Marc Berman of the University of South Carolina, students who ambled through an arboretum improved their performance on a memory test more than students who walked along city streets. A small but growing collection of studies suggests that spending time in green spaces—gardens, parks, forests—can rejuvenate the mental resources that man-made environments deplete. Psychologists have learned that attention is a limited resource that continually drains throughout the day. A crowded intersection—rife with pedestrians, cars, and billboards—bats our attention around. In contrast, walking past a pond in a park allows our mind to drift casually from one sensory experience to another, from wrinkling water to rustling reeds.

Still, urban and pastoral walks likely offer unique advantages for the mind. A walk through a city provides more immediate stimulation—a greater variety of sensations for the mind to play with. But, if we are already at the brink of overstimulation, we can turn to nature instead. Woolf relished the creative energy of London’s streets, describing it in her diary as “being on the highest crest of the biggest wave, right in the centre & swim of things.” But she also depended on her walks through England’s South Downs to “have space to spread my mind out in.” And, in her youth, she often travelled to Cornwall for the summer, where she loved to “spend my afternoons in solitary trampling” through the countryside.

Perhaps the most profound relationship between walking, thinking, and writing reveals itself at the end of a stroll, back at the desk. There, it becomes apparent that writing and walking are extremely similar feats, equal parts physical and mental. When we choose a path through a city or forest, our brain must survey the surrounding environment, construct a mental map of the world, settle on a way forward, and translate that plan into a series of footsteps. Likewise, writing forces the brain to review its own landscape, plot a course through that mental terrain, and transcribe the resulting trail of thoughts by guiding the hands. Walking organizes the world around us; writing organizes our thoughts. Ultimately, maps like the one that Nabokov drew are recursive: they are maps of maps.


Courtesy of A Small Press Life and Mental Floss, life is much, much bigger than LOL, BFN or even WTF. I’ll be attempting to Do the Bear and trust I’ll not Cop a Mouse. Here’s hoping we’ll all Take the Egg……

In 1909, writing under the pseudonym James Redding Ware, British writer Andrew Forrester published Passing English of the Victorian era, a dictionary of heterodox English, slang and phrase. “Thousands of words and phrases in existence in 1870 have drifted away, or changed their forms, or been absorbed, while as many have been added or are being added,” he writes in the book’s introduction. “‘Passing English’ ripples from countless sources, forming a river of new language which has its tide and its ebb, while its current brings down new ideas and carries away those that have dribbled out of fashion.” Forrester chronicles many hilarious and delightful words in Passing English; we don’t know how these phrases ever fell out of fashion, but we propose bringing them back.

1. Afternoonified

A society word meaning “smart.” Forrester demonstrates the usage: “The goods are not ‘afternoonified’ enough for me.”

2. Arfarfan’arf

A figure of speech used to describe drunken men. “He’s very arf’arf’an’arf,” Forrester writes, “meaning he has had many ‘arfs,’” or half-pints of booze.

3. Back slang it

Thieves used this term to indicate that they wanted “to go out the back way.”

4. Bags o’ Mystery

An 1850 term for sausages, “because no man but the maker knows what is in them. … The ‘bag’ refers to the gut which contained the chopped meat.”

5. Bang up to the elephant

This phrase originated in London in 1882, and means “perfect, complete, unapproachable.”

6. Batty-fang

Low London phrase meaning “to thrash thoroughly,” possibly from the French battre à fin.

7. Benjo

Nineteenth century sailor slang for “A riotous holiday, a noisy day in the streets.”

8. Bow wow mutton

A naval term referring to meat so bad “it might be dog flesh.”

9. Bricky

Brave or fearless. “Adroit after the manner of a brick,” Forrester writes, “said even of the other sex, ‘What a bricky girl she is.’”

10. Bubble Around

A verbal attack, generally made via the press. Forrester cites The Golden Butterfly: “I will back a first-class British subject for bubbling around against all humanity.”

11. Butter Upon Bacon

Extravagance. Too much extravagance. “Are you going to put lace over the feather, isn’t that rather butter upon bacon?”

12. Cat-lap

A London society term for tea and coffee “used scornfully by drinkers of beer and strong waters … in club-life is one of the more ignominious names given to champagne by men who prefer stronger liquors.”

13. Church-bell

A talkative woman.

14. Chuckaboo

A nickname given to a close friend.

15. Collie shangles

Quarrels. A term from Queen Victoria’s journal, More Leaves, published in 1884: “At five minutes to eleven rode off with Beatrice, good Sharp going with us, and having occasional collie shangles (a Scottish word for quarrels or rows, but taken from fights between dogs) with collies when we came near cottages.”

16. Cop a Mouse

To get a black eye. “Cop in this sense is to catch or suffer,” Forrester writes, “while the colour of the obligation at its worst suggests the colour and size of the innocent animal named.”

17. Daddles

A delightful way to refer to your rather boring hands.

18. Damfino

This creative cuss is a contraction of “damned if I know.”

19. Dizzy Age

A phrase meaning “elderly,” because it “makes the spectator giddy to think of the victim’s years.” The term usually refers to “a maiden or other woman canvassed by other maiden ladies or others.”

20. Doing the Bear

“Courting that involves hugging.”

21. Don’t sell me a dog

Popular until 1870, this phrase meant “Don’t lie to me!” Apparently, people who sold dogs back in the day were prone to trying to pass off mutts as purebreds.

22. Door-knocker

A type of beard “formed by the cheeks and chin being shaved leaving a chain of hair under the chin, and upon each side of mouth forming with moustache something like a door-knocker.”

23. Enthuzimuzzy

“Satirical reference to enthusiasm.” Created by Braham the terror, whoever that is.

24. Fifteen puzzle

Not the game you might be familiar with, but a term meaning complete and absolute confusion.

25. Fly rink

An 1875 term for a polished bald head.

26. Gal-sneaker

An 1870 term for “a man devoted to seduction.”

27. Gas-Pipes

A term for especially tight pants.

28. Gigglemug

“An habitually smiling face.”

29. Got the morbs

Use of this 1880 phrase indicated temporary melancholy.

30. Half-rats

Partially intoxicated.

31. Jammiest bits of jam

“Absolutely perfect young females,” circa 1883.

32. Kruger-spoof

Lying, from 1896.

33. Mad as Hops

Excitable.

34. Mafficking

An excellent word that means getting rowdy in the streets.

35. Make a stuffed bird laugh

“Absolutely preposterous.”

36. Meater

A street term meaning coward.

37. Mind the Grease

When walking or otherwise getting around, you could ask people to let you pass, please. Or you could ask them to mind the grease, which meant the same thing to Victorians.

38. Mutton Shunter

This 1883 term for a policeman is so much better than “pig.”

39. Nanty Narking

A tavern term, popular from 1800 to 1840, that meant great fun.

40. Nose bagger

Someone who takes a day trip to the beach. He brings his own provisions and doesn’t contribute at all to the resort he’s visiting.

41. Not up to Dick

Not well.

42. Orf chump

No appetite.

43. Parish Pick-Axe

A prominent nose.

44. Podsnappery

This term, Forrester writes, describes a person with a “wilful determination to ignore the objectionable or inconvenient, at the same time assuming airs of superior virtue and noble resignation.”

45. Poked Up

Embarrassed.

46. Powdering Hair

An 18th century tavern term that means “getting drunk.”

47. Rain Napper

An umbrella.

48. Sauce-box

The mouth.

49. Shake a flannin

Why say you’re going to fight when you could say you’re going to shake a flannin instead?

50. Shoot into the brown

To fail. According to Forrester, “The phrase takes its rise from rifle practice, where the queer shot misses the black and white target altogether, and shoots into the brown i.e., the earth butt.”

51. Skilamalink

Secret, shady, doubtful.

52. Smothering a Parrot

Drinking a glass of absinthe neat; named for the green color of the booze.

53. Suggestionize

A legal term from 1889 meaning “to prompt.”

54. Take the Egg

To win.

55. Umble-cum-stumble

According to Forrester, this low class phrase means “thoroughly understood.”

56. Whooperups

A term meaning “inferior, noisy singers” that could be used liberally today during karaoke sessions.

Why you should never, ever use two spaces after a period.

Can I let you in on a secret? Typing two spaces after a period is totally, completely, utterly, and inarguably wrong.

And yet people who use two spaces are everywhere, their ugly error crossing every social boundary of class, education, and taste.* You’d expect, for instance, that anyone savvy enough to read Slate would know the proper rules of typing, but you’d be wrong; every third email I get from readers includes the two-space error. (In editing letters for “Dear Farhad,” my occasional tech-advice column, I’ve removed enough extra spaces to fill my forthcoming volume of melancholy epic poetry, The Emptiness Within.) The public relations profession is similarly ignorant; I’ve received press releases and correspondence from the biggest companies in the world that are riddled with extra spaces. Some of my best friends are irredeemable two-spacers, too, and even my wife has been known to use an unnecessary extra space every now and then (though she points out that she does so only when writing to other two-spacers, just to make them happy).

What galls me about two-spacers isn’t just their numbers. It’s their certainty that they’re right. Over Thanksgiving dinner last year, I asked people what they considered to be the “correct” number of spaces between sentences. The diners included doctors, computer programmers, and other highly accomplished professionals. Everyone—everyone!—said it was proper to use two spaces. Some people admitted to slipping sometimes and using a single space—but when writing something formal, they were always careful to use two. Others explained they mostly used a single space but felt guilty for violating the two-space “rule.” Still others said they used two spaces all the time, and they were thrilled to be so proper. When I pointed out that they were doing it wrong—that, in fact, the correct way to end a sentence is with a period followed by a single, proud, beautiful space—the table balked. “Who says two spaces is wrong?” they wanted to know.

Typographers, that’s who. The people who study and design the typewritten word decided long ago that we should use one space, not two, between sentences. That convention was not arrived at casually. James Felici, author of The Complete Manual of Typography, points out that the early history of type is one of inconsistent spacing. Hundreds of years ago, some typesetters would end sentences with a double space, others would use a single space, and a few renegades would use three or four spaces. Inconsistency reigned in all facets of written communication; there were few conventions regarding spelling, punctuation, character design, and ways to add emphasis to type. But as typesetting became more widespread, its practitioners began to adopt best practices. Felici writes that typesetters in Europe began to settle on a single space around the early 20th century. America followed soon after.

Every modern typographer agrees on the one-space rule. It’s one of the canonical rules of the profession, in the same way that waiters know that the salad fork goes to the left of the dinner fork and fashion designers know to put men’s shirt buttons on the right and women’s on the left. Every major style guide—including the Modern Language Association Style Manual and the Chicago Manual of Style—prescribes a single space after a period. (The Publications Manual of the American Psychological Association, used widely in the social sciences, allows for two spaces in draft manuscripts but recommends one space in published work.) Most ordinary people would know the one-space rule, too, if it weren’t for a quirk of history. In the middle of the last century, a now-outmoded technology—the manual typewriter—invaded the American workplace. To accommodate that machine’s shortcomings, everyone began to type wrong. And even though we no longer use typewriters, we all still type like we do. (Also see the persistence of the dreaded Caps Lock key.)

The problem with typewriters was that they used monospaced type—that is, every character occupied an equal amount of horizontal space. This bucked a long tradition of proportional typesetting, in which skinny characters (like I or 1) were given less space than fat ones (like W or M). Monospaced type gives you text that looks “loose” and uneven; there’s a lot of white space between characters and words, so it’s more difficult to spot the spaces between sentences immediately. Hence the adoption of the two-space rule—on a typewriter, an extra space after a sentence makes text easier to read. Here’s the thing, though: Monospaced fonts went out in the 1970s. First electric typewriters and then computers began to offer people ways to create text using proportional fonts. Today nearly every font on your PC is proportional. (Courier is the one major exception.) Because we’ve all switched to modern fonts, adding two spaces after a period no longer enhances readability, typographers say. It diminishes it.

Type professionals can get amusingly—if justifiably—overworked about spaces. “Forget about tolerating differences of opinion: typographically speaking, typing two spaces before the start of a new sentence is absolutely, unequivocally wrong,” Ilene Strizver, who runs the typographic consulting firm The Type Studio, once wrote. “When I see two spaces I shake my head and I go, Aye yay yay,” she told me. “I talk about ‘type crimes’ often, and in terms of what you can do wrong, this one deserves life imprisonment. It’s a pure sign of amateur typography.” “A space signals a pause,” says David Jury, the author of About Face: Reviving The Rules of Typography. “If you get a really big pause—a big hole—in the middle of a line, the reader pauses. And you don’t want people to pause all the time. You want the text to flow.”

This readability argument is debatable. Typographers can point to no studies or any other evidence proving that single spaces improve readability. When you press them on it, they tend to cite their aesthetic sensibilities. As Jury says, “It’s so bloody ugly.”

But I actually think aesthetics are the best argument in favor of one space over two. One space is simpler, cleaner, and more visually pleasing. (It also requires less work, which isn’t nothing.) A page of text with two spaces between every sentence looks riddled with holes; a page of text with an ordinary space looks just as it should.

Is this arbitrary? Sure it is. But so are a lot of our conventions for writing. It’s arbitrary that we write shop instead of shoppe, or phone instead of fone, or that we use ! to emphasize a sentence rather than %. We adopted these standards because practitioners of publishing—writers, editors, typographers, and others—settled on them after decades of experience. Among their rules was that we should use one space after a period instead of two—so that’s how we should do it.

Besides, the argument in favor of two spaces isn’t any less arbitrary. Samantha Jacobs, a reading and journalism teacher at Norwood High School in Norwood, Colo., told me that she requires her students to use two spaces after a period instead of one, even though she acknowledges that style manuals no longer favor that approach. Why? Because that’s what she’s used to. “Primarily, I base the spacing on the way I learned,” she wrote me in an email glutted with extra spaces.

Several other teachers gave me the same explanation for pushing two spaces on their students. But if you think about it, that’s a pretty backward approach: The only reason today’s teachers learned to use two spaces is because their teachers were in the grip of old-school technology. We would never accept teachers pushing other outmoded ideas on kids because that’s what was popular back when they were in school. The same should go for typing. So, kids, if your teachers force you to use two spaces, send them a link to this article. Use this as your subject line: “If you type two spaces after a period, you’re doing it wrong.”

*Correction: This article originally asserted that—in a series of emails described as “overwrought, self-important, and dorky”—WikiLeaks founder Julian Assange used two spaces after every period. Assange actually used a monospace font, which made the text of his emails appear loose and uneven. (Return.)

Farhad Manjoo is a technology columnist for the New York Times and the author of True Enough.