History is driven by ideas and passions, and by unforeseeable events. The modern world might be very different if the German generals’ attempt to kill Adolf Hitler had succeeded, or if Lee Harvey Oswald’s attempt to kill John F. Kennedy had failed.

History is also driven by science and technology. Science is at the center of many of today’s political arguments—about climate change, evolution, definitions of the beginning and end of life. The technologies of economic growth (cars, factories, power plants) and of weapons production have created problems that other technologies (of conservation and miniaturization and communication) will presumably help solve.

The Atlantic was founded largely as an anti-slavery journal at a moment when technology was about to seriously affect the course of that debate. (Against the productive might of the industrialized North, the states of the slaveholding South stood very little chance in drawn-out warfare.) Ever since that time, even though its emphasis has been on public affairs and the arts, The Atlantic has consistently noted—with excitement, occasional concern, and serious attention—the inventions and discoveries of each age. These eight excerpts show the attempts of writers, frozen at particular moments in technological time, to imagine how the gizmos and breakthroughs they have just seen will matter in the long run. As a group they illustrate how prescient such assessments can turn out to be—and how silly. The more detailed a writer becomes about the scale and impact of an invention, the greater the potential giggle factor in retrospect. Airplanes that cut the travel time between Vienna and Paris to a mere ten hours! Word processors that spare you the need to hit “return” after you type each line! (The source of that last insight was, um, me, in an article twenty-four years ago. I should probably note at this point that I’m just writing the introduction—I didn’t choose the passages.)

But the harder a writer has tried to connect the technology of the moment to the permanent nature of individual and social life, the more prescient the assessment is likely to seem. The most famous of these passages is Vannevar Bush’s “As We May Think,” which in 1945, before the first transistor existed, imagined the structure and value of the modern World Wide Web. What Gilbert Seldes wrote in 1937 about television’s likely effect on styles of thought, what Oliver Wendell Holmes foresaw in 1859 about how photography would change our view of the physical world, and most of what the other writers predicted stands up well now. And what Mark Twain wrote in 1880 applies to a predicament as fresh and modern as hearing one side of a cell-phone call. —James Fallows

In the mid-nineteenth century, photography was in its infancy. Louis Daguerre had developed the daguerreotype in 1837, and by the 1850s travel photography and photographic portraiture were beginning to catch on. In a much-cited 1859 essay, Oliver Wendell Holmes impressed upon readers the revolutionary implications of this new technology.

Form is henceforth divorced from matter. In fact, matter as a visible object is of no great use any longer, except as the mould on which form is shaped. Give us a few negatives of a thing worth seeing, taken from different points of view, and that is all we want of it. Pull it down or burn it up, if you please. We must, perhaps, sacrifice some luxury in the loss of color; but form and light and shade are the great things, and even color can be added, and perhaps by and by may be got direct from Nature.

There is only one Coliseum or Pantheon; but how many millions of potential negatives have they shed,—representatives of billions of pictures,—since they were erected! Matter in large masses must always be fixed and dear; form is cheap and transportable. We have got the fruit of creation now, and need not trouble ourselves with the core. Every conceivable object of Nature and Art will soon scale off its surface for us. Men will hunt all curious, beautiful, grand objects, as they hunt the cattle in South America, for their skins, and leave the carcasses as of little worth.

Volume 3, No. 20, pp. 738–748

A Telephonic Conversation June 1880 By Mark Twain

Mark Twain’s family was one of the first in Hartford to install a telephone (which had been patented by Alexander Graham Bell in 1876) in its home. In 1880, Twain, bemused by this new device that permitted eavesdroppers to hear only one side of a conversation, wrote an amusing description of overhearing his wife talk on the telephone.

I consider that a conversation by telephone—when you are simply sitting by and not taking any part in that conversation—is one of the solemnest curiosities of this modern life. Yesterday I was writing a deep article on a sublime philosophical subject while such a conversation was going on in the room. I notice that one can always write best when somebody is talking through a telephone close by. Well, the thing began in this way. A member of our household came in and asked me to have our house put into communication with Mr. Bagley’s, down town. I have observed, in many cities, that the [gentle] sex always shrink from calling up the central office themselves. I don’t know why, but they do. So I touched the bell, and this talk ensued:

Central Office. [Gruffly.] Hello!

I. Is it the Central Office?

C. O. Of course it is. What do you want?

I. Will you switch me on to the Bagleys, please?

C. O. All right. Just keep your ear to the telephone.

Then I heard, k-look, k-look, k’look? klook-klook-klook-look-look! then a horrible “gritting” of teeth, and finally a piping female voice: “Y-e-s? [Rising inflection.] Did you wish to speak to me?”

Without answering, I handed the telephone to the applicant, and sat down. Then followed that queerest of all the queer things in this world,—a conversation with only one end to it. You hear questions asked; you don’t hear the answer. You hear invitations given; you hear no thanks in return. You have listening pauses of dead silence, followed by apparently irrelevant and unjustifiable exclamations of glad surprise, or sorrow, or dismay. You can’t make head or tail of the talk, because you never hear anything that the person at the other end of the wire says.

Volume 45, No. 272, pp. 841–843

The New Talking Machines February 1889 By Philip G. Hubert Jr.

In 1889, Philip G. Hubert Jr., a noted architect and writer (his 1893 book Men of Achievement: Inventors remains in print today), commended Thomas Edison for his progress in developing the phonograph and predicted great things for its future, including books on “phonograms” and music reviews accompanied by sound clips.

Edison has devoted nearly two years to the task of making the phonograph of commercial use. He believes that he has succeeded. Whether or not the instrument shall enter into every-day life, as the telephone has done, is a question for the future … Whether Mr. Edison, or Mr. Bell, or some one else puts the final touches which will take the apparatus out of the laboratory and make it practical for common use does not much matter. Some one will certainly do it. Those persons who smile incredulously when it is said that the perfected phonograph will do away with letter-writing, will read to us, will sing and play for us, will give us books, music, plays, speeches, at almost no cost, and become a constant source of instruction and amusement, must have forgotten the ridicule they heaped upon the rumor that an American inventor proposed to talk from New York to Chicago …

As compared with the field of the telephone, that of the phonograph is limitless. The telephone must always remain somewhat of an expensive luxury, owing to the cost of maintaining wires, connecting stations, etc. The whole expense of the phonograph will be the first cost … Imagine what the phonograph will do for the man on the borders of civilization! It will supply him with books in a far more welcome shape than print, for they will read themselves; the mail will bring him the latest play of London, or opera of Vienna. If he cares for political speeches, he can have the Congressional Record in the shape of phonograms. It is even possible to imagine that many books and stories may not see the light of print at all; they will go into the hands of their readers, or hearers rather, as phonograms …

I really see no reason why the newspaper of the future should not come to the subscriber in the shape of a phonogram … Think what a musical critic might be able to do for his public! He might give them whole arias from an opera or movements from a symphony, by way of proof or illustration.

Volume 63, No. 376, pp. 256–261

Life as We Know It July 1924 By Arthur D. Little

In the early years of the twentieth century, standardization, mass production, and the rise of consumer culture combined with new scientific advances to transform the everyday lives of Americans. In 1924, Arthur D. Little, the MIT-educated chemical engineer who in 1886 founded the world’s first consulting company, took note of some of those dramatic changes.

Within the last ten years the United States has become the first industrial nation of the world …

All those things that relieve household labor of its drudgery have their assured place in the future. Nature abhors a vacuum only because she has no carpets and rugs to clean. More and more homes will be equipped with electric appliances: toasters, irons, and washing machines; and the electric refrigerator is almost here …

We have seen old-time necessities—as candles and open fires—come to be classed as luxuries … Where our plutocrats progressed ten miles an hour behind a pair of horses, our workmen now go thirty in a Ford …

Oil has become as essential as gunpowder to the navies of the world, and almost as dangerous to our politicians. On land the tank-wagon is already as familiar as the coal-truck, and the convenience and temporary cheapness of fuel oil have caused it to replace coal in many thousands of plants and dwellings. This tendency will continue for a time until scarcity and science put new values on petroleum …

The sales of radio equipment reached a total of $150,000,000 last year and are expected to double in 1924. The earth has become a whispering gallery, and the ocean has lost its solitude. The farm is no longer isolated, and the newspapers, the theaters, and the pulpit have a new competitor.

Man is no longer bound to the earth. He has achieved a three-dimensional existence … It is now possible to fly from Vienna to Paris in ten hours and from Strasbourg to Constantinople in thirty …

During the last fifty years science and invention have led us further and further from the world that was; deeper and deeper into a new environment. The process of change has been so rapid that readjustment has been difficult. Yet readjust ourselves we must.

Volume 134, No. 1, pp. 36–45

Television and Radio May 1937 By Gilbert Seldes

In 1937, the impending commercial launch of television inspired Gilbert Seldes, a commentator on popular culture and the author of The Seven Lively Arts (1924), to consider how this new technology would affect radio. Soon after this essay appeared, he became the first director of programming for CBS.

Sometime in the middle of 1938, television sets may be put on sale in the United States …

The effect of television on radio will be so gradual that we may be able to preserve whatever in radio is desirable … Because of radio, more of us took setting-up exercises in the morning, with possible improvement in our health … Those who could not read found a new interest; oratory was restored to its ancient glory in Presidential campaigns; the difference between the city and the country was made less, vaudeville artists got jobs, book sales increased; farmers knew the price paid for stock and grain in Chicago and Minneapolis … millions of people, totally indifferent to social movements and international affairs and totally unhabituated to reading about such things, have become aware of them through news broadcasts and commentary …

It is desirable for us to know what price we have paid for the creation of this incomparable engine of social influence: we have certainly created a habit of almost indiscriminate, almost apathetic listening; through the air has come a really incalculable number of stupidities; much that is trite and tasteless comes with what is intelligent and bright. A critic of society would have a delicate job to determine how far radio has corrupted and how far improved the public taste, and the very existence of a power so great as that of radio seems menacing to many observers …

The audience which television will create will be more attentive and, if properly handled, more suggestible even than the audience of radio.

Volume 159, No. 5, pp. 531–541

As We May Think July 1945 By Vannevar Bush

Near the close of World War II, Vannevar Bush, the former director of the wartime Office of Scientific Research and Development, urged scientists to turn their energies from war to the task of making the vast store of human knowledge accessible and useful. The “infostructure” he sketched out—including a proposal for what might be seen as a kind of precursor to hypertext—was destined to be realized in what we now know as the Internet.

The human mind … operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain …

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve, for his records have relative permanency …

Consider a future device for individual use, which is a sort of mechanized private file and library. It needs a name, and, to coin one at random, “memex” will do. A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory …

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item …

Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities … The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology …

The inheritance from the master becomes, not only his additions to the world’s record, but for his disciples the entire scaffolding by which they were erected.

Volume 176, No. 1, pp. 101–108

Moving Toward the Clonal Man May 1971 By James D. Watson

As the science of embryology made rapid strides, the geneticist and Nobel laureate James D. Watson—best known for his research on the structure of DNA—considered the potentially troubling implications of such research.

The notion that man might sometime soon be reproduced asexually upsets many people. The main public effect of the remarkable clonal frog produced some ten years ago in Oxford by the zoologist John Gurdon has not been awe of the elegant scientific implication of this frog’s existence, but fear that a similar experiment might someday be done with human cells. Until recently, however, this foreboding has seemed more like a science fiction scenario than a real problem which the human race has to live with …

If the matter proceeds in its current nondirected fashion, a human being born of clonal reproduction most likely will appear on the earth within the next twenty to fifty years, and even sooner, if some nation should actively promote the venture …

This is a matter far too important to be left solely in the hands of the scientific and medical communities. The belief that surrogate mothers and clonal babies are inevitable because science always moves forward … represents a form of laissez-faire nonsense … Just as the success of a corporate body in making money need not set the human condition ahead, neither does every scientific advance automatically make our lives more “meaningful.” No doubt the person whose experimental skill will eventually bring forth a clonal baby will be given wide notoriety. But the child who grows up knowing that the world wants another Picasso may view his creator in a different light.

Volume 227, No. 5, pp. 50–53

Living With a Computer July 1982 By James Fallows

Always a technophile, Atlantic contributor and editor James Fallows was one of the first writers to incorporate a personal computer into his life. A few years later, he explained for Atlantic readers how it worked and how it was subtly influencing the way he wrote.

The process works this way.

When I sit down to write a letter or start the first draft of an article, I simply type on the keyboard and the words appear on the screen. For six months, I found it awkward to compose first drafts on the computer. Now I can hardly do it any other way. It is faster to type this way than with a normal typewriter, because you don’t need to stop at the end of the line for a carriage return (the computer automatically “wraps” the words onto the next line when you reach the right-hand margin), and you never come to the end of the page, because the material on the screen keeps sliding up to make room for each new line. It is also more satisfying to the soul, because each maimed and misconceived passage can be made to vanish instantly, by the word or by the paragraph, leaving a pristine green field on which to make the next attempt …

None of this may sound impressive to those who have fleets of secretaries at their disposal, or to writers who can say precisely what they mean the first time through. Isaac Asimov recently complained in Popular Computing that his word-processor didn’t save him much time on revisions, since he composes at ninety words per minute and “95 per cent of what I write in the first draft stays in the second [and final] draft.” My first-draft survival ratio is closer to one percent, so for me the age of painless revisions is a marvel.

You won’t catch me saying that my machine has made me a better writer, but I don’t think it has made me any worse.
