Francis Crick and James Watson closed their epoch-making paper on the structure of DNA with a single deliciously diffident sentence. ("It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.")

And Alan Turing created a new world of science and technology, setting the stage for solving one of the most baffling puzzles remaining to science, the mind-body problem, with an even shorter declarative sentence in the middle of his 1936 paper on computable numbers:

It is possible to invent a single machine which can be used to compute any computable sequence.

Turing didn't just intuit that this remarkable feat was possible; he showed exactly how to make such a machine. With that demonstration the computer age was born. It is important to remember that there were entities called computers before Turing came up with his idea, but they were people, clerical workers with enough mathematical skill, patience, and pride in their work to generate reliable results of hours and hours of computation, day in and day out. Many of them were women.

Early "computers" at work. (NASA)

Thousands of them were employed in engineering and commerce, and in the armed forces and elsewhere, calculating tables for use in navigation, gunnery and other such technical endeavors. A good way of understanding Turing's revolutionary idea about computation is to put it in juxtaposition with Darwin's about evolution. The pre-Darwinian world was held together not by science but by tradition: All things in the universe, from the most exalted ("man") to the most humble (the ant, the pebble, the raindrop) were creations of a still more exalted thing, God, an omnipotent and omniscient intelligent creator -- who bore a striking resemblance to the second-most exalted thing. Call this the trickle-down theory of creation. Darwin replaced it with the bubble-up theory of creation. One of Darwin's nineteenth-century critics, Robert Beverly MacKenzie, put it vividly:

In the theory with which we have to deal, Absolute Ignorance is the artificer; so that we may enunciate as the fundamental principle of the whole system, that, in order to make a perfect and beautiful machine, it is not requisite to know how to make it. This proposition will be found, on careful examination, to express, in condensed form, the essential purport of the Theory, and to express in a few words all Mr. Darwin's meaning; who, by a strange inversion of reasoning, seems to think Absolute Ignorance fully qualified to take the place of Absolute Wisdom in all the achievements of creative skill.

It was, indeed, a strange inversion of reasoning. To this day many people cannot get their heads around the unsettling idea that a purposeless, mindless process can crank away through the eons, generating ever more subtle, efficient, and complex organisms without having the slightest whiff of understanding of what it is doing.


Turing's idea was a similar -- in fact remarkably similar -- strange inversion of reasoning. The Pre-Turing world was one in which computers were people, who had to understand mathematics in order to do their jobs. Turing realized that this was just not necessary: you could take the tasks they performed and squeeze out the last tiny smidgens of understanding, leaving nothing but brute, mechanical actions. In order to be a perfect and beautiful computing machine, it is not requisite to know what arithmetic is.

What Darwin and Turing had both discovered, in their different ways, was the existence of competence without comprehension. This inverted the deeply plausible assumption that comprehension is in fact the source of all advanced competence. Why, after all, do we insist on sending our children to school, and why do we frown on the old-fashioned methods of rote learning? We expect our children's growing competence to flow from their growing comprehension. The motto of modern education might be: "Comprehend in order to be competent." For us members of H. sapiens, this is almost always the right way to look at, and strive for, competence. I suspect that this much-loved principle of education is one of the primary motivators of skepticism about both evolution and its cousin in Turing's world, artificial intelligence. The very idea that mindless mechanicity can generate human-level -- or divine level! -- competence strikes many as philistine, repugnant, an insult to our minds, and the mind of God.


Consider how Turing went about his proof. He took human computers as his model. There they sat at their desks, doing one simple and highly reliable step after another, checking their work, writing down the intermediate results instead of relying on their memories, consulting their recipes as often as they needed, turning what at first might appear a daunting task into a routine they could almost do in their sleep. Turing systematically broke down the simple steps into even simpler steps, removing all vestiges of discernment or comprehension. Did a human computer have difficulty telling the number 99999999999 from the number 9999999999? Then break down the perceptual problem of recognizing the number into simpler problems, distributing easier, stupider acts of discrimination over multiple steps. He thus prepared an inventory of basic building blocks from which to construct the universal algorithm that could execute any other algorithm. He showed how that algorithm would enable a (human) computer to compute any function, and noted that:

The behavior of the computer at any moment is determined by the symbols which he is observing and his "state of mind" at that moment. We may suppose that there is a bound B to the number of symbols or squares which the computer can observe at one moment. If he wishes to observe more, he must use successive observations. ... The operation actually performed is determined ... by the state of mind of the computer and the observed symbols. In particular, they determine the state of mind of the computer after the operation is carried out.

He then noted, calmly:

We may now construct a machine to do the work of this computer.

Right there we see the reduction of all possible computation to a mindless process. We can start with the simple building blocks Turing had isolated, and construct layer upon layer of more sophisticated computation, restoring, gradually, the intelligence Turing had so deftly laundered out of the practices of human computers.
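The machine Turing describes can be sketched in a few lines of code. What follows is a hedged, modern illustration, not Turing's own formalism: a hypothetical rule table that adds 1 to a binary number, executed by a loop in which every step is a brute, mindless lookup -- read a symbol, consult the table, write, move, change state. Nothing in it knows what arithmetic is.

```python
def run(tape, rules, state="start", head=0, blank="_", max_steps=1000):
    """Execute mindless steps: read the symbol under the head, look up
    (state, symbol), write, move left or right, change state. Halt when
    no rule applies. No step requires any understanding of arithmetic."""
    cells = dict(enumerate(tape))
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in rules:
            break  # halt: the table has nothing to say
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# An illustrative rule table: increment a binary number by scanning to
# its right end, then propagating the carry back to the left.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # reached the end; go carry
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "done"),    # absorb the carry
    ("carry", "_"): ("1", "L", "done"),    # overflow: new leading 1
}

print(run("1011", rules))  # binary 11 + 1 = 12, i.e. "1100"
```

The point of the sketch is Turing's point: the loop and the table together are "a perfect and beautiful computing machine," and neither comprehends a thing.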

But what about the genius of Turing, and of later, lesser programmers, whose own intelligent comprehension was manifestly the source of the designs that can knit Turing's mindless building blocks into useful competences? Doesn't this dependence just re-introduce the trickle-down perspective on intelligence, with Turing in the God role? No less a thinker than Roger Penrose has expressed skepticism about the possibility that artificial intelligence could be the fruit of nothing but mindless algorithmic processes.

I am a strong believer in the power of natural selection. But I do not see how natural selection, in itself, can evolve algorithms which could have the kind of conscious judgements of the validity of other algorithms that we seem to have.

He goes on to admit:

To my way of thinking there is still something mysterious about evolution, with its apparent 'groping' towards some future purpose. Things at least seem to organize themselves somewhat better than they 'ought' to, just on the basis of blind-chance evolution and natural selection.

Indeed, a single cascade of natural selection events, occurring over even billions of years, would seem unlikely to be able to create a string of zeroes and ones that, once read by a digital computer, would be an "algorithm" for "conscious judgments." But as Turing fully realized, there was nothing to prevent the process of evolution from copying itself at many scales of mounting discernment and judgment. The recursive step that got the ball rolling -- designing a computer that could mimic any other computer -- could itself be reiterated, permitting specific computers to enhance their own powers by redesigning themselves, leaving their original designer far behind. Already in "Computing Machinery and Intelligence," his classic 1950 paper in Mind, he recognized that there was no contradiction in the concept of a (non-human) computer that could learn.

The idea of a learning machine may appear paradoxical to some readers. How can the rules of operation of the machine change? They should describe completely how the machine will react whatever its history might be, whatever changes it might undergo. The rules are thus quite time-invariant. This is quite true. The explanation of the paradox is that the rules which get changed in the learning process are of a rather less pretentious kind, claiming only an ephemeral validity. The reader may draw a parallel with the Constitution of the United States.

He saw clearly that all the versatility and self-modifiability of human thought -- learning and re-evaluation, and language and problem-solving, for instance -- could in principle be constructed out of these building blocks. Call this the bubble-up theory of mind, and contrast it with the various trickle-down theories of mind, by thinkers from René Descartes to John Searle (and including, notoriously, Kurt Gödel, whose proof was the inspiration for Turing's work) that start with human consciousness at its most reflective, and then are unable to unite such magical powers with the mere mechanisms of human bodies and brains.
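Turing's resolution of the "paradox" of a learning machine can be made concrete in a toy sketch (entirely hypothetical, not drawn from his paper): the machine's fixed rules are an unchanging interpreter loop, while the rules that change in learning are "of a rather less pretentious kind" -- entries in a mutable table that the fixed rules consult and update.

```python
# The ephemeral, changeable rules: a mutable lookup table.
table = {}

def respond(prompt):
    """Fixed rule: look up the prompt, or admit ignorance."""
    return table.get(prompt, "?")

def learn(prompt, correct_reply):
    """Fixed rule: overwrite the table entry with the feedback given."""
    table[prompt] = correct_reply

print(respond("2+2"))  # "?" -- not yet learned
learn("2+2", "4")
print(respond("2+2"))  # "4" -- the behavior changed; the interpreter did not
```

The two functions are time-invariant, exactly as Turing's objector demands; only the table, claiming "an ephemeral validity," evolves.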

Turing, like Darwin, broke down the mystery of intelligence (or Intelligent Design) into what we might call atomic steps of dumb happenstance, which, when accumulated by the millions, added up to a sort of pseudo-intelligence. The Central Processing Unit of a computer doesn't really know what arithmetic is, or understand what addition is, but it "understands" the "command" to add two numbers and put their sum in a register -- in the minimal sense that it reliably adds when called upon to add and puts the sum in the right place. Let's say it sorta understands addition. A few levels higher, the operating system doesn't really understand that it is checking for errors of transmission and fixing them, but it sorta understands this, and reliably does this work when called upon. A few further levels higher, when the building blocks are stacked up by the billions and trillions, the chess-playing program doesn't really understand that its queen is in jeopardy, but it sorta understands this, and IBM's Watson on Jeopardy sorta understands the questions it answers.
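The stacking of "sorta" levels can be sketched in code. The following is an illustrative toy, not how any real CPU is built: every operation bottoms out in a single mindless NAND gate, yet a few layers up the stack sits a circuit that reliably adds -- which is all its "understanding" of addition amounts to.

```python
def nand(a, b):
    """The bottom level: one brute, uncomprehending gate."""
    return 1 - (a & b)

# One level up: other gates, built from nothing but NAND.
def xor(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

# Another level up: a full adder, built from those gates.
def full_adder(a, b, carry):
    s1 = xor(a, b)
    return xor(s1, carry), or_(and_(a, b), and_(s1, carry))  # (sum, carry)

# Top level: an n-bit ripple-carry adder. It "understands" addition only
# in the minimal sense that it reliably produces correct sums on demand.
def add(x, y, bits=8):
    carry, total = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        total |= s << i
    return total

print(add(99, 28))  # 127
```

Nowhere in the stack is there a whiff of comprehension of what arithmetic is; competence accumulates level by level all the same.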

Why indulge in this "sorta" talk? Because when we analyze -- or synthesize -- this stack of ever more competent levels, we need to keep track of two facts about each level: what it is and what it does. What it is can be described in terms of the structural organization of the parts from which it is made -- so long as we can assume that the parts function as they are supposed to function. What it does is some (cognitive) function that it (sorta) performs -- well enough so that at the next level up, we can make the assumption that we have in our inventory a smarter building block that performs just that function -- sorta, good enough to use.

This is the key to breaking the back of the mind-bogglingly complex question of how a mind could ever be composed of material mechanisms. What we might call the sorta operator is, in cognitive science, the parallel of Darwin's gradualism in evolutionary processes. Before there were bacteria there were sorta bacteria, and before there were mammals there were sorta mammals and before there were dogs there were sorta dogs, and so forth. We need Darwin's gradualism to explain the huge difference between an ape and an apple, and we need Turing's gradualism to explain the huge difference between a humanoid robot and a hand calculator.

The ape and the apple are made of the same basic ingredients, differently structured and exploited in a many-level cascade of different functional competences. There is no principled dividing line between a sorta ape and an ape. The humanoid robot and the hand calculator are both made of the same basic, unthinking, unfeeling Turing-bricks, but as we compose them into larger, more competent structures, which then become the elements of still more competent structures at higher levels, we eventually arrive at parts so (sorta) intelligent that they can be assembled into competences that deserve to be called comprehending. We use the intentional stance to keep track of the beliefs and desires (or "beliefs" and "desires" or sorta beliefs and sorta desires) of the (sorta-)rational agents at every level from the simplest bacterium through all the discriminating, signaling, comparing, remembering circuits that compose the brains of animals from starfish to astronomers.

There is no principled line above which true comprehension is to be found -- even in our own case. The small child sorta understands her own sentence "Daddy is a doctor," and I sorta understand "E = mc²." Some philosophers resist this anti-essentialism: either you believe that snow is white or you don't; either you are conscious or you aren't; nothing counts as an approximation of any mental phenomenon -- it's all or nothing. And to such thinkers, the powers of minds are insoluble mysteries because they are "perfect," and perfectly unlike anything to be found in mere material mechanisms.

We still haven't arrived at "real" understanding in robots, but we are getting closer. That, at least, is the conviction of those of us inspired by Turing's insight. The trickle-down theorists are sure in their bones that no amount of further building will ever get us to the real thing. They think that a Cartesian res cogitans, a thinking thing, cannot be constructed out of Turing's building blocks. And creationists are similarly sure in their bones that no amount of Darwinian shuffling and copying and selecting could ever arrive at (real) living things. They are wrong, but one can appreciate the discomfort that motivates their conviction.

Turing's strange inversion of reasoning, like Darwin's, goes against the grain of millennia of earlier thought. If the history of resistance to Darwinian thinking is a good measure, we can expect that long into the future, long after every triumph of human thought has been matched or surpassed by "mere machines," there will still be thinkers who insist that the human mind works in mysterious ways that no science can comprehend.

About the Author

Daniel C. Dennett is a professor of philosophy and co-director of the Center for Cognitive Studies at Tufts University. He is the author of many books including Breaking the Spell, Freedom Evolves, and Darwin's Dangerous Idea.
