During the unexpected and glorious ascendancy of the New England Patriots to pre-eminence in the football world last season, the press was full of the word "genius." As a sports reporter for The Boston Globe noted, a computer search had matched "genius" with the Patriots' coach, Bill Belichick, "not once, but more than 200 times." And when the Pats actually defeated the heavily favored St. Louis Rams in the Super Bowl, the New York Times headline for the Boston edition, in sixty-point type, read simply "DEFENSIVE GENIUS." The text of the article made it clear that this figure of speech had taken hold. Coach Belichick was likened to a "neighborhood tough guy in a dark alley" who "also has the I.Q. of a nuclear physicist."

Belichick was hardly alone in this peculiar species of gridiron celebrity. The Rams' coach, Mike Martz, was touted as an "offensive genius," the Washington Redskins' newly designated coach, Steve Spurrier, as a "genius strategist," and so on through a parade of sports pages, until even the TV commentators began to call for an end to the hyperbole. And as Belichick's father, a ripe eighty-three years old (and himself a former coach), took time out to observe, "genius" seemed an odd appellation for "somebody who walks up and down a football field."

We live in a time in which all terms and traits are inflated, and even the standard size at Starbucks is a tall. But "genius" appears marked for special inflation, so much so that the term "overrated genius" has begun to seem like a tautology rather than a cautious qualification. What is a genius, anyway? And why does our culture have such an obsession with the word and with the idea?

Genius is fundamentally an eighteenth-century concept, though it has had a good long run through the centuries since. The genius was, and to some extent continues to be, the Romantic hero, the loner, the eccentric, the apotheosis of the individual. The further our society gets from individual agency—the less the individual seems to have real power to change things—the more we idealize the genius, who is by this definition the opposite of the committee or the collaborative enterprise. Indeed, some of the resistance to the idea that Shakespeare wrote his plays in collaboration with other playwrights and even actors in his company comes from our residual, occasionally desperate, need to retain this ideal notion of the individual genius. We prefer the myth: It was Watson and Crick who discovered the structure of DNA—not a whole laboratory of investigators. Edison invented the electric light bulb and the phonograph—never mind that he worked with an extensive team of technicians, mechanics, and scientists.

The pursuit of genius is the pursuit of an illusion. As illusions go, it's among mankind's happier ones—the idea that an individual might have an exceptional and intrinsic talent for art, music, science, mathematics, or something else beneficial to civilization and culture. There's no doubt that such individuals have lived among us throughout history, and have bequeathed to us the legacy of their art and their ideas—but do they constitute an actual class called geniuses? And if so, how can we tell the real ones from the wannabes, the genuine articles from the poseurs?

Over the years we have increasingly tried to analyze, codify, and even quantify "genius," the post-Enlightenment equivalent of sainthood. This wistful quest has itself become a kind of secular religion. Saint Einstein, Saint Newton, Saint Darwin, and Saint Edison have replaced the healers and martyrs of the past, their "miracles" the discoveries of modern science and modern research: relativity, gravity, evolution, electricity.

Artists, musicians, and writers, too, look to the genius standard to separate the "timeless" from the time-bound or the merely timely. In the British novelist Ian McEwan's Booker Prize-winning novel, Amsterdam, a self-absorbed and slightly over-the-hill composer, struggling to write a commissioned symphony, reflects on his musical gifts.

Clive stood from the piano ... and had once more a passing thought, the minuscule fragment of a suspicion that he would not have shared with a single person in the world, would not even have committed to his journal, and whose key word he shaped in his mind only with reluctance; the thought was, quite simply, that it might not be going too far to say that he was ... a genius. A genius. Though he sounded it guiltily on his inner ear, he would not let the word reach his lips. He was not a vain man. A genius. It was a term that had suffered from inflationary overuse, but surely there was a certain level of achievement, a gold standard, that was nonnegotiable, beyond mere opinion.

As it turns out, Clive is mistaken. His Millennial Symphony, desperately written in high hopes of praise, is a "drone" of sound even to his own ears; later assessments will call it a "dud," and note that a melody at the end is a "shameless copy of Beethoven's Ode to Joy." A worldly conductor, praising Clive's past achievements in tacit contrast to his present ones, remarks with unconscious cruelty, "The inventiveness of youth, so hard to recapture, eh, maestro?" The real barb here is that honorific "maestro": master. Clive has become a public figure and a society success story, but he is further away from genius than he was as a young man.

At this point in history genius has become a commodity, an ambition, and even a lifestyle. Biographers, scholars, critics, and fans spend untold hours trying to nail down a concept that can't be nailed down, to identify a proof or a marker the way scientists identify genes. At the same time, they seek to "humanize" or "personalize" the story of cultural and intellectual achievement, to make the genius lovable, accessible, and ready for prime time.

Consider an emblematic pair of titles: Shakespeare in Love and Einstein in Love. The first is the film hit of 1998, with a witty screenplay co-authored by Tom Stoppard; the second is Dennis Overbye's 2000 biography of the young Albert Einstein. Both portray universally accepted, card-carrying geniuses in moments of offstage intimacy. The movie hypothesizes (cleverly, although entirely at variance with the facts) that Shakespeare's gift for writing brilliant plays was jump-started when the playwright fell in love with a beautiful, elusive, and ultimately unattainable female aristocrat. Before, he was a hack; afterward, he was a genius. Overbye's book, more grounded in reality, traces the early life of the winning and self-sufficient Albert as he romances women and ideas. The timing of Einstein's two major scientific feats, the equation E = mc² and the general theory of relativity, coincided, respectively, with his marrying Mileva Maric, a Serbian physics student, and his subsequent struggle to divorce Mileva to marry his cousin Elsa.

Both these accounts are to a certain extent charmed by, and charming about, their famous subjects. Each offers a "human" genius, one with quirks, flaws, and feelings. And signally in each case, the tag "in love" alludes, finally, not to the human object of passion but to art or science—to the theater or to the universe.

The later Einstein became a cultural icon, personifying genius in look and name. With his unruly shock of white hair, ambling gait, warm ("absentminded") smile, and penchant for going sockless, he was a celebrity easy to love, at least from a distance. The Einstein legend, fully established in the physicist's lifetime, has persisted long after his death. Walter Matthau (in a fright wig) played him as a warmhearted matchmaker, in a 1994 film called I.Q.; a photograph of Einstein, his tongue sticking out, adorns a popular T-shirt; and the famous head is the manifest model for "Chia Prof" in the Chia Head pottery-planter series. On many computers a shaggy-haired Einstein "office assistant" can be found ready to explain the mysteries of word processing.

Movies about geniuses, if not about genius, are reliably popular, from Amadeus to Good Will Hunting. Books such as James Gleick's Genius: The Life and Science of Richard Feynman and Sylvia Nasar's A Beautiful Mind become top sellers and beget stage or cinema versions. In those books genius is intertwined with flamboyant eccentricity and iconoclasm. In Peter Parnell's play QED, Feynman, one of the founders of quantum electrodynamics, acts in plays, performs on bongo drums, and enjoys nude life-drawing and the company of many women, in the intervals between his work on the atom bomb and his famous diagnosis of what went wrong with the space shuttle Challenger. And the mathematician John Forbes Nash, a complicated personality described in Nasar's biography as having had a child out of wedlock and being attracted to men as well as women, was airbrushed in Ron Howard's film A Beautiful Mind into a heroic one-woman man who triumphed over his schizophrenia—a typical film example of the "tortured genius." In real life both men won Nobel Prizes; indeed, the transformation of Nobel medal into Oscar statuette seems like a new kind of theatrical alchemy.

Meanwhile, the concept of genius seems broader than ever and ubiquitous. A special end-of-the-millennium issue of Esquire (November 1999) was devoted to "genius" and featured articles on the hidden quirks of famous brains from Benjamin Franklin to the physicist J. Robert Oppenheimer, a "(This Is Not a) Genius Test," and a list of twenty-one inventors, creators, and thinkers ("profiles in brilliance") for the rapidly approaching twenty-first century. "We are lucky," Esquire declared, "to be living in an age of genius." And who are these geniuses? A computer scientist (Bill Joy), a designer (Tom Ford), an actor (Leonardo DiCaprio), a basketball star (Allen Iverson), a singer (Audra McDonald), a foreign-policy expert (Fareed Zakaria), a chef (Thomas Keller), a novelist (Richard Powers), an entrepreneur (Amazon.com's Jeff Bezos), an artist (Julie Taymor). Their wealth of accomplishment is undeniable. But do they really deserve to be called geniuses?

At a bookstall in the Amsterdam airport I recently came upon a volume titled Think Like a Genius: The Ultimate User's Manual for Your Brain. Written by Todd Siler, a self-described visual artist, technology specialist, and founder of a company that specializes in innovative learning materials "for fostering integrative thinking in education, business, and the family," Think Like a Genius is made up of the short, punchy sentences that typify self-help books. This passage is from a section called "You Don't Have to Be a Genius to Think Like One":

We have been taught that a genius is someone who knows how to think deeply and with originality, an advanced thinker with an expansive mind, such as Plato, Aristotle, or Leonardo da Vinci. We have not been taught that, alongside our most celebrated geniuses, there are legions of everyday geniuses. They're not people who are mental giants. Nor are they intellectual heroes. Their theories and inventions don't change cultures or civilizations. But they have all experienced flights of exceptional thinking, often in some highly practical way. Such is the genius behind the invention of paper, Velcro, staples, nails, steel, glass, cement, currency, and other remarkably "simple" but useful things.

What is especially striking about this book—which carries a blurb from the CEO (not yet mayor) Michael Bloomberg and is studded with cartoons, line drawings, four-step exercises, and such advice as "Explore your life from a genius's perspective," and "Customize some aspects of genius to fit your way of thinking"—is its suggestion that "genius" has become an achievable goal, one of spiritual and commercial value. Resistance to exclusivity and privilege is coupled with a can-do spirit. This, in effect, is a manifesto for genius without the geniuses.

What Is "Genius"?

The word "genius" derives from the same root as "gene" and "genetic," and meant originally, in Latin, a tutelary god or spirit given to every person at birth. One's genius governed one's fortune and determined one's character, and ultimately conducted one out of this world into the afterlife. The thinkers of antiquity suggested that every person had two geniuses, one good and one evil, which competed for influence. This concept was alive and well in Shakespeare's day, and survives in the expression "his better genius." The word "genius" soon came to mean a demon or spirit in general, as in the fairy-tale "genie" or "jinni." Genius thus conceived was part of a system that would later be called psychology, because it was thought of as residing somehow both inside and outside the individual, and motivating behavior. Through the Renaissance and well into the eighteenth century the most familiar meaning of "genius" in English was something like "temperament" or "disposition": people were described as having a "daring genius" or an "indolent genius."

Joseph Addison's essay "On Genius," published in The Spectator in 1711, laid out the terrain of genius as we use the term today, to denote exceptional talent or someone who possesses it. According to Addison, there were two kinds of genius—natural and learned (the greatest of geniuses were the natural ones, whose inborn gifts freed them from dependence on models or imitation). Homer, Pindar, and Shakespeare were his examples of the first category, Aristotle, Virgil, Milton, and Francis Bacon of the second. In general terms this dichotomy—brilliant versus industrious—still underlies our notions of genius today, but despite Thomas Edison's oft-quoted adage "Genius is one percent inspiration and ninety-nine percent perspiration," it's the inspiration that we dote on.

In Addison's day, and for long afterward, the idea of the scientific genius was for most people a contradiction in terms. Science was more perspiration than inspiration; genius was precisely what could never be quantified. As Immanuel Kant contended in his Critique of Judgment, genius is a talent for art, not for science, since "it is quite ridiculous for a man to speak and decide like a genius in things which require the most careful investigation by reason." The spontaneous generation of ideas that apparently characterizes genius seemed fundamentally at odds with the painstaking labor and analysis that characterize research. "We can readily learn all that Newton has set forth in his immortal work on the Principles of Natural Philosophy," Kant declared, "however great a head was required to discover it, but we cannot learn to write spirited poetry." But in England, in the novel Tom Jones, Henry Fielding had already poked fun at the notion of unschooled genius.

As several gentlemen in these times, by the wonderful force of genius only, without the least assistance of learning, perhaps, without being well able to read, have made a considerable figure in the republic of letters; the modern critics, I am told, have lately begun to assert, that all kind of learning is entirely useless to a writer; and, indeed, no other than a kind of fetters on the natural spriteliness and activity of the imagination.

It was with the Romantic period that the true cult of the natural genius emerged. At the beginning of the nineteenth century poets and critics such as Coleridge and Shelley, perhaps self-interestedly, staked out genius as the territory of the poet. Shakespeare—or at least the Romantic period's fantasy of Shakespeare—was the quintessential genius, his brilliance and his inventiveness the results of attributes and resources that were innate rather than learned. "Sweetest Shakespeare, fancy's child, warbl[ing] his native woodnotes wild" Milton had called him, and this portrait of untutored genius had an enormous effect on critics in the centuries that followed. The less Shakespeare had been taught, the more genius he had—so ran the thinking then, and so, to a certain extent, it continues today. Thus some scholars exulted over Shakespeare's supposed lack of formal education: he attended only a rural grammar school, not a university; he had, as his admiring rival Ben Jonson indelibly phrased it, "small Latin and less Greek"—though in point of fact the curriculum of the Stratford grammar school, rich in history and mythology, offered a solid training in the classics.

The Romantics differed among themselves in their estimation of Shakespeare's style and his degree of learning, trying in various ways to explain the causes of the poet's supposed barbarisms, separating what they judged his timeless genius from the unfortunate fact that he lived and wrote in a cruder age. William Hazlitt, an influential critic and essayist, catalogued the playwright's imperfections, including his irritating fondness for puns and wordplay and his unaccountable willingness to alter chronology and geography to suit his dramatic purposes. But these flaws did not detract, in Hazlitt's view, from what he deemed Shakespeare's unique genius: "His barbarisms were those of his age. His genius was his own."

Coleridge, on the other hand, argued in a lecture forthrightly titled "Shakespeare's Judgment Equal to His Genius" that the plays were more than "works of rude uncultivated genius." Their form, he thought, was "equally admirable with the matter; the judgment of the great poet not less deserving of our wonder than his genius." Here the word "wonder" underscores Shakespeare's quasi-miraculous achievement. Judgment—in his case, unlike so many others—is not an element that contradicts genius but, rather, a virtue, equally unfathomable, that serves as its complement.

The cult of genius inherited from these Romantic writers, one that still has enormous force today, tells us that ordinary mortals can achieve many things by dint of hard work, but the natural and effortless gifts of a true genius (like Shakespeare) will forever elude the diligent overachiever. By this logic genius, and geniuses, cannot be made, only born.

Genius and Aberration

The Romantics found genius not only in the supposedly wild and uncultivated Shakespeare but also in the poets and personalities of their own period—in the extravagant Byron and the intense and charismatic Shelley. The perhaps inevitable next step was for an artist to announce his own genius, as did that great self-promoter and fearless breaker of cultural taboos, Oscar Wilde. Flush with celebrity as London's leading "aesthete," Wilde arrived in New York in January of 1882 for a triumphal tour of the United States; he is said to have announced to Customs officials, "I have nothing to declare except my genius." Thus genius became not merely a synonym for exalted intellectual power but a performed role. With his ostentatious clothing, his green carnations, his witty epigrams, and his flair for publicity, Wilde became the avatar of self-proclaimed genius.

Where Wilde dared to tread, others followed. Gertrude Stein, with a personality as obtrusive in its way as Wilde's, would claim the mantle of genius for the so-called "lost generation" of Americans who had taken up residence in Europe during the early years of the twentieth century. As Stein wrote in her autobiography, ventriloquized through the voice of her companion and secretary, Alice B. Toklas, "The geniuses came and talked to Gertrude Stein and the wives sat with me. How they unroll, an endless vista through the years ... geniuses, near geniuses and might be geniuses, all having wives, and I have sat and talked with them, all the wives and later on, well later on too, I have sat and talked with all." On one occasion Stein proclaimed that the Jews had produced "only three originative geniuses—Christ, Spinoza, and myself," a remark recorded in Robert McAlmon's memoir, titled Being Geniuses Together.

Over the years there have been pockets of resistance to throwing the term around lightly. "Among scientists," James Gleick writes in his biography of Feynman, "it became a kind of style violation, a faux pas suggesting greenhorn credulity, to use the word genius about a living colleague." But in her biography of Nash, Sylvia Nasar uses it relentlessly and unselfconsciously, on page after page, to refer to a whole bevy of mathematicians. Her book includes references to "Nash the mathematical genius," to "the two geniuses" Nash and his fellow scientist John von Neumann, to Norbert Wiener as "a genius who was at once adulated and isolated," and to the "group of geniuses" in mathematics and theoretical physics who came to the United States from Europe in the years before and during World War II. Not only does Nasar revel in using the word but she delights in summoning up the attributes and symptoms of genius. She writes that "[Nash's] arrogance was seen as evidence of his genius," that "a profound dislike for merely absorbing knowledge and a strong compulsion to learn by doing is one of the most reliable signs of genius," and that "Nash picked up the mannerisms of other eccentric geniuses" at MIT, appropriating as his own Wiener's gesture of running his finger along grooves in the tiled walls of the corridors. She also cites D. J. Newman's condemnation of music after Beethoven, Norman Levinson's dislike of psychiatrists, and Warren Ambrose's impatience with conventional social greetings.

For Nasar and other true believers, the bona fide genius solves problems whole with spontaneous outbursts of inspiration, rather than working them through step by step, or equation by equation, like the rest of us. Nasar often describes Nash as brooding intensely over a topic before finally approaching it from an unexpected angle, startling his colleagues. She writes that he seldom read the works of other mathematicians or engaged in any preparatory research, preferring to come at problems afresh.

The eccentric genius is especially familiar to readers of detective fiction, in which such titans as Sherlock Holmes and Nero Wolfe best their plodding competitors by sheer force of mind and gift for idiosyncrasy. Holmes, with his pipe, his violin, his cocaine habit, his melancholy, his diverse and erudite publications (on topics from motets to shag tobacco, from the ancient Cornish language to bee culture), his avoidance of women, and his disdain for ordinary police work, is a classic embodiment of the genius, as is Wolfe, a portly polymath, didact, gourmand, beer drinker, and orchid fancier, who also avoids women and the ordinary investigative methods of the police. "I have no talents," Wolfe declares with customary insouciance. "I have genius or nothing."

Eccentricity has become such a strong identifying mark of genius that the very notion of a non-eccentric genius seems like a contradiction in terms. Thus in the genius sweepstakes we are drawn to Eugene O'Neill over Arthur Miller, Emily Dickinson over Felicia Hemans, Edgar Allan Poe over Henry Wadsworth Longfellow, and, of course, Mozart over Salieri. General Leslie Groves was delighted to find that his top scientist on the Manhattan Project, Robert Oppenheimer, was fluent in Sanskrit. "To physicists, Oppenheimer's command of Sanskrit seemed a curiosity," James Gleick writes in his biography of Feynman. "To General Groves, it was another sign of genius." Esquire's "genius" issue included a list of "eccentricities" gleaned from pop biographies of "seven bona fide geniuses," in which readers learned, for example, that Orson Welles supposedly wanted to play the lead in the rock opera Tommy, that Miles Davis wore his underpants backwards, and that Oppenheimer, after he lost his security clearance, drank lead paint to "make [himself] stupider." Sometimes the element of eccentricity seems to crowd out everything else, including the subject's actual achievements—threatening to make genius almost entirely a theatrical role.

The growing perception in the nineteenth century of an inextricable link between genius and eccentricity led some to speculate about a possible connection to a darker form of anomaly: insanity. The English eugenicist Francis Galton expressed deep concern about such pathological undercurrents in his 1869 book Hereditary Genius.

If genius means a sense of inspiration, or of rushes of ideas from apparently supernatural sources, or of an inordinate and burning desire to accomplish any particular end, it is perilously near to the voices heard by the insane, to their delirious tendencies, or to their monomanias. It cannot in such cases be a healthy faculty, nor can it be desirable to perpetuate it by inheritance.

The British physician Havelock Ellis also took the insanity question seriously. In his A Study of British Genius (1904) he classified geniuses in a number of different ways: those who were insane during a considerable portion of their lives (John Clare, William Collins, William Cowper, Christopher Smart); those who were either briefly insane or died young, sometimes by suicide (George L. Fox, Charles Lamb, Dante Gabriel Rossetti); those who became insane only at the ends of their long lives, suffering from senile dementia (Robert Southey, Jonathan Swift); and those who exhibited "marked eccentricity not amounting to insanity" (James Boswell, Laurence Oliphant). Ellis paused over the case of William Blake, noting a contemporary doctor's view that if the story of Blake's sitting naked in his summer house with his wife was to be believed, "he was certainly insane." In the end, however, Ellis's findings seemed to refute the notion of an ineluctable tie between genius and insanity. According to his tabulations, only 4.2 percent of his "men of genius" could be counted as insane by any of his criteria. "We must put out of court any theory as to genius being a form of insanity," he concluded.

While the likes of Galton and Ellis were investigating the relationship between genius and insanity, others were looking closely at geniuses for different forms of aberration. The Italian criminologist Cesare Lombroso, best known for alleging (in The Criminal Man, 1876) that certain physical types were "born criminals," followed up this study of human transgression with a parallel study of human achievement, The Man of Genius (1888), in which he asserted that genius was related both to moral degeneracy (manifested variously as apathy, impulsiveness, sexual excess, morbid vanity, excessive mutism or verbosity, the tendency to put mystical interpretations on the simplest facts) and to certain physical characteristics (prominent ears, deficiency of beard, shortness of stature, left-handedness, pallor, stammering, sterility).

Sixty years later the British physician W. R. Bett devoted an entire volume to what he called "the infirmities of genius," with geniuses and their afflictions neatly paired: "Percy Bysshe Shelley: Neurosis and Genius"; "Algernon Charles Swinburne: Epilepsy and Genius"; "Honoré de Balzac: High Blood Pressure and Genius"; "Charles Baudelaire: Syphilis, Drugs, and Genius"; "Robert Burns: Rheumatic Fever and Genius"; "Lord Byron: Lameness and Genius," and so on. Lafcadio Hearn was called "The Disfigured Genius who Worshipped Beauty." In his preface Bett announced his project in an uncompromising way: "This book is almost exclusively concerned with abnormal people—with the psychopathology of genius."

Quantifying Genius

With the invention of the intelligence quotient, or IQ, came the idea that genius could be quantified. Not surprisingly, this undermined the traditional Romantic vision of the genius as a different kind of being; it was the end of genius's aura.

It's not surprising either that IQ was the invention of an American—Lewis Terman, a psychology professor at Stanford who thought up this device for the scientific assessment of mental capacity at the beginning of the twentieth century. The French psychologist Alfred Binet had in 1905 developed a test for measuring the ability to think and reason, apart from education in any field. He gave the test to Paris schoolchildren, and arrived at the idea of a "mental age," which was based on the percentage of people who could pass a particular test geared for that age. Terman, adapting Binet's method, divided a test taker's mental age by his chronological age and multiplied the result by one hundred to arrive at the intelligence quotient. The idea was that the population might be sorted by intelligence and funneled into appropriate levels of schooling, suitable jobs, and so forth—all this decades before Nazi science made such categorizations invidious. Inductees into the U.S. Army during World War I were routinely given IQ tests. Britain's "11-plus exams," adopted in 1944, were an attempt to track students by intelligence and merit, and thus to contravene the old class system.
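The ratio formula itself is trivial to state; the sketch below (function name my own, purely illustrative) assumes Terman's original mental-age quotient, not the deviation scoring used by modern tests.

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    """Terman's original ratio IQ: mental age divided by
    chronological age, multiplied by one hundred."""
    return (mental_age / chronological_age) * 100

# A ten-year-old who performs at the level of a typical twelve-year-old:
print(ratio_iq(12, 10))  # 120.0

# A "mental age" about twice one's actual age yields a quotient near 200:
print(ratio_iq(16, 8))   # 200.0
```

The same quotient underlies Terman's later estimate that the young Francis Galton, with a mental age roughly double his actual age, must have had an IQ "not far from 200."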

In 1921 Terman and his research team, through statewide testing in California, identified a group of 1,528 high-testing youngsters, whom newspapers called the "1,000 Gifted Children" or the "little geniuses" or, inevitably, the "Termites." Studied over the next eight decades, this group revealed a median IQ of 147, with some scores above 190. It included descendants of Benjamin Franklin, John and John Quincy Adams, Henry Wadsworth Longfellow, P. T. Barnum, Harriet Beecher Stowe, and Mark Twain. It also included Terman's two children. The results of the study were in some ways underwhelming. "There wasn't a Nobel laureate," Terman's successor, Albert H. Hastorf, a former Stanford provost, reported. "There wasn't a Pulitzer Prize. We didn't have a Picasso. It's my guess that Terman was a little bit disappointed." Many in the group went on to become doctors and lawyers, but the label "genius" eluded them as adults. Perhaps the most famous members were the Hollywood director Edward Dmytryk; the creator of I Love Lucy, Jess Oppenheimer; and the physiologist Ancel Keys, inventor of the portable meal called the K-ration, who was the only one of the "little geniuses" to wind up on the cover of Time or Life.

Nonetheless, the notion that there was such a thing as an IQ—and, indeed, such a thing as a "genius-level IQ"—had taken hold in the popular imagination. Thus the gossip columnist who writes a weekly feature called "Walter Scott's Personality Parade" for Parade Magazine referred to George H.W. Bush's chief of staff John Sununu as "a brilliant academician with a genius-level IQ." But, as The Washington Post noted, Sununu's IQ seemed to be constantly on the increase: it was reported as 170 when he was governor of New Hampshire and as "a genius IQ of 176" at the time of the Bush inaugural; William Safire described Sununu as a "quasi-genius, reportedly with an intelligence quotient of 180." The 180 was subsequently invoked by almost every commentator in the press. If an abrasive and strong-willed politico like Sununu, whose IQ score was said to have been based on his responses to a quiz published in Omni magazine, was a new type of genius, what did that say about the transcendent cachet that seemed formerly to have attached to that troublesome, enigmatic, and increasingly elusive term? The idea that genius could be quantified and placed on a continuum with ordinary intelligence—that the genius was just like everyone else, only smarter—was at odds with the Romantic notion of the genius as fundamentally different. Yet both ideas proved irresistible.

Once the IQ test had been developed to measure intelligence in the living, it was almost inevitable that someone should wish to test the dead. Social scientists, working under the direction of Lewis Terman, set out to ascertain the IQs of history's most famous geniuses. Terman himself, confessing his lifelong interest in "the childhood of genius," focused on Francis Galton, the author of Hereditary Genius, saying that "any psychologist who is familiar with the age norms of mental development" would recognize various details from Galton's biography as "convincing proof that Galton as a child had an intelligence quotient not far from 200; in other words, that his 'mental age' was about twice his actual age."

Much of the excitement about IQ was linked to the growing popularity of the idea of "meritocracy"; instead of a hereditary aristocracy of the titled and the entitled, there would now be a new, more deserving upper class of eminent or soon to be eminent achievers. But the notion that a test—any test—could objectively measure merit and intelligence did not go unchallenged. In the second half of the twentieth century numerous attacks were made on the IQ system, which was regarded by some as inadvertently prejudicial. The anti-IQ forces were exemplified by J. L. Dillard, who criticized "an intelligence-testing procedure which is completely invalid because of its cultural and linguistic bias." Yet the allure of quantifying genius remained as strong as that of genius itself.

Banking on Genius

Since Lewis Terman's rigorous IQ testing had somehow failed to predict Nobel laureates, Pulitzer Prize winners, and Picassos, the next step for a culture enraptured by the idea of genius was to find a means of identifying people of such potential and then helping them to use their gifts.

Since 1981 the John D. and Catherine T. MacArthur Foundation has given out what the press insists on labeling "genius grants" to artists, performers, architects, scientists, and scholars of all persuasions. The twenty-four recipients for 2002, for instance, include a trombonist, a physicist and Internet publisher, a computational linguist, a documentary filmmaker, a glass-bead artist, a children's book author, a paleo-ethnobotanist who studies fossilized plant remains, a seismologist and disaster-prevention specialist, and a roboticist. MacArthur fellows are chosen for their "individual leadership, initiative, and creativity," according to the foundation's president, Jonathan F. Fanton. The foundation, which emphasizes "the importance of the creative individual in society," scrupulously avoids the G word. Where genius actually enters the picture is more in the intentions and fantasies of the program's founder, J. Roderick MacArthur—the son of John D. MacArthur, a financial wizard who parlayed an insurance firm into an empire that included Florida real estate, New York office buildings, and pulp-and-paper factories.

"The idea behind this," Roderick MacArthur explained when the fellows program was begun, "is that Albert Einstein could not have written a grant application saying he was going to discover the theory of relativity. He needed to be free." Recipients have not tendered applications; nor do they submit progress reports after they receive their grants. "There was no management association looking at Michelangelo and asking him to fill out semi-yearly progress reports in triplicate," MacArthur said. "Our aim is to support individual genius and to free those people from the bureaucratic pettiness of academe."

Leaving aside the patronage battles and myriad stresses and strains that did hamper Michelangelo, Einstein, and other geniuses of the past (not to mention the vexatious question of whether the obstacles they encountered perversely contributed to their achievements), what is most striking here is the persistence of the Romantic assumption that geniuses "need to be free." Or perhaps, if we hark back to the notion of genius as an attribute rather than a person, that freedom encourages people to develop their genius.

Another recent effort to foster genius is the Repository for Germinal Choice—the "genius" sperm bank founded in Escondido, California, by the eyeglass millionaire Robert Klark Graham. For twenty years the sperm bank, also described as a "genius baby farm," produced hundreds of designer babies sired by men of proven high intelligence; at least three of the initial donors were Nobel Prize winners. "There is nothing wrong with trying to improve the human race," said Graham, who died in 1997. "Think of all the gains we could have from dozens of children fathered by a Thomas Edison or Albert Einstein."

Our Genius Complex

It is ironic that scientific objectivism about genius is mingled with a strong remnant of what looks like religious faith. The fact is that we cannot bring ourselves to renounce the dream of the superhuman, whether that superhumanity comes in an explicitly religious form or in the post-Enlightenment guise of artistic or scientific genius. Who can forget the story of Einstein's brain—weighed, preserved, for some time inexplicably lost, and finally the object of a scientific custody battle? What was the anomaly that made for his genius? Was the brain of exceptional size? Did it have other signifying traits? As the critic Roland Barthes put it, "Einstein's brain is a mythical object." Old photographs show Einstein cheerfully submitting to testing, his head wreathed with electrical wires, his brain waves mechanically recorded while he was instructed to think of relativity. "Paradoxically," Barthes wrote, "the more the genius of the man was materialized under the guise of his brain, the more the product of his inventiveness came to acquire a magical dimension, and gave a new incarnation to the old esoteric image of a science entirely contained in a few letters." For Barthes, the essence of Einstein's genius was captured in the contrast between photographs of the scientist "standing next to a blackboard covered with mathematical signs of obvious complexity" and cartoons that showed him "chalk still in hand, and having just written on an empty blackboard, as if without preparation, the magic formula of the world."

Deep within us lies a certain strain of longing for genius, a genius worship, that might be described as messianic: the hope that a genius will come along to save us from our technological, philosophical, spiritual, or aesthetic impasse. Is there anything wrong with cherishing this ideal?

What all the IQ tests, brain measurements, and supposed telltale pathologies show is that genius in a particular case can't be proved to exist, much less effectively predicted. It's not that there is no such thing as genius but, rather, that genius is an assessment or an accolade often retrospectively applied to an individual or an idea—not an identifiable essence.

The words we use shape the way we think. "Genius" has become too easy a word for us to say. The parallel here may in fact be addiction rather than religion: as a culture, we have become increasingly addicted to the idea of genius, so we are dependent on it for a certain kind of emulative high, an intoxication with the superlative. Nowadays it takes more and more genius, or more and more geniuses, to satisfy our craving. It may be time to go cold turkey for a while, to swear off the genius model to represent our highest aspirations for intellectual or artistic innovation. If we remind ourselves that what is really at stake is creativity and invention; if we can learn to separate the power of ideas from that of personality; then perhaps we will be less dazzled by the light of celebrity and less distracted by attempts to lionize the genius as a high-culture hero—as essence rather than force. It's not just another word that we need; it's another way of thinking about thinking.
