Nobel's Hits and Misses

Alfred Nobel invented dynamite and smokeless powder, and mistakenly hoped that those discoveries would make war too horrible to be countenanced. He held high hopes, too, that his annual five prizes for men who had conferred “the greatest benefit on mankind” would diminish nationalism and cement peace. What have the Nobel Prizes achieved? With award time approaching, Harvard historian Donald Fleming provides an illuminating survey of the Nobel Prizes and some speculation on the 1966 winners.


Yet despite fantastic omissions and dubious awards, the luster of the Nobel Prizes has remained absolutely undimmed as the most glittering recognition of intellect that can come to a man or woman of the twentieth century. Soon the drama will begin all over again with a new cast of anywhere from three to ten people. The prizewinners and their biographers have left many accounts of the experience, only to be compared with the letting down of a ladder from heaven in the lives of the saints.

The golden moment will gild the rest of a lifetime. The prizewinner has been lifted up above his professional associates, authenticated as a world figure by the only genuine stamp. Why, exactly, have the Nobel Prizes riveted the attention of the twentieth century as no other distinctions have done? The answer is that they reflect and epitomize some of the principal historical transformations of the age, and more than this, they embody the psychological tensions that profound historical change produces. In the century that has seen the waning of nationalism as an untroubled faith, the Nobel Prizes have symbolized the harmonious world community that cannot seem to get born but clearly must. One of them is actually for peace and harmony among nations, but in a larger sense all of them together undertake to single out contributions from any source to the welfare of the entire human race. At the same time, in a world where nationalism, however tarnished morally, is still the mainspring of practical affairs, the prizes lend themselves to tabulation according to nationality in a kind of spiritual Olympics.

The other great paradox that the prizes have ministered to has been the ever increasing prestige of science and technology in a rapidly secularizing era, but an era that clings all the more desperately to the ideal of service to humanity as the only viable relic of traditional religion and the only bulwark against the abuse of science itself. The Nobel Prizes have been tacitly consecrated for the mind of the twentieth century by an association between service to humanity and the advancement of science.

It is easy to see how all of these trends were refracted through Alfred Nobel. He was a cosmopolitan who lived in many countries, including Russia and the United States. He was one of Darwin’s congregation, a nonbeliever in religion, but a secular humanitarian as well. He was a genius at applied science, the inventor of dynamite and smokeless powder, and equally good at parlaying his discoveries into a worldwide industrial empire. Finally, he was a bachelor who could dispose of his entire fortune as he chose.

This was the man who on his death in 1896 left a will devoting the vast bulk of his fortune to the annual award of five prizes, without regard to nationality, for those who “in the preceding year” had conferred “the greatest benefit on mankind” by “the most important discovery or invention” in physics; “the most important chemical discovery or improvement”; “the most important discovery within the domain of physiology or medicine”; “the most outstanding work of an idealistic tendency” in literature; and “the most or best work for fraternity among nations, for the abolition or reduction of standing armies and for the holding and promotion of peace congresses.” The prizes for physics and chemistry were to be awarded by the Swedish Academy of Sciences; for physiology or medicine by the Caroline Institute in Stockholm; for literature by the Swedish Academy; for peace by a committee chosen by the Norwegian Parliament (Storting). The early prizes were worth about $42,000. Their cash value had dropped by 1950 to about $32,000, but in the 1960s they have risen to about $52,000. Of course, $42,000 in 1901 had the purchasing power of at least $100,000 today.

Nobel’s will, though clear enough, was technically defective. Some members of his family tried to break it for ostensibly high-minded reasons and had to be bought off by the executors. Perhaps as a cover for this, the family insisted on further restrictions upon the terms of award. The awarding bodies themselves, once the prize-giving got under way, inevitably built up a body of common-law interpretations of the mandate that Nobel and his relatives had given.

The result has been that severe limitations have diminished the quality and scope of the Nobel Prizes. Three of the limitations were imposed by Nobel himself: that literature had to be “idealistic” to qualify; that science meant a discovery, invention, or improvement, with the narrow definition of “discovery” implied by his coupling it with the other terms; that all prizes should be for the work of the preceding year. In practice, the awarding bodies loosened up the last requirement by making the award for “recent” contributions, or for contributions of which the full significance had only recently been grasped. In itself, this policy was a great gain for flexibility; but combined with another perfectly sensible rule, it lent itself to grave injustices. The scientific juries early saw the wisdom of waiting till discoveries were proved to be sound. Yet by the time a discovery was thoroughly authenticated, it might have lost the magical attribute of being “recent” and be out of the running unless newly appreciated at some later date. But this meant that an absolutely fundamental discovery which had gone on slowly but surely building itself into the very fabric of modern science might never experience any sensational “re-discovery” or sudden burst of new relevance, because it was relevant everywhere and all the time.

Some of the greatest discoveries fell between the stools of soundness and recency. As a corollary to this, the scientific juries have consistently enforced the principle that a man cannot accumulate “credit” towards a Nobel Prize by making a number of unrelated discoveries. If any one of them is important enough, it can be rewarded on its merits, but a really notable discovery that has been passed over will do nothing to eke out the claims of another contribution by the same man. The system is loaded against the versatile or ranging intellect.

Two other major restrictions were imposed by Nobel’s relatives: that no prize should be shared by more than three persons, and that no prize should be conferred upon a dead man unless he had been recommended for the award before his death.

Apart from these standing rules, the history of the prizes has been affected by two specific problems. In the first dozen years or so, the awarding bodies were confronted with a backlog of famous writers, peace agitators, and scientists who had made their names in the nineteenth century but lived on into the twentieth in a state of some vitality and productivity. Here, of course, the rule about recency was a big help in enabling the juries to set certain legendary figures gently aside. Even so, both the juries and the general public boggled at discarding the giants who had been at the height of their powers as late as 1885 or 1890. The result was that there were far too many towering senior figures still in the running for all of them to be squeezed into the first dozen years, even if there had been any way to space them out in the order of their more or less impending deaths.

The other problem that has cut clean across the prizes has been the occurrence of the two world wars, both of which led to the suspension of individual prizes or the entire set of prizes. In the turmoils of the twentieth century, no other people could have kept the prizes going as well as the Swedes. Yet the fact remains that the omission of awards when combined with the recency requirement was calculated to cheat some people out of the prizes they deserved.

The quality of the actual prizes that began to be awarded in 1901 has been a compound of the limitations enforced by general rules and special problems and the judgment displayed by the awarding bodies within these constraints. How good, and how bad, have the selections been? There is no authoritative answer to that, but here are one man’s impressions.

The number of outright blunders is extremely small. It was preposterous to honor the complacently bellicose Theodore Roosevelt as a man of peace in 1906, even if he did serve as a broker for winding up the Russo-Japanese War. Rudolf Eucken, a deservedly forgotten philosopher who was never important, was a scandalous choice in literature. Most people would now agree that Pearl Buck was another bad choice, and some would add John Steinbeck. At any rate, the strategic moment for honoring Steinbeck at the height of his reputation was certainly missed by more than twenty years. Only one award seems to have arisen out of sheer ignorance of the facts—J. J. R. Macleod’s equal share in the prize given to Sir Frederick Banting, the discoverer of insulin. Macleod’s sole contribution consisted in providing laboratory space and giving some general advice. Banting’s true collaborators were C. H. Best and J. B. Collip. They should have been included in the prize and Macleod left out. He is the only palpably undistinguished investigator in the whole list of laureates in science. The other principal contender for this title, Johannes Fibiger, was at least honored for his own work, on an alleged form of cancer, but it is now virtually certain that his basic conclusions were wrong. The Caroline Institute learned its lesson all too well. In the forty years since Fibiger was honored, there have been no awards for cancer research whatever. The original mistake was pardonable. The absolutely fossilized disregard of all subsequent research on cancer is a more grievous, indeed aggravated, failure.

If we consider the average caliber of each series of prizes with due regard to people who were passed over, the record is mixed. In an odd way, the prizes that are hardest to find fault with, and simultaneously the most disappointing, have been for peace. If the Nobel committee set any store by naval disarmament in the 1920s, Charles Evans Hughes should have won. Of the recent prizes, it is possible to wonder whether Father Georges Pire’s admirable work with refugees has really contributed to international peace. Martin Luther King’s achievements would hardly have fallen within Nobel’s own definition. Yet King’s prize, and Chief Albert John Luthuli’s, represented belated recognition of the principle of nonviolent resistance exemplified by Gandhi. Anybody who thinks that Gandhi ought to have won is in no position to object to the others.

Given the impossible task of rewarding people for a service that nobody has yet discovered how to perform, the Norwegians have acquitted themselves creditably.

Truly distinguished literature has been produced in the twentieth century, and here the record of the Swedish Academy is inexcusably bad. In addition to most of the giants of world literature, the non-winners have included Anna Akhmatova, Aleksandr Blok, Karel Capek, Jaroslav Hasek (of The Good Soldier Schweik), Stefan George, Arthur Schnitzler, Hugo von Hofmannsthal, Robert Musil, Paul Claudel, André Malraux, Miguel de Unamuno, Ortega y Gasset, Italo Svevo, George Meredith, H. G. Wells, Katherine Mansfield, E. M. Forster, Virginia Woolf, Dylan Thomas, William James, Theodore Dreiser (the runner-up to Sinclair Lewis in 1930), Edith Wharton, Scott Fitzgerald, Ezra Pound, and Robert Frost. Prizes for Swinburne and Paul Valéry were in the making when they died—not exactly prematurely: they were both in their seventies.

The glut of secondary Scandinavian writers is notorious; but the magnificent Swedish neutrality was vindicated by snubbing the only two Scandinavian writers of genius. Neither Ibsen nor Strindberg, neither Tolstoy nor Chekhov, neither Rilke nor Proust, neither Henry James nor Mark Twain nor Joseph Conrad—how could such a record be compiled except as a joke? The answer is a combination of severely restrictive rules capriciously applied by narrow men.

The small man and the inglorious opportunity were well met in Carl David af Wirsén, the Permanent Secretary of the Swedish Academy and dominant figure in the Nobel Prizes for literature till his death in 1912. He was a man of limited horizons, vindictive, and a bigot in literature. Strindberg was out because he had satirized Wirsén—only one man even bothered to nominate him. Tolstoy was more of a problem—foolish people would go on nominating him, but Wirsén was equal to the occasion. War and Peace and Anna Karenina were great novels, agreed, but Tolstoy’s recent work was full of detestable opinions on art, government, and civilization. The Academy would seem to be endorsing these, and that was out of the question. The Ibsen menace was dispatched by saying that he was past his prime. It was not that Wirsén was hobbled by a foolish consistency. He tried repeatedly to get the prize for Swinburne, a writer singularly devoid of any content, idealistic or otherwise, who had lost his touch a mere thirty years before.

When Wirsén died in 1912 he left behind kindred spirits adept at sniffing out any trace of irony, acerbity, gloom, pessimism, skepticism, cynicism, or fatalism as the hoofprints of an unidealistic tendency. Anatole France finally slipped through in 1921 on the delightful argument that his works couldn’t have been written by Zola. Thomas Hardy remained impossible. Sinclair Lewis in 1930 was tarred with skepticism, but the Academy was looking desperately for an American winner, the main alternative was the fatalist Dreiser, and a wholesome face could be put upon the whole affair by describing Babbitt as a piece of “high-class American humor.” The scruples worked in reverse. Lewis had his reputation as a debunker to lose, he would be sadly compromised among his cronies if the charge of idealistic tendencies could be made to stick, and he let it be understood that nowadays all this meant was that he hadn’t written solely for commercial gain. As a matter of fact, after 1930 the idealistic proviso does not seem to have made much, if any, difference. The dam had burst.

This does not mean that the record since 1930 has been satisfactory. Of the winners in the decade of the thirties (Lewis, Karlfeldt, Galsworthy, Bunin, Pirandello, O’Neill, Martin du Gard, Pearl Buck, Sillanpää), only Pirandello would now be generally recognized as a major writer of secure reputation. The winners since the Second World War—including Gide, Eliot, Faulkner, Mauriac, Hemingway, Camus, Pasternak, and Sartre—probably constitute a higher proportion of the most notable living writers than in any previous period. But the roster is weakened by the increasingly evident determination to single out the literature of previously neglected nationalities (Chilean, Icelandic, Yugoslav, Greek). In this respect, Nobel’s injunction to disregard nationality has been turned on its head.

The most damning indictment of the whole list of prizes in literature is their smugly unadventurous character. With the arguable exceptions of O’Neill, Pirandello, Eliot, and Hemingway, no prize has been given for work that was markedly experimental in technique. Neither Proust nor Joyce nor Virginia Woolf is represented. No bold experiment in literary subject matter has been recognized till the result was no longer in doubt and the power of the Nobel Prize to affect the outcome absolutely nil. The revolution in candor about sex led by Proust and D. H. Lawrence was firmly ignored till the prize for O’Neill in 1936, followed at a long interval by the selection of Gide in 1947. Even with the most conventional themes and techniques, the Swedish Academy has been leadenly cautious. With the clear exception of Yeats and the possible exceptions of O’Neill, Camus, and Sartre, no author has been caught while his career was still on the upswing: the average age of the winners has been over sixty. Only seven men, including Kipling and Camus, have been recognized in their forties. It is easy to see how this dismal record came about. The Academy wanted to be sure about the winners’ ultimate stature. But this is quite simply a violation of Nobel’s entire purpose. He wanted to recognize the most impressive recent book, not to set the seal upon the work of a lifetime or to reward the capacity for literary and physical endurance. This policy would probably have produced some colossal blunders by the lights of posterity, but then the existing record is uneven too.

Not surprisingly, the Nobel Prizes in science have been more impressive than the others. Yet even here the record in physiology or medicine is not as good as in physics or chemistry. This is not entirely the fault of the Caroline Institute. Medical science is inherently more diverse than modern physics or chemistry—potentially as diverse as the tissues, organs, and functions of the human body and the innumerable ills to which they fall prey—and never likely to rise to “first principles” in the way that the physical sciences have increasingly done. The upshot is that there are more medical scientists more nearly on a par at the head of their profession than there are physicists or chemists. Very few of the actual winners in physiology or medicine have been unsuitable, so that the substitution of others might have produced injustices in turn. There have been too many potential laureates. As if in recognition of this, the Caroline Institute has recently identified various nonwinners whom its Nobel Committee has regarded as “prizeworthy” but for some reason passed over in favor of other investigators.

Granted the difficulties, it is still shocking that none of the following prizeworthy men was actually honored: E. H. Starling (one of the founders of the hormone concept); F. W. Twort and Félix d’Hérelle (independent discoverers of the bacteriophage); Sir Thomas Lewis (the pioneer in the interpretation of electrocardiograms); Walter B. Cannon (one of the three discoverers of chemical neuro-transmission and one of the greatest hormone investigators); E. V. McCollum (the most prolific single discoverer of vitamins and the pioneer student of trace elements in human nutrition); Peyton Rous (the discoverer of the viral transmission of tumors in fowl).

The following major figures were apparently never adjudged prizeworthy: Joseph von Mering and Oscar Minkowski (the discoverers of the role of the pancreas in diabetes, the basis of insulin therapy); Clemens von Pirquet and Béla Schick (two of the three principal discoverers of allergy); David Keilin (the elucidator of the respiratory enzyme cytochrome); Hans Berger (the inventor of the electroencephalogram for tracing brain waves); O. T. Avery (the discoverer with two associates of DNA as the carrier of heredity). If Sir Charles Sherrington, the greatest of all neuro-physiologists, had not lived to the age of seventy-five, he too would figure in this list. He had been futilely placed in nomination no fewer than 134 times, beginning in 1902, before he was finally awarded a share of the prize for 1932. Perhaps the single most frustrating experience was that of the American Ross G. Harrison, who was actually recommended for the prize in 1917 for his epoch-making innovation of tissue culture, only to lose out because the award of the prize was suspended for the duration of the First World War. When the issue was raised again later, the discovery was dismissed as “too old” and in a longer perspective not important enough. The latter judgment was simply mistaken.

Approached from the other side, of major advances, rather than great investigators, in medicine that do not figure in the annals of the Nobel Prize, the following were deliberately passed over on the grounds that there were too many contributors involved: the discovery of sex hormones, the discovery of vitamin D and its functions, the introduction of local anesthesia, and the fenestration operation to restore hearing.

The record of the Swedish Academy of Sciences in awarding the prizes for physics and chemistry is much harder to criticize. By far the most dubious prizes, in no way disreputable but hardly up to the ordinary standard, have been three or four for rather limited contributions to technology. Apart from the unpardonable omission of Mendeleev and Willard Gibbs, the only truly imposing figure of the past whom the Academy can be accused of missing is the American chemist G. N. Lewis, one of the founders of modern valency theory. The American Wallace Hume Carothers, the inventor of nylon, and the Englishman F. S. Kipping, who laid the theoretical foundations for the use of silicones in industry, were both dead before the practical importance of their research had fully emerged. The Academy liquidated a long-standing reproach against itself by finally giving a prize to the great organic chemist R. B. Woodward of Harvard in 1965.

Only about six hundred people will have won a Nobel Prize in the whole course of the twentieth century. What difference will it make to the rest of the human race? What good and what harm have the Nobel Prizes done to society?

One of the main objects that Nobel had in view was to reduce nationalism by focusing upon contributions to the world community. Here his aim has been achieved in two ways. He permanently reduced the claustrophobic aspects of life in Scandinavia by forcing the Swedes and to a lesser extent the Norwegians to be on the alert for constructive achievements anywhere in the world. On a wider scale but less intensively, the Nobel Prizes have conferred a unique international visibility upon men and women of many nationalities and annually reminded each nation of its indebtedness to the others. The effects of this are impalpable, but not to be despised.

Nobel’s other principal object was to call attention to what the winners had contributed and make it easier for them to contribute more. Many authors have acquired a wider audience for their work, particularly in translation, including the motley troop shepherded into American editions by Alfred and Blanche Knopf. On the other hand, the prospect of the authors’ paths being smoothed for further bursts of creativity has been greatly reduced by the high average age of the winners in this category.

The scientific winners have been younger. Besides that, it is easier to conceive of intellectually productive improvements in the working conditions of a scientist than of a writer. There is no doubt that a scientist can write his own ticket after he gets the one accolade that everybody has heard of. If he breaks a long drought of Nobel Prize winners in his country or institution, his leverage becomes tremendous. The most sensational recent example came last year when the geneticists François Jacob, André Lwoff, and Jacques Monod of the Pasteur Institute in Paris shared the first Nobel Prize in science awarded to any Frenchman in thirty years. They and their colleagues had been trying for some time to take over the administrative council of the institute from a body of antique and highly conservative politicians who were dragging their feet about getting the scientists’ salaries up to the level of the Sorbonne, stalling on the construction of a building for molecular biology, and refusing to accept financial support from the French government. There were no signs that this revolt was getting anywhere till the bolt of lightning from Stockholm. The three winners turned the inevitable press conference into a fierce assault upon the generally bad conditions for scientific research in France. Within two weeks of their return in triumph from Stockholm, the scientists had gained control of the institute and the old council was on the way out.

This is the credit side of the ledger, but every single item can be turned inside out to demonstrate that the Nobel Prizes have done considerable harm as well. The Nobel Foundation itself has published a tabulation by nationalities, and lists of the winners almost invariably give nationalities. According to the Nobel Foundation’s own rather arbitrary reckoning, generally but not always by citizenship at the time of award, 87 Americans have shared in 63 prizes, 58 winners from Great Britain in 50 prizes, 52 Germans in 50, 38 Frenchmen in 32, 16 Swedes in 16, 12 Swiss in 11, and 12 Russians in 9. Tabulations of this kind, and the mentality they reflect and foster, have infected the Nobel Prizes themselves with the nationalistic tendencies that Nobel was trying to reduce.

This is not the only numbers racket which the Nobel Prizes have created. The scientific standing of American universities is frequently correlated with their roster of Nobel laureates. For what it is worth, and that is a big question, here is one man’s reckoning of the score to date. By any standard, the top of the league is Harvard. Counting men and women on the faculty when they got the prize, Harvard has had 13 participants in 11 prizes, Columbia 8 in 7, Berkeley 8 in 6, Caltech 6 in 6, the Rockefeller Institute 5 in 4, Washington University, St. Louis, 4 in 3, Bell Laboratories 4 in 2, Chicago 2 in 2, Cornell 2 in 2, and Stanford 2 in 2. By the standard of Nobel Prizes received for work done in whole or part at the institution in question, Harvard has had 13 participants in 10 prizes, Columbia 9 in 7, Berkeley 7 in 5, Chicago 5 in 5, Stanford 5 in 4, Washington, St. Louis, 5 in 4, the Rockefeller Institute 4 in 4, and Caltech 4 in 4. Of Nobel Prize winners on the faculty in June, 1966, and not emeritus, Harvard had 8, Berkeley 7, plus 1 on leave as chairman of the Atomic Energy Commission, Stanford 5, Caltech 3, and Columbia 3. By the standard of prizes awarded since 1960, Harvard has had 5 participants in 5 prizes, Berkeley 2 in 2, Columbia 1 in 1, and no other American university any at all. It is a curious fact that no member of the MIT faculty has ever received a Nobel Prize for work done there.

As long as everybody remembers that these rankings are bound to shift about over the years, what harm can they possibly do? The answer is that here, as elsewhere, the use of the Nobel Prizes as a yardstick encourages a narrow and unbalanced conception of modern science. There are no Nobel Prizes in mathematics, astronomy, geology, psychology, or social science, let alone engineering, and no commensurate forms of recognition to be acquired. What is more, there is a basic asymmetry in the Nobel Prizes that do exist. As distinguished from the breadth of the mandate in physics or chemistry, there is no prize in biology in general, merely in physiology or medicine. Karl von Frisch, the discoverer of the language of bees, and Konrad Lorenz, the discoverer of “imprinting” in young animals—that is, the process by which they find out what kind of animal they are—have been turned down for Nobel Prizes on the ground that their work does not bear directly upon human beings. But the most serious consequence of the narrow mandate in physiology or medicine has been the exclusion of all students of evolution. If Charles Darwin had been living in the twentieth century, he could never have won a Nobel Prize.

Even if there had been a Nobel Prize in biology, the probabilities are that Darwin could not have won it for the doctrine of evolution through natural selection. For Nobel’s insistence upon a “discovery,” “invention,” or “improvement” as the occasion for the awards he did establish was calculated to rule out the great synthesizing concepts by which “discoveries” in the narrow sense are encompassed and elicited.

In physiology, for which there is a Nobel Prize, the most stimulating concept to be formulated in the twentieth century has been the doctrine of “homeostasis,” the self-regulating tendencies of the human organism under stress. The enunciator of this concept, Walter B. Cannon of Harvard, never got a Nobel Prize, and on the occasions when he was adjudged prizeworthy, it was not for homeostasis. Albert Einstein did win the Nobel Prize in physics for 1921, but the citation deliberately avoided any reference to the theory of relativity, spoke vaguely of his unspecified “services to theoretical physics,” and hastened on to the safe ground of rewarding him for his “discovery of the law of the photoelectric effect.” Sir Charles Sherrington’s magisterial concept of the “integrative action of the nervous system” did not figure in his citation either.

The Nobel system has operated to exclude the greatest ideas in science, the integrating concepts that keep it from flying apart into a million isolated fragments. Anybody solely dependent on following the Nobel citations would be imbibing a narrowly positivistic conception of science as an accumulation of many hard little pellets of empirical knowledge to be shaken free of any conceptual matrix in which they were unaccountably embedded. It is a peculiarly end-of-the-nineteenth-century view, comprehensible in a man of Nobel’s generation and outlook but now hopelessly antiquated as a way of looking at science and the dynamics of scientific progress.

In the measure that laymen, including university presidents, form their impressions of science from the Nobel Prizes, they are missing the true scope of science and some of the greatest scientific contributions. It is improbable that many scientists are misled in the same way. The danger with them is that some of the most brilliant young men will confine their ambitions within the terms of the Nobel Prizes for which they are already bucking at the start of their careers. They will understand that it will do them no good to be deeply thoughtful about their work unless they make clear-cut empirical discoveries, or at any rate, predictions of empirical discoveries subsequently verified; and that if they make the discoveries or predictions, the deep thoughtfulness will not improve their chances. The discovery of a new technique or therapy, or better still, an elusive nuclear particle, will cut more ice than the most profound conceptual clarification. No doubt there are many factors that push a young scientist in this direction. The point is simply that the Nobel Prizes do nothing to redress the balance. On the contrary, they tend to penalize the deepest insights into nature.

Whether the Nobel Prizes have done more harm than good must remain a matter of opinion. They have certainly done more harm than is commonly suspected. But the practical question is whether they could do less harm and more good in the future.

The most profitable experiment that could be made with the prize in literature would be to go back to Nobel’s express intention of honoring a recent book rather than a life’s achievement. If this were done, it ought to be combined with much greater receptivity to experimental work, either in technique or in content. For example, a real service could be performed by identifying the best representatives of the theater of the absurd and differentiating them from the authors of meretricious work who are merely exploiting a vogue. Such awards might conceivably be “worse” in retrospect than under the present system, but they would be useful, which is more than can be said for crowning authors in their sixties after their reputations are securely established.

In science, there would be no insuperable difficulty in reinterpreting the prize for physiology or medicine to take account of the best contributions to evolutionary science. In the age of space travel, the physics prize could and should be stretched to include astrophysics. In all three scientific fields, the Nobel committees should frankly move beyond Alfred Nobel’s narrow conception of a “discovery” to recognize the significance of insights and conceptual clarifications.

Meanwhile the curtain is going up. Who will win this time? There has never been any telling about that, but here are some deserving people who appear to be qualified under the present system. In chemistry, Neil Bartlett of the University of British Columbia was the first to demonstrate that the so-called “inert” or “noble” gases could form stable compounds. In physiology or medicine, all of the following men richly deserve a Nobel Prize: Albert Lehninger of Johns Hopkins, the discoverer of the function of the mitochondrion in effecting energy transfers in cells; Murray Barr of the University of Western Ontario, the discoverer with E. G. Bertram of sex chromatin, by which the genetic sex of any human cell can be determined (already adjudged prizeworthy); J. H. Tjio and Albert Levan, working in Sweden, the discoverers of the true number of human chromosomes (already adjudged prizeworthy); the Englishman C. E. Ford, who first correlated human pathological conditions with the possession of too many or too few chromosomes (now the accepted explanation of mongolism). It is probably too late to hope that a prize will be given to the Australian Sir Norman Gregg for his discovery of the effect of German measles in early pregnancy upon the fetus, or the great student of the brain Wilder Penfield of McGill University. The men who have made the most spectacular recent discovery are Henry Harris and J. F. Watkins of Oxford, who succeeded in 1965 in producing the first hybrid cells between different animal species, including a cross between human cells and mouse cells. Harris and Watkins may well have to wait a couple of years for a Nobel Prize, but it is difficult to believe that they can be passed over for very long.

In literature there has never been a Japanese winner; the Swedish Academy is lusting after new literatures to embrace, and the novelist Yukio Mishima would be a respectable figure by world standards. There are at least three Italian writers of appropriate stature, the novelist Alberto Moravia and the poets Giuseppe Ungaretti and Eugenio Montale. Samuel Beckett is too much to hope for. But perhaps the most distinguished recipient would be either the greatest living poet in English, Robert Graves, or the greatest living poet in Spanish, Pablo Neruda.
