Anna Nicole Smith's role as a harbinger of the future is not widely acknowledged. Born Vickie Lynn Hogan, Smith first came to the attention of the American public in 1993, when she earned the title Playmate of the Year. In 1994 she married J. Howard Marshall, a Houston oil magnate said to be worth more than half a billion dollars. He was eighty-nine and wheelchair-bound; she was twenty-six and quiveringly mobile. Fourteen months later Marshall died. At his funeral the widow appeared in a white dress with a vertical neckline. She also claimed that Marshall had promised half his fortune to her. The inevitable litigation sprawled from Texas to California and occupied batteries of lawyers, consultants, and public-relations specialists for more than seven years.

Even before Smith appeared, Marshall had disinherited his older son. And he had infuriated his younger son by lavishing millions on a mistress, an exotic dancer, who then died in a bizarre face-lift accident. To block Marshall senior from squandering on Smith money that Marshall junior regarded as rightfully his, the son seized control of his father's assets by means that the trial judge later said were so "egregious," "malicious," and "fraudulent" that he regretted being unable to fine the younger Marshall more than $44 million in punitive damages.

In its epic tawdriness the Marshall affair was natural fodder for the tabloid media. Yet one aspect of it may soon seem less a freak show than a cliché. If an increasingly influential group of researchers is correct, the lurid spectacle of intergenerational warfare will become a typical social malady.

The scientists' argument is circuitous but not complex. In the past century U.S. life expectancy has climbed from forty-seven to seventy-seven, increasing by nearly two thirds. Similar rises happened in almost every country. And this process shows no sign of stopping: according to the United Nations, by 2050 global life expectancy will have increased by another ten years. Note, however, that this tremendous increase has been in average life expectancy—that is, the number of years that most people live. There has been next to no increase in the maximum lifespan, the number of years that one can possibly walk the earth—now thought to be about 120. In the scientists' projections, the ongoing increase in average lifespan is about to be joined by something never before seen in human history: a rise in the maximum possible age at death.

Stem-cell banks, telomerase amplifiers, somatic gene therapy—the list of potential longevity treatments incubating in laboratories is startling. Three years ago a multi-institutional scientific team led by Aubrey de Grey, a theoretical geneticist at Cambridge University, argued in a widely noted paper that the first steps toward "engineered negligible senescence"—a rough-and-ready version of immortality—would have "a good chance of success in mice within ten years." The same techniques, de Grey says, should be ready for human beings a decade or so later. "In ten years we'll have a pill that will give you twenty years," says Leonard Guarente, a professor of biology at MIT. "And then there'll be another pill after that. The first hundred-and-fifty-year-old may have already been born."

Critics regard such claims as wildly premature. In March ten respected researchers predicted in the New England Journal of Medicine that "the steady rise in life expectancy during the past two centuries may soon come to an end," because rising levels of obesity are making people sicker. The research team leader, S. Jay Olshansky, of the University of Illinois School of Public Health, also worries about the "potential impact of infectious disease." Believing that medicine can and will overcome these problems, his "cautious and I think defensibly optimistic estimate" is that the average lifespan will reach eighty-five or ninety—in 2100. Even this relatively slow rate of increase, he says, will radically alter the underpinnings of human existence. "Pushing the outer limits of lifespan" will force the world to confront a situation no society has ever faced before: an acute shortage of dead people.

The twentieth-century jump in life expectancy transformed society. Fifty years ago senior citizens were not a force in electoral politics. Now the AARP is widely said to be the most powerful organization in Washington. Medicare, Social Security, retirement, Alzheimer's, snowbird economies, the population boom, the golfing boom, the cosmetic-surgery boom, the nostalgia boom, the recreational-vehicle boom, Viagra—increasing longevity is entangled in every one. Momentous as these changes have been, though, they will pale before what is coming next.

From religion to real estate, from pensions to parent-child dynamics, almost every aspect of society is based on the orderly succession of generations. Every quarter century or so children take over from their parents—a transition as fundamental to human existence as the rotation of the planet about its axis. In tomorrow's world, if the optimists are correct, grandparents will have living grandparents; children born decades from now will ignore advice from people who watched the Beatles on The Ed Sullivan Show. Intergenerational warfare—the Anna Nicole Smith syndrome—will be but one consequence. Trying to envision such a world, sober social scientists find themselves discussing pregnant seventy-year-olds, offshore organ farms, protracted adolescence, and lifestyles policed by insurance companies. Indeed, if the biologists are right, the coming army of centenarians will be marching into a future so unutterably different that they may well feel nostalgia for the long-ago days of three score and ten.

The oldest in vitro fertilization clinic in China is located on the sixth floor of a no-star hotel in Changsha, a gritty fly-over city in the south-central portion of the country. It is here that the clinic's founder and director, Lu Guangxiu, pursues her research into embryonic stem cells.

Most cells don't divide, whatever elementary school students learn—they just get old and die. The body subcontracts out the job of replacing them to a special class of cells called stem cells. Embryonic stem cells—those in an early-stage embryo—can grow into any kind of cell: spleen, nerve, bone, whatever. Rather than having to wait for a heart transplant, medical researchers believe, a patient could use stem cells to grow a new heart: organ transplant without an organ donor.

The process of extracting stem cells destroys an early-stage embryo, which has led the Bush administration to place so many strictures on stem-cell research that scientists complain it has been effectively banned in this country. A visit to Lu's clinic not long ago suggested that ultimately Bush's rules won't stop anything. Capitalism won't let them.

During a conversation Lu accidentally brushed some papers to the floor. They were faxes from venture capitalists in San Francisco, Hong Kong, and Stuttgart. "I get those all the time," she said. Her operation was short of money—a chronic problem for scientists in poor countries. But it had something of value: thousands of frozen embryos, an inevitable by-product of in vitro fertilizations. After obtaining permission from patients, Lu uses the embryos in her work. It is possible that she has access to more embryonic stem cells than all U.S. researchers combined.

Sooner or later, in one nation or another, someone like Lu will cut a deal: frozen embryos for financial backing. Few are the stem-cell researchers who believe that their work will not lead to tissue-and-organ farms, and that these will not have a dramatic impact on the human lifespan. If Organs 'Я' Us is banned in the United States, Americans will fly to longevity centers elsewhere. As Stephen S. Hall wrote in Merchants of Immortality, biotechnology increasingly resembles the software industry. Dependence on venture capital, loathing of regulation, pathological secretiveness, penchant for hype, willingness to work overseas—they're all there. Already the U.S. Patent Office has issued 400 patents concerning human stem cells.

Longevity treatments will almost certainly drive up medical costs, says Dana Goldman, the director of health economics at the RAND Corporation, and some might drive them up significantly. Implanted defibrillators, for example, could constantly monitor people's hearts for signs of trouble, electrically regulating the organs when they miss a beat. Researchers believe that the devices would reduce heart-disease deaths significantly. At the same time, Goldman says, they would by themselves drive up the nation's health-care costs by "many billions of dollars" (Goldman and his colleagues are working on nailing down how much), and they would be only one of many new medical interventions. In developed nations anti-retroviral drugs for AIDS typically cost about $15,000 a year. According to James Lubitz, the acting chief of the aging and chronic-disease statistics branch of the CDC National Center for Health Statistics, there is no a priori reason to suppose that lifespan extension will be cheaper, that the treatments will have to be administered less frequently, or that their inventors will agree to be less well compensated. To be sure, as Ramez Naam points out in More Than Human, which surveys the prospects for "biological enhancement," drugs inevitably fall in price as their patents expire. But the same does not necessarily hold true for medical procedures: heart bypass operations are still costly, decades after their invention. And in any case there will invariably be newer, more effective, and more costly drugs. Simple arithmetic shows that if 80 million U.S. senior citizens were to receive $15,000 worth of treatment every year, the annual cost to the nation would be $1.2 trillion—"the kind of number," Lubitz says, "that gets people's attention."
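Lubitz's closing arithmetic is easy to verify. A minimal sketch in Python, using only the figures quoted above (80 million seniors, $15,000 per person per year):

```python
# Back-of-envelope check of the cost figure cited by Lubitz.
seniors = 80_000_000       # projected U.S. senior citizens
cost_per_person = 15_000   # dollars per year, the anti-retroviral benchmark

annual_total = seniors * cost_per_person
print(f"${annual_total / 1e12:.1f} trillion per year")  # $1.2 trillion per year
```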

The potential costs are enormous, but the United States is a rich nation. As a share of gross domestic product the cost of U.S. health care roughly doubled from 1980 to the present, explains David M. Cutler, a health-care economist at Harvard. Yet unlike many cost increases, this one signifies that people are better off. "Would you rather have a heart attack with 1980 medicine at the 1980 price?" Cutler asks. "We get more and better treatments now, and we pay more for the additional services. I don't look at that and see an obvious disaster."

The critical issue, in Goldman's view, will be not the costs per se but determining who will pay them. "We're going to have a very public debate about whether this will be covered by insurance," he says. "My sense is that it won't. It'll be like cosmetic surgery—you pay out of pocket." Necessarily, a pay-as-you-go policy would limit access to longevity treatments. If high-level anti-aging therapy were expensive enough, it could become a perk for movie stars, politicians, and CEOs. One can envision Michael Moore fifty years from now, still denouncing the rich in political tracts delivered through the next generation's version of the Internet—neural implants, perhaps. Donald Trump, a 108-year-old multibillionaire in 2054, will be firing the children of the apprentices he fired in 2004. Meanwhile, the maids, chauffeurs, and gofers of the rich will stare mortality in the face.

Short of overtly confiscating rich people's assets, it would be hard to avoid this divide. Yet as Goldman says, there will be "furious" political pressure to avert the worst inequities. For instance, government might mandate that insurance cover longevity treatments. In fact, it is hard to imagine any democratic government foolhardy enough not to guarantee access to those treatments, especially when the old are increasing in number and political clout. But forcing insurers to cover longevity treatments would only change the shape of the social problem. "Most everyone will want to take [the treatment]," Goldman says. "So that jacks up the price of insurance, which leads to more people uninsured. Either way, we may be bifurcating society."

Ultimately, Goldman suggests, the government would probably end up paying outright for longevity treatments: an enormous new entitlement program. How could it be otherwise? Older voters would want it because it is in their interest; younger ones would want it because they, too, will age. "At the same time," he says, "nobody likes paying taxes, so there would be constant pressure to contain costs."

To control spending, the program might give priority to people with healthy habits; no point in retooling the genomes of smokers, risk takers, and addicts of all kinds. A kind of reverse eugenics might occur, in which governments would freely allow the birth of people with "bad" genes but would let nature take its course on them as they aged. Having shed the baggage of depression, addiction, mental retardation, and chemical-sensitivity syndrome, tomorrow's legions of perduring old would be healthier than the young. In this scenario moralists and reformers would have a field day.

Meanwhile, the gerontocratic elite will have a supreme weapon against the young: compound interest. According to a 2004 study by three researchers at the London Business School, historically the average rate of real return on stock markets worldwide has been about five percent. Thus a twenty-year-old who puts $10,000 in the market in 2010 should expect by 2030 to have about $27,000 in real terms—a tidy increase. But that happy forty-year-old will be in the same world as septuagenarians and octogenarians who began investing their money during the Carter administration. If someone who turned seventy in 2010 had invested $10,000 when he was twenty, he would have about $115,000. In the same twenty-year period during which the young person's account grew from $10,000 to $27,000, the old person's account would grow from $115,000 to $305,000. Inexorably, the gap between them will widen.

The result would be a tripartite society: the very old and very rich on top, beta-testing each new treatment on themselves; a mass of the ordinary old, forced by insurance into supremely healthy habits, kept alive by medical entitlement; and the diminishingly influential young. In his novel Holy Fire (1996) the science-fiction writer and futurist Bruce Sterling conjured up a version of this dictatorship-by-actuary: a society in which the cautious, careful centenarian rulers, supremely fit and disproportionately affluent if a little frail, look down with ennui and mild contempt on their juniors. Marxist class warfare, upgraded to the biotech era!

In the past, twenty- and thirty-year-olds had the chance of sudden windfalls in the form of inheritances. Some economists believe that bequests from previous generations have provided as much as a quarter of the start-up capital for each new one—money for college tuitions, new houses, new businesses. But the image of an ingénue's getting a leg up through a sudden bequest from Aunt Tilly will soon be a relic of late-millennium romances.

Instead of helping their juniors begin careers and families, tomorrow's rich oldsters will be expending their disposable income to enhance their memories, senses, and immune systems. Refashioning their flesh to ever higher levels of performance, they will adjust their metabolisms on computers, install artificial organs that synthesize smart drugs, and swallow genetically tailored bacteria and viruses that clean out arteries, fine-tune neurons, and repair broken genes. Should one be reminded of H. G. Wells's The Time Machine, in which humankind is divided into two species, the ethereal Eloi and the brutish, underground-dwelling Morlocks? "As I recall," Goldman told me recently, "in that book it didn't work out very well for the Eloi."

When lifespans extend indefinitely, the effects are felt throughout the life cycle, but the biggest social impact may be on the young. According to Joshua Goldstein, a demographer at Princeton, adolescence will in the future evolve into a period of experimentation and education that will last from the teenage years into the mid-thirties. In a kind of wanderjahr prolonged for decades, young people will try out jobs on a temporary basis, float in and out of their parents' homes, hit the Europass-and-hostel circuit, pick up extra courses and degrees, and live with different people in different places. In the past the transition from youth to adulthood usually followed an orderly sequence: education, entry into the labor force, marriage, and parenthood. For tomorrow's thirtysomethings, suspended in what Goldstein calls "quasi-adulthood," these steps may occur in any order.

From our short-life-expectancy point of view, quasi-adulthood may seem like a period of socially mandated fecklessness—what Leon Kass, the chair of the President's Council on Bioethics, has decried as the coming culture of "protracted youthfulness, hedonism, and sexual license." In Japan, ever in the demographic forefront, as many as one out of three young adults is either unemployed or working part-time, and many are living rent-free with their parents. Masahiro Yamada, a sociologist at Tokyo Gakugei University, has sarcastically dubbed them parasaito shinguru, or "parasite singles." Adult offspring who live with their parents are common in aging Europe, too. In 2003 a report from the British Prudential financial-services group awarded the 6.8 million British in this category the mocking name of "kippers"—"kids in parents' pockets eroding retirement savings."

To Kass, the main cause of this stasis is "the successful pursuit of longer life and better health." Kass's fulminations easily lend themselves to ridicule. Nonetheless, he is in many ways correct. According to Yuji Genda, an economist at Tokyo University, the drifty lives of parasite singles are indeed a by-product of increased longevity, mainly because longer-lived seniors are holding on to their jobs. Japan, with the world's oldest population, has the highest percentage of working senior citizens of any developed nation: one out of three men over sixty-five is still on the job. Everyone in the nation, Genda says, is "tacitly aware" that the old are "blocking the door."

In a world of 200-year-olds "the rate of rise in income and status perhaps for the first hundred years of life will be almost negligible," the crusty maverick economist Kenneth Boulding argued in a prescient article from 1965. "It is the propensity of the old, rich, and powerful to die that gives the young, poor, and powerless hope." (Boulding died in 1993, opening up a position for another crusty maverick economist.)

Kass believes that "human beings, once they have attained the burdensome knowledge of good and bad, should not have access to the tree of life." Accordingly, he has proposed a straightforward way to prevent the problems of youth in a society dominated by the old: "resist the siren song of the conquest of aging and death." Senior citizens, in other words, should let nature take its course once humankind's biblical seventy-year lifespan is up. Unfortunately, this solution is self-canceling, since everyone who agrees with it is eventually eliminated. Opponents, meanwhile, live on and on. Kass, who is sixty-six, has another four years to make his case.

Increased longevity may add to marital strains. The historian Lawrence Stone was among the first to note that divorce was rare in previous centuries partly because people died so young that bad unions were often dissolved by early funerals. As people lived longer, Stone argued, divorce became "a functional substitute for death." Indeed, marriages dissolved at about the same rate in 1860 as in 1960, except that in the nineteenth century the dissolution was more often due to the death of a partner, and in the twentieth century to divorce. The corollary that children were as likely to live in households without both biological parents in 1860 as in 1960 is also true. Longer lifespans are far from the only reason for today's higher divorce rates, but the evidence seems clear that they play a role. The prospect of spending another twenty years sitting across the breakfast table from a spouse whose charm has faded must have already driven millions to divorce lawyers. Adding an extra decade or two can only exacerbate the strain.

Worse, child-rearing, a primary marital activity, will be even more difficult than it is now. For the past three decades, according to Ben J. Wattenberg, a senior fellow at the American Enterprise Institute, birth rates around the world have fallen sharply as women have taken advantage of increased opportunities for education and work outside the home. "More education, more work, lower fertility," he says. The title of Wattenberg's latest book, published in October, sums up his view of tomorrow's demographic prospects: Fewer. In his analysis, women's continuing movement outside the home will lead to a devastating population crash—the mirror image of the population boom that shaped so much of the past century. Increased longevity will only add to the downward pressure on birth rates, by making childbearing even more difficult. During their twenties, as Goldstein's quasi-adults, men and women will be unmarried and relatively poor. In their thirties and forties they will finally grow old enough to begin meaningful careers—the worst time to have children. Waiting still longer will mean entering the maelstrom of reproductive technology, which seems likely to remain expensive, alienating, and prone to complications. Thus the parental paradox: increased longevity means less time for pregnancy and child-rearing, not more.

Even when women manage to fit pregnancy into their careers, they will spend a smaller fraction of their lives raising children than ever before. In the mid-nineteenth century white women in the United States had a life expectancy of about forty years and typically bore five or six children. (I specify Caucasians because records were not kept for African-Americans.) These women literally spent more than half their lives caring for offspring. Today U.S. white women have a life expectancy of nearly eighty and bear an average of 1.9 children—below replacement level. If a woman spaces two births close together, she may spend only a quarter of her days in the company of offspring under the age of eighteen. Children will become ever briefer parentheses in long, crowded adult existences. It seems inevitable that the bonds between generations will fray.
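The "quarter of her days" figure follows directly from the numbers above. A minimal sketch, where the twenty-year window is the simplifying assumption that two closely spaced births keep a minor in the house for about two decades:

```python
def share_with_minors(years_with_minors, life_expectancy):
    # Fraction of a lifetime spent with children under eighteen at home.
    return years_with_minors / life_expectancy

# Two closely spaced births: the first child's eighteen years plus a short gap,
# against a roughly eighty-year life expectancy.
print(share_with_minors(20, 80))  # 0.25
```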

Purely from a financial standpoint, parenthood has always been a terrible deal. Mom and Dad fed, clothed, housed, and educated the kids, but received little in the way of tangible return. Ever since humankind began acquiring property, wealth has flowed from older generations to younger ones. Even in those societies where children herded cattle and tilled the land for their aged progenitors, the older generation consumed so little and died off so quickly that the net movement of assets and services was always downward. "Of all the misconceptions that should be banished from discussions of aging," F. Landis MacKellar, an economist at the International Institute for Applied Systems Analysis, in Austria, wrote in the journal Population and Development Review in 2001, "the most persistent and egregious is that in some simpler and more virtuous age children supported their parents."

This ancient pattern changed at the beginning of the twentieth century, when government pension and social-security schemes spread across Europe and into the Americas. Within the family parents still gave much more than they received, according to MacKellar, but under the new state plans the children in effect banded together outside the family and collectively reimbursed the parents. In the United States workers pay less to Social Security than they eventually receive; retirees are subsidized by the contributions of younger workers. But on the broadest level financial support from the young is still offset by the movement of assets within families—a point rarely noted by critics of "greedy geezers."

Increased longevity will break up this relatively equitable arrangement. Here concerns focus less on the super-rich than on middle-class senior citizens, those who aren't surfing the crest of compound interest. These people will face a Hobson's choice. On the one hand, they will be unable to retire at sixty-five, because the young would end up bankrupting themselves to support them—a reason why many would-be reformers propose raising the retirement age. On the other hand, it will not be feasible for most of tomorrow's nonagenarians and centenarians to stay at their desks, no matter how fit and healthy they are.

The case against early retirement is well known. In economic jargon the ratio of retirees to workers is known as the "dependency ratio," because through pension and Social Security payments people who are now in the work force funnel money to people who have left it. A widely cited analysis by three economists at the Organization for Economic Cooperation and Development estimated that in 2000 the overall dependency ratio in the United States was 21.7 retirees for every 100 workers, meaning (roughly speaking) that everyone older than sixty-five had five younger workers contributing to his pension. By 2050 the dependency ratio will have almost doubled, to 38 per 100; that is, each retiree will be supported by slightly more than two current workers. If old-age benefits stay the same, in other words, the burden on younger workers, usually in the form of taxes, will more than double.
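The dependency-ratio arithmetic is simple inversion: retirees per hundred workers flips into workers per retiree. A sketch using the OECD figures quoted above:

```python
def workers_per_retiree(retirees_per_100_workers):
    # Invert the OECD dependency ratio (retirees per 100 workers).
    return 100 / retirees_per_100_workers

print(round(workers_per_retiree(21.7), 1))  # 2000: about 4.6 workers per retiree
print(round(workers_per_retiree(38.0), 1))  # 2050: about 2.6
```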

This may be an underestimate. The OECD analysis did not assume any dramatic increase in longevity, or the creation of any entitlement program to pay for longevity care. If both occur, as gerontological optimists predict, the number of old will skyrocket, as will the cost of maintaining them. To adjust to these "very bad fiscal effects," says the OECD economist Pablo Antolin, one of the report's co-authors, societies have only two choices: "raising the retirement age or cutting the benefits." He continues, "This is arithmetic—it can't be avoided." The recent passage of a huge new prescription-drug program by an administration and Congress dominated by the "party of small government" suggests that benefits will not be cut. Raising the age of retirement might be more feasible politically, but it would lead to a host of new problems—see today's Japan.

In the classic job pattern, salaries rise steadily with seniority. Companies underpay younger workers and overpay older workers as a means of rewarding employees who stay at their jobs. But as people have become more likely to shift firms and careers, the pay increases have become powerful disincentives for companies to retain employees in their fifties and sixties. Employers already worried about the affordability of older workers are not likely to welcome calls to raise the retirement age; the last thing they need is to keep middle managers around for another twenty or thirty years. "There will presumably be an elite group of super-rich who would be immune to all these pressures," Ronald Lee, an economic demographer at the University of California at Berkeley, says. "Nobody will kick Bill Gates out of Microsoft as long as he owns it. But there will be a lot of pressure on the average old person to get out."

In Lee's view, the financial downsizing need not be inhumane. One model is the university, which shifted older professors to emeritus status, reducing their workload in exchange for reduced pay. Or, rather, the university could be a model: age-discrimination litigation and professors' unwillingness to give up their perks, Lee says, have largely torpedoed the system. "It's hard to reduce someone's salary when they are older," he says. "For the person, it's viewed as a kind of disgrace. As a culture we need to get rid of that idea."

The Pentagon has released few statistics about the hundreds or thousands of insurgents captured in Afghanistan and Iraq, but one can be almost certain that they are disproportionately young. Young people have ever been in the forefront of political movements of all stripes. University students protested Vietnam, took over the U.S. embassy in Tehran, filled Tiananmen Square, served as the political vanguard for the Taliban. "When we are forty," the young writer Filippo Marinetti promised in the 1909 Futurist Manifesto, "other younger and stronger men will probably throw us in the wastebasket like useless manuscripts—we want it to happen!"

The same holds true in business and science. Steve Jobs and Stephen Wozniak founded Apple in their twenties; Albert Einstein dreamed up special relativity at about the same age. For better and worse, young people in developed nations will have less chance to shake things up in tomorrow's world. Poorer countries, where the old have less access to longevity treatments, will provide more opportunity, political and financial. As a result, according to Fred C. Iklé, an analyst with the Center for Strategic and International Studies, "it is not fanciful to imagine a new cleavage opening up in the world order." On one side would be the "'bioengineered' nations," societies dominated by the "becalmed temperament" of old people. On the other side would be the legions of youth—"the protagonists," as the political theorist Samuel Huntington has described them, "of protest, instability, reform, and revolution."

Because poorer countries would be less likely to be dominated by a gerontocracy, tomorrow's divide between old and young would mirror the contemporary division between rich northern nations and their poorer southern neighbors. But the consequences might be different—unpredictably so. One assumes, for instance, that the dictators who hold sway in Africa and the Middle East would not hesitate to avail themselves of longevity treatments, even if few others in their societies could afford them. Autocratic figures like Arafat, Franco, Perón, and Stalin often leave the scene only when they die. If the human lifespan lengthens greatly, the dictator in Gabriel García Márquez's The Autumn of the Patriarch, who is "an indefinite age somewhere between 107 and 232 years," may no longer be regarded as a product of magical realism.

Bioengineered nations, top-heavy with the old, will need to replenish their labor forces. Here immigration is the economist's traditional solution. In abstract terms, the idea of importing young workers from poor regions of the world seems like a win-win solution: the young get jobs, the old get cheap service. In practice, though, host nations have found that the foreigners in their midst are stubbornly … foreign. European nations are wondering whether they really should have let in so many Muslims. In the United States, traditionally hospitable to migrants, bilingual education is under attack and the southern border is increasingly locked down. Japan, preoccupied by Nihonjinron (theories of "Japaneseness"), has always viewed immigrants with suspicion if not hostility. Facing potential demographic calamity, the Japanese government has spent millions trying to develop a novel substitute for immigrants: robots smart and deft enough to take care of the aged.

According to Ronald Lee, the Berkeley demographer, rises in life expectancy have in the past stimulated economic growth. Because they arose mainly from reductions in infant and child mortality, these rises produced more healthy young workers, which in turn led to more-productive societies. Believing they would live a long time, those young workers saved more for retirement than their forebears, increasing society's stock of capital—another engine of growth. But these positive effects are offset when increases in longevity come from old people's neglecting to die. Older workers are usually less productive than younger ones, earning less and consuming more. Worse, the soaring expenses of entitlement programs for the old are likely, Lee believes, "to squeeze out government expenditures on the next generation," such as education and childhood public-health programs. "I think there's evidence that something like this is already happening among the industrial countries," he says. The combination will force a slowdown in economic growth: the economic pie won't grow as fast. But there's a bright side, at least potentially. If the fall in birth rates is sufficiently vertiginous, the number of people sharing that relatively smaller pie may shrink fast enough to let everyone have a bigger piece. One effect of the longevity-induced "birth dearth" that Wattenberg fears, in other words, may be higher per capita incomes.

For the past thirty years the United States has financed its budget deficits by persuading foreigners to buy U.S. Treasury bonds. In the nature of things, most of these foreigners have lived in other wealthy nations, especially Japan and China. Unfortunately for the United States, those other countries are marching toward longevity crises of their own. They, too, will have fewer young, productive workers. They, too, will be paying for longevity treatments for the old. They, too, will be facing a grinding economic slowdown. For all these reasons they may be less willing to finance our government. If so, Uncle Sam will have to raise interest rates to attract investors, which will further depress growth—a vicious circle.

Longevity-induced slowdowns could make young nations more attractive as investment targets, especially for the cash-strapped pension-and-insurance plans in aging countries. The youthful and ambitious may well follow the money to where the action is. If Mexicans and Guatemalans have fewer rich old people blocking their paths, the river of migration may begin to flow in the other direction. In a reverse brain drain, the Chinese coast guard might discover half-starved American postgraduates stuffed into the holds of smugglers' ships. Highways out of Tijuana or Nogales might bear road signs telling drivers to watch out for norteamericano families running across the blacktop, the children's Hello Kitty backpacks silhouetted against a yellow warning background.

Given that today nobody knows precisely how to engineer major increases in the human lifespan, contemplating these issues may seem premature. Yet so many scientists believe that some of the new research will pay off, and that lifespans will stretch like taffy, that it would be shortsighted not to consider the consequences. And the potential changes are so enormous and hard to grasp that they can't be understood and planned for at the last minute. "By definition," says Aubrey de Grey, the Cambridge geneticist, "you live with longevity for a very long time."
