Are racial categories still an important—or even a valid—tool of government policy? In recent years the debate in America has been between those who think that race is paramount and those who think it is increasingly irrelevant, and in the next election cycle this debate will surely intensify around a California ballot initiative that would all but prohibit the state from asking its citizens what their racial backgrounds are. But the ensuing polemics will only obscure the more fundamental question: What, when each generation is more racially and ethnically mixed than its predecessor, does race even mean anymore? If your mother is Asian and your father is African-American, what, racially speaking, are you? (And if your spouse is half Mexican and half Russian Jewish, what are your children?)

Five decades after the end of legal segregation, and only thirty-six years after the Supreme Court struck down anti-miscegenation laws, young African-Americans are considerably more likely than their elders to claim mixed heritage. A study by the Population Research Center, in Portland, Oregon, projects that the black intermarriage rate will climb dramatically in this century, to a point at which 37 percent of African-Americans will claim mixed ancestry by 2100. By then more than 40 percent of Asian-Americans will be mixed. Most remarkable of all, by century's end the number of Latinos claiming mixed ancestry will be more than twice the number claiming a single background.

Not surprisingly, intermarriage rates for all groups are highest in the states that serve as immigration gateways. By 1990 Los Angeles County had an intermarriage rate five times the national average. Latinos and Asians, the groups that have made up three quarters of immigrants over the past forty years, have helped to create a climate in which ethnic or racial intermarriage is more accepted today than ever before. Nationally, whereas only 8 percent of foreign-born Latinos marry non-Latinos, 32 percent of second-generation and 57 percent of third-generation Latinos marry outside their ethnic group. Similarly, whereas only 13 percent of foreign-born Asians marry non-Asians, 34 percent of second-generation and 54 percent of third-generation Asian-Americans do.

Meanwhile, as everyone knows, Latinos are now the largest minority group in the nation. Two thirds of Latinos, in turn, are of Mexican heritage. This is significant in itself, because their sheer numbers have helped Mexican-Americans do more than any other group to alter the country's old racial thinking. For instance, Texas and California, where Mexican-Americans are the largest minority, were the first two states to abolish affirmative action: when the collective "minority" populations in those states began to outnumber whites, the racial balance that had made affirmative action politically viable was subverted.

Many Mexican-Americans now live in cities or regions where they are a majority, changing the very idea of what it means to be a member of a "minority" group. Because of such demographic changes, a number of the policies designed to integrate nonwhites into the mainstream—affirmative action in college admissions, racial set-asides in government contracting—have been rendered more complicated or even counterproductive in recent years. In California cities where whites have become a minority, it is no longer clear what "diversity" means or what the goals of integration policies should be. The selective magnet-school program of the Los Angeles Unified School District, for example, was originally developed as an alternative to forced busing—a way to integrate ethnic-minority students by encouraging them to look beyond their neighborhoods. Today, however, the school district is 71 percent Latino, and Latinos' majority status actually puts them at a disadvantage when applying to magnet schools.

But it is not merely their growing numbers (they will soon be the majority in both California and Texas, and they are already the single largest contemporary immigrant group nationwide) that make Mexican-Americans a leading indicator of the country's racial future; rather, it's what they represent. They have always been a complicating element in the American racial system, which depends on an oversimplified classification scheme. Under the pre-civil-rights formulation, for example, if you had "one drop" of African blood, you were fully black. The scheme couldn't accommodate people who were part one thing and part another. Mexicans, who are a product of intermingling—both cultural and genetic—between the Spanish and the many indigenous peoples of North and Central America, have a history of tolerating and even reveling in such ambiguity. Since the conquest of Mexico, in the sixteenth century, they have practiced mestizaje—racial and cultural synthesis—both in their own country and as they came north. Unlike the English-speaking settlers of the western frontier, the Spaniards were willing everywhere they went to allow racial and cultural mixing to blur the lines between themselves and the natives. The fact that Latin America is far more heavily populated by people of mixed ancestry than Anglo America is the clearest sign of the difference between the two outlooks on race.

Nativists once deplored the Mexican tendency toward hybridity. In the mid-nineteenth century, at the time of the conquest of the Southwest, Secretary of State James Buchanan feared granting citizenship to a "mongrel race." And in the late 1920s Representative John C. Box, of Texas, warned his colleagues on the House Immigration and Naturalization Committee that the continued influx of Mexican immigrants could lead to the "distressing process of mongrelization" in America. He argued that because Mexicans were the products of mixing, they harbored a relaxed attitude toward interracial unions and were likely to mingle freely with other races in the United States.

Box was right. The typical cultural isolation of immigrants notwithstanding, those immigrants' children and grandchildren are strongly oriented toward the American melting pot. Today two thirds of multiracial and multi-ethnic births in California involve a Latino parent. Mexicanidad, or "Mexicanness," is becoming the catalyst for a new American cultural synthesis.

In the same way that the rise in the number of multiracial Americans muddles U.S. racial statistics, the growth of the Mexican-American mestizo population has begun to challenge the Anglo-American binary view of race. In the 1920 census Mexicans were counted as whites. Ten years later they were reassigned to a separate Mexican "racial" category. In 1940 they were officially reclassified as white. Today almost half the Latinos in California, which is home to a third of the nation's Latinos (most of them of Mexican descent), check "other" as their race. In the first half of the twentieth century Mexican-American advocates fought hard for the privileges that came with being white in America. But since the 1960s activists have sought to reap the benefits of being nonwhite minorities. Having spent so long trying to fit into one side or the other of the binary system, Mexican-Americans have become numerous and confident enough to simply claim their brownness—their mixture. This is a harbinger of America's future.

The original melting-pot concept was incomplete: it applied only to white ethnics (Irish, Italians, Poles, and so forth), not to blacks and other nonwhites. Israel Zangwill, the playwright whose 1908 drama The Melting Pot popularized the concept, even wrote that whites were justified in avoiding intermarriage with blacks. In fact, multiculturalism—the ideology that promotes the permanent coexistence of separate but equal cultures in one place—can be seen as a by-product of America's exclusion of African-Americans from the melting pot; those whom assimilation rejected came to reject assimilation. Although the multicultural movement has always encompassed other groups, blacks gave it its moral impetus.

But the immigrants of recent decades are helping to forge a new American identity, something more complex than either a melting pot or a confederation of separate but equal groups. And this identity is emerging not as a result of politics or any specific public policies but because of powerful underlying cultural forces. To be sure, the civil-rights movement was instrumental in the initial assault on racial barriers. And immigration policies since 1965 have tended to favor those immigrant groups—Asians and Latinos—who are most open to intermarriage. But in recent years the government's major contribution to the country's growing multiracialism has been—as it should continue to be—a retreat from dictating limits on interracial intimacy and from exalting (through such policies as racial set-asides and affirmative action) race as the most important American category of being. As a result, Americans cross racial lines more often than ever before in choosing whom to sleep with, marry, or raise children with.

Unlike the advances of the civil-rights movement, the future of racial identity in America is unlikely to be determined by politics or the courts or public policy. Indeed, at this point perhaps the best thing the government can do is to acknowledge changes in the meaning of race in America and then get out of the way. The Census Bureau's decision to allow Americans to check more than one box in the "race" section of the 2000 Census was an important step in this direction. No longer forced to choose a single racial identity, Americans are now free to identify themselves as mestizos—and with this newfound freedom we may begin to endow racial issues with the complexity and nuance they deserve.
