Two milestones in the history of American education are converging this spring. The second is reshaping the legacy of the first.

The first landmark moment will arrive May 17, with the 60th anniversary of the Supreme Court's Brown v. Board of Education decision striking down "separate but equal" segregation in public education. The second watershed will follow in June, with the completion of what is likely to be the last school year ever in which a majority of America's K-12 public-school students are white.

That demographic transformation is both reinvigorating and reframing Brown's fundamental goal of ensuring educational opportunity for all Americans. The unanimous 1954 Brown decision was a genuine hinge in American history. Although its mandate to dismantle segregated public schools initially faced "massive resistance" across the South, the ruling provided irresistible moral authority to the drive for legal equality that culminated in the passage of the Civil Rights and Voting Rights acts a decade later.

Coming nearly 60 years after the Supreme Court had upheld segregation in the 1896 Plessy v. Ferguson decision, Chief Justice Earl Warren's ringing opinion in Brown was the belated midcourse correction that began America's transformation into a truly multiracial nation.

But a distinct note of disenchantment is surfacing as scholars and advocates assess Brown's legacy. "Brown was unsuccessful in its purported mission—to undo the school segregation that persists as a central feature of American public education today," Richard Rothstein, a veteran liberal educational analyst, declared in a paper this month.

That seems excessively pessimistic. Just before Brown, only about one in seven African-Americans, compared with more than one in three whites, held a high school degree. Today, the Census Bureau reports, the share of all African-American adults holding high school degrees (85 percent) nearly equals the share of whites (89 percent); blacks have slightly surpassed whites on that measure among young adults ages 25 to 29.

Before Brown, only about one in 40 African-Americans earned a college degree. Now more than one in five hold one. Educational advances have also keyed other gains, including the growth of a substantial black middle class and health gains that have cut the white-black gap in life expectancy at birth by more than half since 1950.

Yet many other disparities remain. Whites (especially from more affluent families) still complete college at much higher rates than African-Americans. That's one reason census figures show the median income for African-American families remains only about three-fifths that for whites, not much better than in 1967. Hispanics, now the largest minority group, are likewise making clear gains but still trail whites and blacks on the key measures of educational attainment, on some fronts substantially.

Brown's core mission of encouraging integration is best described as unfinished. Many civil-rights advocates, such as Gary Orfield, co-director of the Civil Rights Project at UCLA, argue that after gains through the late 1980s, the public-school system is undergoing a "resegregation" that has left African-American and Latino students "experiencing more isolation … [than] a generation ago." Other analysts question whether segregation is worsening, but no one denies that racial and economic isolation remains daunting: One recent study found that three-fourths of African-Americans and two-thirds of Hispanics attend schools where a majority of the students qualify as low-income.

The second big educational milestone arriving this spring should recast the debate over the first. From Brown to the ongoing affirmative-action debates that the Supreme Court revisited this week, fairness has been the strongest argument for measures meant to provide educational chances for all. But as our society diversifies, broadening the circle of opportunity has become a matter not only of equity but also of competitiveness.

The National Center for Education Statistics recently projected that minorities will become a majority of the K-12 public-school student body for the first time in 2014—and that majority will steadily widen. As recently as 1997, whites represented more than three-fifths of public-school students. Nor is the transformation limited to a few immigration hubs: Minorities now represent a majority in 310 of the 500 largest public-school districts, federal statistics show.

These minority young people are the nation's future workers, consumers, and taxpayers. If more of them don't obtain the education and training to reach the middle class, the U.S. "will be a poorer and less competitive society," says Rice University sociologist Steven Murdock, former Census Bureau director under George W. Bush and the author of Changing Texas, a recent book on that state's demographic transformation.

The increasing diversity and shrinking white share of America's youth population complicates Brown's original aim of promoting integrated schools. But that change only adds greater urgency to the decision's broader goal of ensuring all young people the opportunity to develop their talents.

The barriers to fulfilling that vision, from family breakdown to persistent residential and educational segregation, remain formidable. The difference is that as our society grows inexorably more diverse, the consequences of failing to overcome those barriers are rising—for all Americans. "These are realities," says Murdock, "that we are going to have to live with whether we are left, right, or in between."
