In March of 1863, a fugitive slave named Gordon found his way to the Union Army lines in Baton Rouge, Louisiana. Exhausted from his efforts to escape his slaveholders and their dogs, he showed up in tattered rags. When doctors examined him, they saw that his back was marred by a lattice of keloid scars, evidence of the severe whippings he’d endured in bondage. He was photographed, and the image of this former slave, stripped to the waist, with lash marks inscribed on his skin like a bas-relief, was widely distributed in the North—as indisputable evidence of the evil that had brought the nation to the brink of self-destruction. Unlike a slave narrative, Gordon’s ruined flesh could not be accused of hyperbole.

Gordon enlisted in the Union Army, and the image of his lacerated back came to represent an imperative in future struggles for racial equality. Merely highlighting the existence of injustice was insufficient; you had to show the brutal consequences of that injustice, as vividly as possible.

This kind of scar-bearing was an integral part of the twentieth-century movement to uproot Jim Crow, which reached its zenith sixty years ago this Saturday, with the Supreme Court’s ruling in Brown v. Board of Education. Thurgood Marshall’s assault on the edifice of segregation had been confounded by the question of whether the Fourteenth Amendment prohibited racial segregation. The Supreme Court’s decision in Plessy v. Ferguson, in 1896, had held that a putatively benign social separation could coexist with the amendment’s guarantee of equal protection under the law. The majority opinion, in fact, went so far as to argue that efforts to overturn segregation had been motivated by blacks’ misperceptions of the practice:

We consider the underlying fallacy of the plaintiff’s argument to consist in the assumption that the enforced separation of the two races stamps the colored race with a badge of inferiority. If this be so, it is not by reason of anything found in the act, but solely because the colored race chooses to put that construction upon it.

To combat the notion that the evils of segregation were so much hyperbole, Marshall and the other lawyers at the N.A.A.C.P. Legal Defense Fund called upon the psychologists Kenneth and Mamie Clark, whose famous “doll tests” had demonstrated that racism was damaging to the minds of black children. Beginning in 1939, the Clarks had conducted experiments showing that, when presented with two dolls identical in every way except color, black children consistently attributed favorable characteristics like beauty and intelligence to the white dolls, while reserving their most negative assessments for the dolls they most resembled. The Clarks’ work demonstrated that scars need not be visible in order to be indelible, and their data helped to bolster Marshall’s contention that racial separation violated the Fourteenth Amendment’s equal-protection clause.

The nascent civil-rights movement drew its moral authority, in some measure, from the image of African-Americans who were psychologically “damaged” by the legacy of slavery and the ongoing travesty of segregation. But those arguments, about the extent to which racism had wounded the African-American mind, have had a far more complicated legacy than the celebration of Brown would suggest. As the historian Daryl Michael Scott argues in his 1997 book, “Contempt and Pity”:

Liberals used damage imagery to play upon the sympathies of the white middle class. Oppression was wrong, they suggested, because it damaged personalities and changes had to be made to promote the well-being of African Americans. Rather than standing on the ideals of the American creed and making reparations for the nation’s failure to live up to the separate but equal doctrine set forth in Plessy v. Ferguson, liberals capitulated to the historic tendency of posing blacks as objects of pity.

Six decades after the Supreme Court struck down de-jure segregation, vast swaths of the American education system remain separated by race—indeed, there has been a trend toward resegregation in many areas, particularly in the South. But the most telling indicator of the ambiguous legacy of Brown may be the way we perceive the kinds of arguments that led to the decision.

In 1986, the anthropologist John Ogbu conducted a study of African-American academic performance, and he concluded that many black students viewed high educational achievement as a form of “acting white.” Ogbu’s conclusions were widely disputed by other researchers, yet the term—succinct in its oversimplification—leapt from scholarly journals into public debates about race. The Clarks’ doll tests were seen as an indictment of white racism, but the notion of “acting white”—fundamentally rooted in a similar tendency to ascribe virtue to whiteness—was nonetheless deployed as a means of pointing toward African-Americans’ own self-defeating behavior.

This rhetoric was not confined to white conservatives. In 2004, at a dinner sponsored by the N.A.A.C.P. Legal Defense Fund to mark the fiftieth anniversary of their victory in the Brown case, Bill Cosby departed from his notes and launched into a tirade against the shortcomings of impoverished African-Americans. Speaking of Kenneth Clark, by then an elderly widower, Cosby said:

Kenneth Clark, somewhere in his home in upstate New York … just looking ahead. Thank God, he doesn’t know what’s going on, thank God. But these people, the ones up here in the balcony fought so hard. Looking at the incarcerated, these are not political criminals. These are people going around stealing Coca-Cola. People getting shot in the back of the head over a piece of pound cake! Then we all run out and are outraged, “The cops shouldn’t have shot him.” What the hell was he doing with the pound cake in his hand?

Cosby’s remarks were applauded by many on the right, as well as by more than a few African-Americans. What was once considered “damage” had been transformed—by the passage of a few decades and by the insistence that racism was behind us now—into “pathology.” Cosby’s intemperate rhetoric tapped into a vein of frustration, seldom voiced in public, that, a half century beyond the most crucial judicial decision of the civil-rights era, the problems once attributed to legal segregation managed to persist. Despite Cosby’s invective, it was never clear where that frustration should be attributed. There are no metrics for how quickly a group should recover from legally enforced subordination, and no statistical rendering of ongoing racial inequalities could match the explanatory power of a “Colored Only” sign. If these complexities confounded people like Cosby, who’d actually lived through segregation, there was scant hope that they’d be readily perceived by many people who hadn’t.

Yet some things have remained constant. Alarmingly, versions of the Clarks’ doll test conducted in the past few years still yield results similar to those of the original experiments. In 2011, the sociologist Karolyn Tyson showed that concerns over “acting white” among black students tended to arise not in overwhelmingly black schools but precisely in settings in which black students were underrepresented. And yet, sixty years after Brown, the prevailing idea in these debates remains one that is similar to the argument presented in Plessy: that the major, and perhaps the only, problem with ongoing segregation is the way black people perceive and respond to it.

The United States may not be “post-racial,” as many claimed in the wake of Barack Obama’s election, but it clearly sees itself as post-racism, at least when it comes to explaining the color-coded disparities that still define the lives of millions of its citizens.

Jelani Cobb is Associate Professor of History and Director of the Institute for African American Studies at the University of Connecticut.