the curious cat


“When an Indian child has been brought up among us, taught our language and habituated to our customs, yet if he goes to see his relations and makes one Indian ramble with them, there is no persuading him ever to return. [But] when white persons of either sex have been taken prisoners young by the Indians, and lived a while among them, tho’ ransomed by their friends, and treated with all imaginable tenderness to prevail with them to stay among the English, yet in a short time they become disgusted with our manner of life, and the care and pains that are necessary to support it, and take the first good opportunity of escaping again into the woods, from whence there is no reclaiming them.”
~ Benjamin Franklin

“The Indians, their old masters, gave them their choice and, without requiring any consideration, told them that they had been long as free as themselves. They chose to remain, and the reasons they gave me would greatly surprise you: the most perfect freedom, the ease of living, the absence of those cares and corroding solicitudes which so often prevail with us… all these and many more motives which I have forgot made them prefer that life of which we entertain such dreadful opinions. It cannot be, therefore, so bad as we generally conceive it to be; there must be in their social bond something singularly captivating, and far superior to anything to be boasted of among us; for thousands of Europeans are Indians, and we have no examples of even one of those Aborigines having from choice become Europeans! There must be something more congenial to our native dispositions than the fictitious society in which we live, or else why should children, and even grown persons, become in a short time so invincibly attached to it? There must be something very bewitching in their manners, something very indelible and marked by the very hands of nature. For, take a young Indian lad, give him the best education you possibly can, load him with your bounty, with presents, nay with riches, yet he will secretly long for his native woods, which you would imagine he must have long since forgot, and on the first opportunity he can possibly find, you will see him voluntarily leave behind all you have given him and return with inexpressible joy to lie on the mats of his fathers…

“Let us say what will of them, of their inferior organs, of their want of bread, etc., they are as stout and well made as the Europeans. Without temples, without priests, without kings and without laws, they are in many instances superior to us, and the proofs of what I advance are that they live without care, sleep without inquietude, take life as it comes, bearing all its asperities with unparalleled patience, and die without any kind of apprehension for what they have done or for what they expect to meet with hereafter. What system of philosophy can give us so many necessary qualifications for happiness? They most certainly are much more closely connected to nature than we are; they are her immediate children…”
~ J. Hector St. John de Crèvecœur

Western, educated, industrialized, rich, and democratic. Such societies, as the acronym goes, are WEIRD. But what exactly makes them weird?

This occurred to me while reading Sebastian Junger’s Tribe. Much of what gets attributed to these WEIRD descriptors has been around for more than half a millennium, at least since European colonial imperialism began. From the moment Europeans settled in the Americas, a significant number of colonists chose to live among the natives because in many ways that lifestyle was a happier and healthier way of living, with less stress and work, less poverty and inequality, not only lacking arbitrary political power but also allowing far more personal freedom, especially for women.

Today, when we bother to think much about the problems we face, we mostly blame them on the side effects of modernity. But colonial imperialism began when Europe was still under the sway of monarchies, state churches, and feudalism. There was nothing WEIRD about Western civilization at the time.

Those earlier Europeans hadn’t yet started to think of themselves in terms of a broad collective identity such as ‘Westerners’. They weren’t particularly well educated, not even the upper classes. Industrialization was centuries away. As for being rich, there was some wealth back then but it was limited to a few and even for those few it was rather unimpressive by modern standards. And Europeans back then were extremely anti-democratic.

Since European colonists were generally no more WEIRD than various native populations, we must look for other differences between them. Why did so many Europeans choose to live among the natives? Why did so many captured Europeans who were adopted into tribes refuse or resist being ‘saved’? And why did colonial governments have to create laws and enforce harsh punishments to try to stop people from ‘going native’?

European society, on both sides of the ocean, was severely oppressive and violent. That was particularly true in Virginia, which was built on the labor of indentured servants, at a time when most were worked to death before getting the opportunity for release from bondage. They had plenty of reason to seek the good life among the natives. But life was less than pleasant in the other colonies as well. A similar pattern repeated itself.

Thomas Morton went off with some men into the wilderness to start their own community, where they commingled with the natives. This set an intolerable example that threatened Puritan social control, and so the Puritans destroyed their community of the free. Roger Williams, a Puritan minister, took on the mission to convert the natives but found himself converted instead. He fled Puritan oppression because, as he put it, the natives were more civilized than the colonists. Much later, Thomas Paine, living near still-free Indian tribes, observed that their communities demonstrated greater freedom, self-governance, and natural rights than the colonies. He hoped Americans could take that lesson to heart. Other founders, from John Adams to Thomas Jefferson, looked admiringly to the example of their Indian neighbors.

The point is that whatever is problematic about Western society has been that way for a long time. Modernity has worsened this condition of unhappiness and dysfunction. There is no doubt that becoming more WEIRD has made us ever more weird, but Westerners were plenty weird from early on before they became WEIRD. Maybe the turning point for the Western world was the loss of our own traditional cultures and tribal lifestyles, as the Roman model of authoritarianism spread across Europe and became dominant. We have yet to shake off these chains of the past and instead have forced them upon everyone else.

It is what some call the Wetiko, one of the most infectious and deadly of mind viruses. “The New World fell not to a sword but to a meme,” as Daniel Quinn stated it (Beyond Civilization, p. 50). But it is a mind virus that can only take hold after immunity is destroyed. As long as there were societies of the free, the contagion was contained because the sick could be healed. But the power of the contagion is that the rabidly infected feel an uncontrollable compulsion to attack and kill the uninfected, the very people who would offer healing. Then the remaining survivors become infected and spread it further. A plague of victimization until no one is left untouched, until there is nowhere else to escape. Once all alternatives are eliminated, once a demiurgic monoculture comes to power, we are trapped in what Philip K. Dick called the Black Iron Prison. Sickness becomes all we know.

The usefulness of taking note of contemporary WEIRD societies isn’t that WEIRDness is the disease but that it shows the full-blown set of symptoms of the disease. But the onset of the symptoms comes long after the infection, like a slow-growing malignant tumor in the brain. Still, symptoms are important, especially when there is a comparison to a healthy population. That is what the New World offered the European mind, a comparison. The earliest accounts of native societies in the Americas helped Europeans to diagnose their own disease and helped motivate them to begin looking for a cure, although the initial attempts were fumbling and inept. The first thing that some Europeans did was simply to imagine what a healthy community might look like. That is what Thomas More attempted to do five centuries ago with his book Utopia.

Maybe the key lies in social concerns. Utopian visions have always focused on the social aspect, typically describing how people would ideally live together in communities. That was also the focus of the founders when they sought out alternative possibilities for organizing society. The change with colonialism, as feudalism was breaking down, was a loss of belonging, of community and kinship. Modern individualism began not as an ideal but as a side effect of social breakdown, a condition of isolation and disconnection forced upon entire populations rather than freely chosen. And it was traumatizing to the Western psyche, and still is, as seen in the high rates of mental illness in WEIRD societies, especially in the hyper-individualistic United States. That began before the defining factors of the WEIRD took hold. It was that trauma that made the WEIRD possible.

The colonists, upon meeting natives, discovered what had been lost. And for many colonists, that loss had happened within living memory. The hunger for what was lost was undeniable. To have seen a traditional community that still functioned would have been like taking a breath of fresh air after having spent months in the stench of a ship’s hull. Not only did these native communities demonstrate what was recently lost but also what had been lost so much earlier. As many Indian tribes had more democratic practices, so did many European tribes prior to feudalism. But colonists had spent their entire lives being told democracy was impossible, that personal freedom was dangerous or even sinful.

The difference today is that none of this is within living memory for most of us, particularly Americans, unless one was raised Amish or in some similar community. The closest a typical American comes to this experience is by joining the military during wartime. That is one of the few opportunities for a modern equivalent to a tribe, at least within WEIRD societies. And maybe a large part of the trauma soldiers struggle with isn’t merely the physical violence of war but the psychological violence of returning to the state of alienation, the loss of a bond that was closer than that of their own families.

Sebastian Junger notes that veterans who return to strong social support experience low rates of long-term PTSD, which echoes Johann Hari’s argument about addiction in Chasing the Scream and his argument about depression in Lost Connections. Trauma, depression, addiction: these are consequences of, or are worsened by, isolation. Such responses are how humans cope under stressful and unnatural conditions.

The question for Western society isn’t so much why tribal life might be so appealing—it seems obvious on the face of it—but why Western society is so unappealing. On a material level it is clearly more comfortable and protected from the hardships of the natural world. But as societies become more affluent they tend to require more, rather than less, time and commitment by the individual, and it’s possible that many people feel that affluence and safety simply aren’t a good trade for freedom. One study in the 1960s found that nomadic !Kung people of the Kalahari Desert needed to work as little as twelve hours a week in order to survive—roughly one-quarter the hours of the average urban executive at the time. “The ‘camp’ is an open aggregate of cooperating persons which changes in size and composition from day to day,” anthropologist Richard Lee noted with clear admiration in 1968. “The members move out each day to hunt and gather, and return in the evening to pool the collected foods in such a way that every person present receives an equitable share… Because of the strong emphasis on sharing, and the frequency of movement, surplus accumulation… is kept to a minimum.”

The Kalahari is one of the harshest environments in the world, and the !Kung were able to continue living a Stone-Age existence well into the 1970s precisely because no one else wanted to live there. The !Kung were so well adapted to their environment that during times of drought, nearby farmers and cattle herders abandoned their livelihoods to join them in the bush because foraging and hunting were a more reliable source of food. The relatively relaxed pace of !Kung life—even during times of adversity—challenged long-standing ideas that modern society created a surplus of leisure time. It created exactly the opposite: a desperate cycle of work, financial obligation, and more work. The !Kung had far fewer belongings than Westerners, but their lives were under much greater personal control. […]

First agriculture, and then industry, changed two fundamental things about the human experience. The accumulation of personal property allowed people to make more and more individualistic choices about their lives, and those choices unavoidably diminished group efforts toward a common good. And as society modernized, people found themselves able to live independently from any communal group. A person living in a modern city or a suburb can, for the first time in history, go through an entire day—or an entire life—mostly encountering complete strangers. They can be surrounded by others and yet feel deeply, dangerously alone.

The evidence that this is hard on us is overwhelming. Although happiness is notoriously subjective and difficult to measure, mental illness is not. Numerous cross-cultural studies have shown that modern society—despite its nearly miraculous advances in medicine, science, and technology—is afflicted with some of the highest rates of depression, schizophrenia, poor health, anxiety, and chronic loneliness in human history. As affluence and urbanization rise in a society, rates of depression and suicide tend to go up rather than down. Rather than buffering people from clinical depression, increased wealth in a society seems to foster it.

Suicide is difficult to study among unacculturated tribal peoples because the early explorers who first encountered them rarely conducted rigorous ethnographic research. That said, there is remarkably little evidence of depression-based suicide in tribal societies. Among the American Indians, for example, suicide was understood to apply in very narrow circumstances: in old age to avoid burdening the tribe, in the ritual paroxysms of grief following the death of a spouse, in a hopeless but heroic battle with an enemy, and in an attempt to avoid the agony of torture. Among tribes that were ravaged by smallpox, it was also understood that a person whose face had been hideously disfigured by lesions might kill themselves. According to The Ethics of Suicide: Historical Sources, early chroniclers of the American Indians couldn’t find any other examples of suicide that were rooted in psychological causes. Early sources report that the Bella Coola, the Ojibwa, the Montagnais, the Arapaho, the Plateau Yuma, the Southern Paiute, and the Zuni, among many others, experienced no suicide at all.

This stands in stark contrast to many modern societies, where the suicide rate is as high as 25 cases per 100,000 people. (In the United States, white middle-aged men currently have the highest rate at nearly 30 suicides per 100,000.) According to a global survey by the World Health Organization, people in wealthy countries suffer depression at as much as eight times the rate they do in poor countries, and people in countries with large income disparities—like the United States—run a much higher lifelong risk of developing severe mood disorders. A 2006 study comparing depression rates in Nigeria to depression rates in North America found that across the board, women in rural areas were less likely to get depressed than their urban counterparts. And urban North American women—the most affluent demographic of the study—were the most likely to experience depression.

The mechanism seems simple: poor people are forced to share their time and resources more than wealthy people are, and as a result they live in closer communities. Inter-reliant poverty comes with its own stresses—and certainly isn’t the American ideal—but it’s much closer to our evolutionary heritage than affluence. A wealthy person who has never had to rely on help and resources from his community is leading a privileged life that falls way outside more than a million years of human experience. Financial independence can lead to isolation, and isolation can put people at a greatly increased risk of depression and suicide. This might be a fair trade for a generally wealthier society—but a trade it is. […]

The alienating effects of wealth and modernity on the human experience start virtually at birth and never let up. Infants in hunter-gatherer societies are carried by their mothers as much as 90 percent of the time, which roughly corresponds to carrying rates among other primates. One can get an idea of how important this kind of touch is to primates from an infamous experiment conducted in the 1950s by a primatologist and psychologist named Harry Harlow. Baby rhesus monkeys were separated from their mothers and presented with the choice of two kinds of surrogates: a cuddly mother made out of terry cloth or an uninviting mother made out of wire mesh. The wire mesh mother, however, had a nipple that dispensed warm milk. The babies took their nourishment as quickly as possible and then rushed back to cling to the terry cloth mother, which had enough softness to provide the illusion of affection. Clearly, touch and closeness are vital to the health of baby primates—including humans.

In America during the 1970s, mothers maintained skin-to-skin contact with babies as little as 16 percent of the time, which is a level that traditional societies would probably consider a form of child abuse. Also unthinkable would be the modern practice of making young children sleep by themselves. In two American studies of middle-class families during the 1980s, 85 percent of young children slept alone in their own room—a figure that rose to 95 percent among families considered “well educated.” Northern European societies, including America, are the only ones in history to make very young children sleep alone in such numbers. The isolation is thought to make many children bond intensely with stuffed animals for reassurance. Only in Northern European societies do children go through the well-known developmental stage of bonding with stuffed animals; elsewhere, children get their sense of safety from the adults sleeping near them.

The point of making children sleep alone, according to Western psychologists, is to make them “self-soothing,” but that clearly runs contrary to our evolution. Humans are primates—we share 98 percent of our DNA with chimpanzees—and primates almost never leave infants unattended, because they would be extremely vulnerable to predators. Infants seem to know this instinctively, so being left alone in a dark room is terrifying to them. Compare the self-soothing approach to that of a traditional Mayan community in Guatemala: “Infants and children simply fall asleep when sleepy, do not wear specific sleep clothes or use traditional transitional objects, room share and cosleep with parents or siblings, and nurse on demand during the night.” Another study notes about Bali: “Babies are encouraged to acquire quickly the capacity to sleep under any circumstances, including situations of high stimulation, musical performances, and other noisy observances which reflect their more complete integration into adult social activities.”

As modern society reduced the role of community, it simultaneously elevated the role of authority. The two are uneasy companions, and as one goes up, the other tends to go down. In 2007, anthropologist Christopher Boehm published an analysis of 154 foraging societies that were deemed to be representative of our ancestral past, and one of their most common traits was the absence of major wealth disparities between individuals. Another was the absence of arbitrary authority. “Social life is politically egalitarian in that there is always a low tolerance by a group’s mature males for one of their number dominating, bossing, or denigrating the others,” Boehm observed. “The human conscience evolved in the Middle to Late Pleistocene as a result of… the hunting of large game. This required… cooperative band-level sharing of meat.”

Because tribal foragers are highly mobile and can easily shift between different communities, authority is almost impossible to impose on the unwilling. And even without that option, males who try to take control of the group—or of the food supply—are often countered by coalitions of other males. This is clearly an ancient and adaptive behavior that tends to keep groups together and equitably cared for. In his survey of ancestral-type societies, Boehm found that—in addition to murder and theft—one of the most commonly punished infractions was “failure to share.” Freeloading on the hard work of others and bullying were also high up on the list. Punishments included public ridicule, shunning, and, finally, “assassination of the culprit by the entire group.” […]

Most tribal and subsistence-level societies would inflict severe punishments on anyone who caused that kind of damage. Cowardice is another form of community betrayal, and most Indian tribes punished it with immediate death. (If that seems harsh, consider that the British military took “cowards” off the battlefield and executed them by firing squad as late as World War I.) It can be assumed that hunter-gatherers would treat their version of a welfare cheat or a dishonest banker as decisively as they would a coward. They may not kill him, but he would certainly be banished from the community. The fact that a group of people can cost American society several trillion dollars in losses—roughly one-quarter of that year’s gross domestic product—and not be tried for high crimes shows how completely de-tribalized the country has become.

Dishonest bankers and welfare or insurance cheats are the modern equivalent of tribe members who quietly steal more than their fair share of meat or other resources. That is very different from alpha males who bully others and openly steal resources. Among hunter-gatherers, bullying males are often faced down by coalitions of other senior males, but that rarely happens in modern society. For years, the United States Securities and Exchange Commission has been trying to force senior corporate executives to disclose the ratio of their pay to that of their median employees. During the 1960s, senior executives in America typically made around twenty dollars for every dollar earned by a rank-and-file worker. Since then, that figure has climbed to 300-to-1 among S&P 500 companies, and in some cases it goes far higher than that. The US Chamber of Commerce managed to block all attempts to force disclosure of corporate pay ratios until 2015, when a weakened version of the rule was finally passed by the SEC in a strict party-line vote of three Democrats in favor and two Republicans opposed.

In hunter-gatherer terms, these senior executives are claiming a disproportionate amount of food simply because they have the power to do so. A tribe like the !Kung would not permit that because it would represent a serious threat to group cohesion and survival, but that is not true for a wealthy country like the United States. There have been occasional demonstrations against economic disparity, like the Occupy Wall Street protest camp of 2011, but they were generally peaceful and ineffective. (The riots and demonstrations against racial discrimination that later took place in Ferguson, Missouri, and Baltimore, Maryland, led to changes in part because they attained a level of violence that threatened the civil order.) A deep and enduring economic crisis like the Great Depression of the 1930s, or a natural disaster that kills tens of thousands of people, might change America’s fundamental calculus about economic justice. Until then, the American public will probably continue to refrain from broadly challenging both male and female corporate leaders who compensate themselves far in excess of their value to society.

That is ironic, because the political origins of the United States lay in confronting precisely this kind of resource seizure by people in power. King George III of England caused the English colonies in America to rebel by trying to tax them without allowing them a voice in government. In this sense, democratic revolutions are just a formalized version of the sort of group action that coalitions of senior males have used throughout the ages to confront greed and abuse. Thomas Paine, one of the principal architects of American democracy, wrote a formal denunciation of civilization in a tract called Agrarian Justice: “Whether… civilization has most promoted or most injured the general happiness of man is a question that may be strongly contested,” he wrote in 1795. “[Both] the most affluent and the most miserable of the human race are to be found in the countries that are called civilized.”

When Paine wrote his tract, Shawnee and Delaware warriors were still attacking settlements just a few hundred miles from downtown Philadelphia. They held scores of white captives, many of whom had been adopted into the tribe and had no desire to return to colonial society. There is no way to know the effect on Paine’s thought process of living next door to a communal Stone-Age society, but it might have been crucial. Paine acknowledged that these tribes lacked the advantages of the arts and science and manufacturing, and yet they lived in a society where personal poverty was unknown and the natural rights of man were actively promoted.

In that sense, Paine claimed, the American Indian should serve as a model for how to eradicate poverty and bring natural rights back into civilized life.

The signs on every playground in my city, New York, say this: “Playground rules prohibit adults except in the company of children.”

Apparently, because any adult who simply wants to sit on a bench and watch kids at play could be a creep, it’s best to just ban them all. The idea that children and adults go naturally together has been replaced by distrust and disgust. [ . . . ]

By separating the generations this way, we are creating a new society, one that actively distrusts anyone who wants to help a kid other than his own. Compare this anxiety with what goes on in Japan. There the youngest kids wear bright yellow hats when they go to school.

“Doesn’t that put them in danger?” asked a friend I was telling about this. To her, a kid who calls attention to himself is a kid who could be attracting a predator.

But attracting adult attention is exactly what the yellow hats are supposed to do. In Japan, the assumption is that the easier it is to see children the easier it is for grown-ups to look out for them.

Japan’s belief is that children are our collective responsibility. America’s is that children are private treasures under constant threat of theft.

Which brings me to the flip side of our obsession with stranger danger: the idea that anytime a parent lets her kids do anything on their own, she is actually requiring the rest of us grown-ups to “baby-sit” them free of charge. [ . . . ]

It didn’t matter that he was perfectly well-behaved, only that when a store employee asked his age, he was deemed an unbearable burden to the store. The manager had him detained until his father could come pick him up.

This detention outraged many people, but a significant contingent sided with the store, saying that the employees there shouldn’t have had to “baby-sit” the boy.

But, but — no one did have to baby-sit him. He was just a person in public, albeit a young one.

Take “stranger danger,” the classic Halloween horror. Even when I was a kid, back in the “Bewitched” and “Brady Bunch” costume era, parents were already worried about neighbors poisoning candy. Sure, the folks down the street might smile and wave the rest of the year, but apparently they were just biding their time before stuffing us silly with strychnine-laced Smarties.

That was a wacky idea, but we bought it. We still buy it, even though Joel Best, a sociologist at the University of Delaware, has researched the topic and spends every October telling the press that there has never been a single case of any child being killed by a stranger’s Halloween candy. (Oh, yes, he concedes, there was once a Texas boy poisoned by a Pixy Stix. But his dad did it for the insurance money. He was executed.)

Anyway, you’d think that word would get out: poisoned candy not happening. But instead, most Halloween articles to this day tell parents to feed children a big meal before they go trick-or-treating, so they won’t be tempted to eat any candy before bringing it home for inspection. As if being full has ever stopped any kid from eating free candy!

So stranger danger is still going strong, and it’s even spread beyond Halloween to the rest of the year. Now parents consider their neighbors potential killers all year round. That’s why they don’t let their kids play on the lawn, or wait alone for the school bus: “You never know!” The psycho-next-door fear went viral.

Of nonfamily abductions in 1999, an estimated 115 children (90 of them reported) fit a stereotypical kidnapping by a stranger or slight acquaintance. Forty of those children were killed. That’s about 1 child out of every 750,000 kidnapped, and about 1 out of every 2 million killed.

Of all children reported missing (whether estimated or reported), 99.8% were returned home or located; the remainder were virtually all runaways.

Most of these reports involve family (through noncustodial abduction or kicking a child out), a child’s own action as a runaway (for whatever cause and whatever duration), or accidents. All of these problems can be mitigated by various means, but none of them fit our picture of not letting a kid walk down the street for fear that she or he will be snatched.

You wouldn’t know any of this from reading typical parental advice regarding stranger danger.

The problem is that the very premise makes it seem as if this is a situation kids routinely face, something as common as, “Would your kids eat a cookie if someone offered it?” What is so hard to understand is, first of all, that the vast majority of crimes against children are committed not by strangers they meet at the park but by people they know. That means they are far likelier to encounter their abuser at the dinner table than at the park. So it is bizarre to keep acting as if the park is teeming with danger.

Secondly, there is something twisted and weird about only looking at risk when we think of kids. Every aspect of children’s lives is seen as somehow dangerous: what they’re eating, wearing, watching and doing and, of course, what could happen to them if they ever left the house.

Which, increasingly, we don’t let them do — despite a crime rate similar to that of 1963.

The concatenation of advantages and disadvantages is visible in economic sorting at the neighbourhood level, leading to social sorting in terms of schools, churches and community groups. Putnam writes: “Our kids are increasingly growing up with kids like them who have parents like us.” This represents, he warns, “an incipient class apartheid”.

For the well-educated, the phrase “our kids” may well bring to mind conditions of relative affluence, in which children grow up in a family with two married and attentive (even overattentive) parents; attend high-performing schools; and feel themselves embedded in a network of friends and mentors ready to help them navigate life’s challenges. By contrast, “their kids”—the kids of poor and working-class parents—face a world in which social capital is in short supply. As Mr. Putnam shows powerfully and poignantly—combining reporting with empirical analysis—the disparity results in too many children in nonaffluent circumstances feeling alone, emotionally stunted and unable to summon the will to climb today’s economic ladder into the middle or upper class.

The American dream is in crisis, Putnam argues, because Americans used to care about other people’s kids and now they only care about their own kids. But, he writes, “America’s poor kids do belong to us and we to them. They are our kids.”

“If it takes a village to raise a child, the prognosis for America’s children isn’t good: In recent years, villages all over America, rich and poor, have deteriorated as we’ve shirked collective responsibility for our kids,” Mr. Putnam wrote. “And most Americans don’t have the resources … to replace collective provision with private provision.” [ . . . ]

Mr. Putnam, whose 2000 book Bowling Alone looked at declining civic ties among adults, argues that students in poverty growing up in the middle of the last century had greater economic and social mobility than their counterparts do today in large part because adults at all socioeconomic levels were more likely then to see all students as “our kids.”

PS: Sure, but Herrnstein’s point was that it could be genetic, it could be nurture, but that sort of mating is happening and it’s going to pose a huge inequality problem in this country.

RP: He’s right. And if it were just genetics, there might not be anything we could do about it. But if it’s partly just the resources that we’re investing in these kids, which is my thesis, that’s fixable in principle. That’s not like a law of genetics. My argument is basically we need to think of these kids coming from poor backgrounds and broken homes – they’re also our kids.

When I was growing up in Port Clinton 50 years ago, my parents talked about, “We’ve got to do things for our kids. We’ve got to pay higher taxes so our kids can have a better swimming pool, or we’ve got to pay higher taxes so we can have a new French department in school,” or whatever. When they said that, they did not just mean my sister and me — it was all the kids here in town, of all sorts. But what’s happened, and this is sort of the bowling alone story, is that over this last 30, 40, 50 years, the meaning of “our kids” has narrowed and narrowed and narrowed so that now when people say, “We’ve got to do something for our kids,” they mean MY biological kids. [ . . . ]

PS: But liberals like you make this argument about all kinds of things, like infrastructure or education: pay now or you’ll pay more later. Americans feel that they’re already paying enough in taxes and they don’t trust that those investments will be made efficiently enough.

RP: America’s best investment ever, in the whole history of our country, was to invest in the public high school and secondary school at the beginning of the 20th century. It dramatically raised the growth rate of America because it was a huge investment in human capital. The best economic analyses now say that investment in the public high schools in 1910 accounted for all of the growth of the American economy between then and about 1970. That huge investment paid off for everybody. Everybody in America had a higher income.

Now, some rich farmer could have said, “Well, why should I be paying for those other kids to go to high school? My kids are already off in Chicago and I don’t care about [other kids].” But most people in America didn’t. This was not something hatched in Washington – small town people got together and said, “Look, we ought to do this for our kids… We ought to have a high school so that every kid who grows up here — they’re all our kids — gets a good high school education.”