As a coda to yesterday’s article on Harvard University, where a committee is trying to ban undergraduates from joining social clubs, readers may be interested in a dissent offered by Steven Pinker, the influential psychology professor, who declared that the recommendation of his colleagues is “at odds with the ideals of a university.”

If you’re catching up, the Harvard committee argued that “allowing our students to pick their own social spaces and friends,” while not without value, is at odds with principles of “inclusiveness and equality,” and should be sacrificed in the name of progress.

This is a terrible recommendation, which is at odds with the ideals of a university.

A university is an institution with circumscribed responsibilities which engages in a contract with its students. Its main responsibility is to provide them with an education. It is not an arbiter over their lives, 24/7. What they do on their own time is none of the university’s business.

One of the essential values in higher education is that people can differ in their values, and that these differences can be constructively discussed. Harvard has a right to value mixed-sex venues everywhere, all the time, with no exceptions. If some of its students find value in private, single-sex associations, some of the time, a university is free to argue against, discourage, or even ridicule those choices. But it is not a part of the mandate of a university to impose these values on its students over their objections.

Universities ought to be places where issues are analyzed, distinctions are made, evidence is evaluated, and policies crafted to attain clearly stated goals. This recommendation is a sledgehammer which doesn’t distinguish between single-sex and other private clubs. It doesn’t target illegal or objectionable behavior such as drunkenness or public disturbances. Nor by any stretch of the imagination could it be seen as an effective, rationally justified, evidence-based policy tailored to reduce sexual assault.

This illiberal policy can only contribute to the impression in the country at large that elite universities are not dispassionate forums for clarifying values, analyzing problems, and proposing evidence-based solutions, but are institutions determined to impose their ideology and values on a diverse population by brute force.

Were I running a university, I would happily conform to the principles set forth by Pinker, a course that would best serve students of all sorts in my estimation.

But Harvard University and most other elite institutions of undergraduate education long ago rejected the proposition that “a university is an institution with circumscribed responsibilities which engages in a contract with its students. Its main responsibility is to provide them with an education. It is not an arbiter over their lives, 24/7. What they do on their own time is none of the university’s business.”

As prospective students who peruse Harvard’s residential life web page learn in the first sentence they read, “From the moment our Freshmen walk into Harvard Yard until they graduate they are surrounded by people dedicated to making Harvard ‘home.’” That is partly because Harvard buys into the core premise of the residential college model: that students spend more time outside the classroom than inside of the classroom, and living in an ordered community will help them learn from faculty and one another even as they take meals, attend events, and socialize.

But the particulars of residential life at elite institutions are also shaped by other imperatives: catering to the desires of consumerist parents and students; adhering to increasingly intrusive regulations from the federal government; protecting the safety of young people who are no longer considered adults; and conforming to a normative judgment that colleges should be home, or at least “home.”

Alan Jacobs, a professor at Baylor University, has written eloquently about that last piece. A residential college is not a home, he observed, but “a place where people from all over the world, from a wide range of social backgrounds, and with a wide range of interests and abilities, come to live together temporarily, for about 30 weeks a year, before moving on to their careers. It is an essentially public space, though with controls on ingress and egress to prevent chaos and foster friendship and fellowship.”

The significance of the distinction, described in the wake of a controversy at Yale University:

Residential colleges have long been defended as transitional spaces between the world of home and a fully independent adult life, and it would be a great mistake to think of them as merely continuing the ethos of home.

That would leave young people totally unprepared for that “adult life,” which I think we might, for the purposes of this discussion, define as that period during which there is no one to run to to demand control over other people’s Halloween costumes … by the time one gets to college one’s “public individuality” should be sufficiently developed that the wearing of costumes should be seen as an essentially trivial matter that students can deal with among themselves. If they can’t, the university needs to acknowledge that they’re dealing with some serious cases of arrested development.

In a fascinating article called “The Japanese Preschool’s Pedagogy of Peripheral Participation,” Akiko Hayashi and Joseph Tobin describe a twofold strategy commonly deployed in Japan to deal with preschoolers’ conflicts: machi no hoiku and mimamoru. The former means “caring by waiting”; the second means “standing guard.” When children come into conflict, the teacher makes sure the students know that she is present—she may even add, kamisama datte miterun, dayo (the gods too are watching)—but she does not intervene unless absolutely necessary. Even if the children start to fight she may not intervene; that will depend on whether a child is genuinely attempting to hurt another or the two are halfheartedly “play-fighting.”

The idea is to give children every possible opportunity to resolve their own conflicts—even past the point at which it might, to an American observer, seem that a conflict is irresolvable. This requires patient waiting; and of course one can wait too long—just as one can intervene too quickly. The mimamoru strategy is meant to reassure children that their authorities will not allow anything really bad to happen to them, though perhaps some unpleasant moments may arise. But those unpleasant moments must be tolerated, else how will the children learn to respond constructively and effectively to conflict—conflict which is, after all, inevitable in any social environment? And if children don’t begin to learn such responses in preschool, when will they learn it? Imagine if at university they had developed no such abilities and were constantly dependent on authorities to ease every instance of social friction.

What a mess that would be.

The child-development expert Erika Christakis made much the same point in her thoughtful if much maligned email at Yale, where she urged administrators to stop attempting to shape campus norms around Halloween costumes for the sake of students:

I don’t wish to trivialize genuine concerns about cultural and personal representation, and other challenges to our lived experience in a plural community. I know that many decent people have proposed guidelines on Halloween costumes from a spirit of avoiding hurt and offense. I laud those goals, in theory … But in practice, I wonder if we should reflect more transparently, as a community, on the consequences of an institutional (which is to say: bureaucratic and administrative) exercise of implied control over college students … we can have this discussion of costumes on many levels: we can talk about complex issues of identity, free speech, cultural appropriation, and virtue “signalling.” But I wanted to share my thoughts with you from a totally different angle, as an educator concerned with the developmental stages of childhood and young adulthood.

… Is there no room anymore for a child or young person to be a little bit obnoxious … a little bit inappropriate or provocative or, yes, offensive? American universities were once a safe space not only for maturation but also for a certain regressive, or even transgressive, experience; increasingly, it seems, they have become places of censure and prohibition. And the censure and prohibition come from above, not from yourselves!

Are we all okay with this transfer of power?

Have we lost faith in young people’s capacity—in your capacity—to exercise self-censure, through social norming, and also in your capacity to ignore or reject things that trouble you? We tend to view this shift from individual to institutional agency as a tradeoff between libertarian vs. liberal values (“liberal” in the American, not European sense of the word) … But—again, speaking as a child development specialist—I think there might be something missing in our discourse about the exercise of free speech on campus, and it is this: What does this debate about Halloween costumes say about our view of young adults, of their strength and judgment?

Jacobs, Christakis, and Pinker are inclined to treat college students like people who are either already capable of exercising autonomous judgments about their social lives, in a manner closer to adults than to children, or who had better be empowered to do so right away, in the relatively safe environment of undergraduate campuses, because they will need that skill in the real world.

But elite institutions of higher education—and many students who attend them, though not necessarily a majority—want to proceed with a very different model of residential life.

They reject the premise that what students do on their own time is their own business, because they might do things that upset, offend, or exclude others; they believe that administrators of residential life can enlighten students morally by way of imposing bureaucratic rules and structures, resulting in more inclusive, equal campuses. Perfecting campus life, or getting as close as possible, is a bigger focus than exposing students to the autonomy that they’ll be forced to navigate after graduation.

In sketching these visions, I do not want to imply that they present a binary choice, with radically polarized camps utterly refusing to draw on the best insights of the other approach. Christakis explicitly nodded to the good intentions of the bureaucrats, and the way she and her husband adjudicated their positions as “masters” of Silliman College illustrates their moderate view of college as an in-between space. And even the most heavy-handed progressive college administrator will pay lip service to the notion that college students benefit from exercises in autonomy like studying abroad in unfamiliar countries.

On the whole, however, I see a landscape of higher education where classical liberal values like free speech and due process have powerful defenders against the excesses of utopian progressives, but where there are no organized advocates of the proposition that institutions should largely butt out of the lives of undergraduates, so that they may develop and assert different values and learn to deal with the consequences.

About the Author

Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.
