32,000 students at the Nanterre campus of the University of Paris, but no student center, no bookstore, no student-run newspaper, no freshman orientation, no corporate recruiting system. The 480,000-volume central library is open only 10 hours a day, closed on Sundays and holidays. Only 30 of the library’s 100 computers have Internet access. (…) When the government proposed a reform in 2003 to streamline curriculums and budgets by allowing each university more flexibility and independence, students and professors rebelled. (…) While students are ready to protest against something they dislike, there is little sense of belonging or pride in one’s surroundings. During the recent protests over the contested labor law, that attitude of alienation contributed to the destruction of property, even computers and books, at some universities. NYT (2006)

How, you may ask, can anyone call higher education wasteful in an age when its financial payoff is greater than ever? The earnings premium for college graduates has rocketed to 73 percent—that is, those with a bachelor’s degree earn, on average, 73 percent more than those who have only a high-school diploma, up from about 50 percent in the late 1970s. The key issue, however, isn’t whether college pays, but why. The simple, popular answer is that schools teach students useful job skills. But this dodges puzzling questions. First and foremost: From kindergarten on, students spend thousands of hours studying subjects irrelevant to the modern labor market. Why do English classes focus on literature and poetry instead of business and technical writing? Why do advanced-math classes bother with proofs almost no student can follow? When will the typical student use history? Trigonometry? Art? Music? Physics? Latin? The class clown who snarks “What does this have to do with real life?” is onto something. The disconnect between college curricula and the job market has a banal explanation: Educators teach what they know—and most have as little firsthand knowledge of the modern workplace as I do. Yet this merely complicates the puzzle. If schools aim to boost students’ future income by teaching job skills, why do they entrust students’ education to people so detached from the real world? Because, despite the chasm between what students learn and what workers do, academic success is a strong signal of worker productivity. Suppose your law firm wants a summer associate. A law student with a doctorate in philosophy from Stanford applies. What do you infer? The applicant is probably brilliant, diligent, and willing to tolerate serious boredom. If you’re looking for that kind of worker—and what employer isn’t?—you’ll make an offer, knowing full well that nothing the philosopher learned at Stanford will be relevant to this job. 
The labor market doesn’t pay you for the useless subjects you master; it pays you for the preexisting traits you signal by mastering them. This is not a fringe idea. Michael Spence, Kenneth Arrow, and Joseph Stiglitz—all Nobel laureates in economics—made seminal contributions to the theory of educational signaling. Every college student who does the least work required to get good grades silently endorses the theory. But signaling plays almost no role in public discourse or policy making. As a society, we continue to push ever larger numbers of students into ever higher levels of education. The main effect is not better jobs or greater skill levels, but a credentialist arms race. Lest I be misinterpreted, I emphatically affirm that education confers some marketable skills, namely literacy and numeracy. Nonetheless, I believe that signaling accounts for at least half of college’s financial reward, and probably more. Most of the salary payoff for college comes from crossing the graduation finish line. Suppose you drop out after a year. You’ll receive a salary bump compared with someone who’s attended no college, but it won’t be anywhere near 25 percent of the salary premium you’d get for a four-year degree. Similarly, the premium for sophomore year is nowhere near 50 percent of the return on a bachelor’s degree, and the premium for junior year is nowhere near 75 percent of that return. Indeed, in the average study, senior year of college brings more than twice the pay increase of freshman, sophomore, and junior years combined. Unless colleges delay job training until the very end, signaling is practically the only explanation. This in turn implies a mountain of wasted resources—time and money that would be better spent preparing students for the jobs they’re likely to do. The conventional view—that education pays because students learn—assumes that the typical student acquires, and retains, a lot of knowledge. She doesn’t. 
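Caplan's sheepskin-effect arithmetic can be made concrete with a toy calculation. The cumulative premiums below are hypothetical numbers of my own choosing, not figures from the essay; they are chosen only to satisfy the proportions stated above (each of the first three years returns far less than a quarter of the full 73 percent premium, and senior year brings more than twice the gain of the first three years combined).

```python
# Toy illustration of the "sheepskin effect": most of the college wage
# premium arrives only upon graduation. All numbers are hypothetical.

full_premium = 0.73  # 73% earnings premium for a bachelor's degree

# Assumed cumulative premium after each completed year of college.
# Note the jump at year 4: crossing the "graduation finish line".
cumulative = {1: 0.05, 2: 0.10, 3: 0.16, 4: 0.73}

for year in range(1, 5):
    prior = cumulative.get(year - 1, 0.0)
    gain = cumulative[year] - prior
    share = cumulative[year] / full_premium
    print(f"Year {year}: +{gain:.0%} premium "
          f"({share:.0%} of the full degree premium)")

# Senior year alone adds 0.73 - 0.16 = 0.57, more than twice the 0.16
# gained in freshman, sophomore, and junior years combined.
```

If learning job skills drove the premium, the gains would accrue roughly linearly year by year; the discontinuity at graduation is what the signaling account predicts.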
Teachers often lament summer learning loss: Students know less at the end of summer than they did at the beginning. But summer learning loss is only a special case of the problem of fade-out: Human beings have trouble retaining knowledge they rarely use. Of course, some college graduates use what they’ve learned and thus hold on to it—engineers and other quantitative types, for example, retain a lot of math. But when we measure what the average college graduate recalls years later, the results are discouraging, to say the least. In 2003, the United States Department of Education gave about 18,000 Americans the National Assessment of Adult Literacy. The ignorance it revealed is mind-numbing. Fewer than a third of college graduates received a composite score of “proficient”—and about a fifth were at the “basic” or “below basic” level. You could blame the difficulty of the questions—until you read them. Plenty of college graduates couldn’t make sense of a table explaining how an employee’s annual health-insurance costs varied with income and family size, or summarize the work-experience requirements in a job ad, or even use a newspaper schedule to find when a television program ended. Tests of college graduates’ knowledge of history, civics, and science have had similarly dismal results. Of course, college students aren’t supposed to just download facts; they’re supposed to learn how to think in real life. How do they fare on this count? The most focused study of education’s effect on applied reasoning, conducted by Harvard’s David Perkins in the mid-1980s, assessed students’ oral responses to questions designed to measure informal reasoning, such as “Would a proposed law in Massachusetts requiring a five-cent deposit on bottles and cans significantly reduce litter?” The benefit of college seemed to be zero: Fourth-year students did no better than first-year students. Other evidence is equally discouraging. 
One researcher tested Arizona State University students’ ability to “apply statistical and methodological concepts to reasoning about everyday-life events.” (…) Educational psychologists have discovered that much of our knowledge is “inert.” Students who excel on exams frequently fail to apply their knowledge to the real world. Take physics. As the Harvard psychologist Howard Gardner writes, Students who receive honor grades in college-level physics courses are frequently unable to solve basic problems and questions encountered in a form slightly different from that on which they have been formally instructed and tested. The same goes for students of biology, mathematics, statistics, and, I’m embarrassed to say, economics. I try to teach my students to connect lectures to the real world and daily life. My exams are designed to measure comprehension, not memorization. Yet in a good class, four test-takers out of 40 demonstrate true economic understanding. (…) Indeed, today’s college students are less willing than those of previous generations to do the bare minimum of showing up for class and temporarily learning whatever’s on the test. Fifty years ago, college was a full-time job. The typical student spent 40 hours a week in class or studying. Effort has since collapsed across the board. “Full time” college students now average 27 hours of academic work a week—including just 14 hours spent studying. What are students doing with their extra free time? Having fun. (…) Arum and Roksa cite a study finding that students at one typical college spent 13 hours a week studying, 12 hours “socializing with friends,” 11 hours “using computers for fun,” eight hours working for pay, six hours watching TV, six hours exercising, five hours on “hobbies,” and three hours on “other forms of entertainment.” Grade inflation completes the idyllic package by shielding students from negative feedback. The average GPA is now 3.2. What does this mean for the individual student? 
Would I advise an academically well-prepared 18-year-old to skip college because she won’t learn much of value? Absolutely not. Studying irrelevancies for the next four years will impress future employers and raise her income potential. If she tried to leap straight into her first white-collar job, insisting, “I have the right stuff to graduate, I just choose not to,” employers wouldn’t believe her. To unilaterally curtail your education is to relegate yourself to a lower-quality pool of workers. For the individual, college pays. This does not mean, however, that higher education paves the way to general prosperity or social justice. When we look at countries around the world, a year of education appears to raise an individual’s income by 8 to 11 percent. By contrast, increasing education across a country’s population by an average of one year per person raises the national income by only 1 to 3 percent. In other words, education enriches individuals much more than it enriches nations. How is this possible? Credential inflation: As the average level of education rises, you need more education to convince employers you’re worthy of any specific job. One research team found that from the early 1970s through the mid‑1990s, the average education level within 500 occupational categories rose by 1.2 years. But most of the jobs didn’t change much over that span—there’s no reason, except credential inflation, why people should have needed more education to do them in 1995 than in 1975. What’s more, all American workers’ education rose by 1.5 years in that same span—which is to say that a great majority of the extra education workers received was deployed not to get better jobs, but to get jobs that had recently been held by people with less education. As credentials proliferate, so do failed efforts to acquire them. Students can and do pay tuition, kill a year, and flunk their finals. 
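The gap between private and national returns quoted above can be turned into a crude back-of-envelope estimate of how much of the private payoff buys only relative position. This is my own illustrative calculation, not Caplan's method; it simply combines the two ranges from the passage (8 to 11 percent individual, 1 to 3 percent national).

```python
# Back-of-envelope: if an extra year of schooling raises an individual's
# income ~8-11%, but raising a whole country's average schooling by a
# year raises national income only ~1-3%, how much of the private return
# is pure relative-position gain that cancels out in aggregate?

individual_return = (0.08, 0.11)  # range of private returns per year
national_return = (0.01, 0.03)    # range of national returns per year

# Bounds on the share of the private return not reflected in national
# income, i.e. the portion consistent with signaling.
signaling_share_low = 1 - national_return[1] / individual_return[0]   # ~62%
signaling_share_high = 1 - national_return[0] / individual_return[1]  # ~91%

print(f"Share of the private return that buys only relative position: "
      f"{signaling_share_low:.0%} to {signaling_share_high:.0%}")
```

Even the low end of this rough range is consistent with the claim that signaling accounts for at least half of college's financial reward.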
Any respectable verdict on the value of education must account for these academic bankruptcies. Failure rates are high, particularly for students with low high-school grades and test scores; all told, about 60 percent of full-time college students fail to finish in four years. Simply put, the push for broader college education has steered too many students who aren’t cut out for academic success onto the college track. The college-for-all mentality has fostered neglect of a realistic substitute: vocational education. It takes many guises—classroom training, apprenticeships and other types of on-the-job training, and straight-up work experience—but they have much in common. All vocational education teaches specific job skills, and all vocational education revolves around learning by doing, not learning by listening. Research, though a bit sparse, suggests that vocational education raises pay, reduces unemployment, and increases the rate of high-school completion. Defenders of traditional education often appeal to the obscurity of the future. What’s the point of prepping students for the economy of 2018, when they’ll be employed in the economy of 2025 or 2050? But ignorance of the future is no reason to prepare students for occupations they almost surely won’t have—and if we know anything about the future of work, we know that the demand for authors, historians, political scientists, physicists, and mathematicians will stay low. It’s tempting to say that students on the college track can always turn to vocational education as a Plan B, but this ignores the disturbing possibility that after they crash, they’ll be too embittered to go back and learn a trade. The vast American underclass shows that this disturbing possibility is already our reality. Education is so integral to modern life that we take it for granted. Young people have to leap through interminable academic hoops to secure their place in the adult world. 
My thesis, in a single sentence: Civilized societies revolve around education now, but there is a better—indeed, more civilized—way. If everyone had a college degree, the result would be not great jobs for all, but runaway credential inflation. Trying to spread success with education spreads education but not success. Bryan Caplan

“Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance that leaves adults and peers alike in awe. A friend who teaches at a top university once asked her class to memorize 30 lines of the eighteenth-century poet Alexander Pope. Nearly every single kid got every single line correct. It was a thing of wonder, she said, like watching thoroughbreds circle a track. These enviable youngsters appear to be the winners in the race we have made of childhood. But the reality is very different, as I have witnessed in many of my own students and heard from the hundreds of young people whom I have spoken with on campuses or who have written to me over the last few years. Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it. When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine. In short, our entire system of elite education. 
(…) I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice. (…) So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk. You have no margin for error, so you avoid the possibility that you will ever make an error. Once, a student at Pomona told me that she’d love to have a chance to think about the things she’s studying, only she doesn’t have the time. I asked her if she had ever considered not trying to get an A in every class. She looked at me as if I had made an indecent suggestion. (…) “Return on investment”: that’s the phrase you often hear today when people talk about college. What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? (…) At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much. There are exceptions, of course, but professors and students have largely entered into what one observer called a “nonaggression pact.” Students are regarded by the institution as “customers,” people to be pandered to instead of challenged. Professors are rewarded for research, so they want to spend as little time on their classes as they can. 
The profession’s whole incentive structure is biased against teaching, and the more prestigious the school, the stronger the bias is likely to be. The result is higher marks for shoddier work. It is true that today’s young people appear to be more socially engaged than kids have been for several decades and that they are more apt to harbor creative or entrepreneurial impulses. But it is also true, at least at the most selective schools, that even if those aspirations make it out of college—a big “if”—they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige. Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify. The New York Times reports that there is now a thriving sector devoted to producing essay-ready summers, but what strikes one is the superficiality of the activities involved: a month traveling around Italy studying the Renaissance, “a whole day” with a band of renegade artists. A whole day! I’ve noticed something similar when it comes to service. Why is it that people feel the need to go to places like Guatemala to do their projects of rescue or documentation, instead of Milwaukee or Arkansas? When students do stay in the States, why is it that so many head for New Orleans? Perhaps it’s no surprise, when kids are trained to think of service as something they are ultimately doing for themselves—that is, for their résumés. “Do well by doing good,” goes the slogan. How about just doing good? If there is one idea, above all, through which the concept of social responsibility is communicated at the most prestigious schools, it is “leadership.” “Harvard is for leaders,” goes the Cambridge cliché. To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society. 
But what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning. The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things. As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell. Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science. It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.” (…) The sign of the system’s alleged fairness is the set of policies that travel under the banner of “diversity.” And that diversity does indeed represent nothing less than a social revolution. Princeton, which didn’t even admit its first woman graduate student until 1961—a year in which a grand total of one (no doubt very lonely) African American matriculated at its college—is now half female and only about half white. But diversity of sex and race has become a cover for increasing economic resegregation. Elite colleges are still living off the moral capital they earned in the 1960s, when they took the genuinely courageous step of dismantling the mechanisms of the WASP aristocracy. The truth is that the meritocracy was never more than partial. 
Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals. Kids at schools like Stanford think that their environment is diverse if one comes from Missouri and another from Pakistan, or if one plays the cello and the other lacrosse. Never mind that all of their parents are doctors or bankers. That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies is working-class and rural whites, who are hardly present on selective campuses at all. (…) The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent. As of 2006, only about 15 percent of students at the most competitive schools came from the bottom half. The more prestigious the school, the more unequal its student body is apt to be. And public institutions are not much better than private ones. As of 2004, 40 percent of first-year students at the most selective state campuses came from families with incomes of more than $100,000, up from 32 percent just five years earlier. The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game. The more hurdles there are, the more expensive it is to catapult your kid across them. Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools. 
The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely. Today, fewer than half of high-scoring students from low-income families even enroll at four-year schools. The problem isn’t a shortage of qualified lower-income kids to choose from. Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to. And so it is hardly a coincidence that income inequality is higher than it has been since before the Great Depression, or that social mobility is lower in the United States than in almost every other developed country. Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it. Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing: not in the context of “service,” and not in the spirit of “making an effort,” either—swooping down on a member of the college support staff and offering to “buy them a coffee,” as a former Yalie once suggested, in order to “ask them about themselves.” Instead of service, how about service work? That’ll really give you insight into other people. How about waiting tables so that you can see how hard it is, physically and mentally? You really aren’t as smart as everyone has been telling you; you’re only smarter in a certain way. There are smart people who do not go to a prestigious college, or to any college—often precisely for reasons of class. 
There are smart people who are not “smart.” (…) U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive. (…) The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth. Of course, they have to stop cooperating with U.S. News. More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind. The changes must go deeper, though, than reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality. The problem is the Ivy League itself. 
We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first. The arrangement is great for the schools, but is Harvard’s desire for alumni donations a sufficient reason to perpetuate the class system? I used to think that we needed to create a world where every child had an equal chance to get to the Ivy League. I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education. High-quality public education, financed with public money, for the benefit of all: the exact commitment that drove the growth of public higher education in the postwar years. Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides. We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy. William Deresiewicz

The biggest problem is that the advice in Deresiewicz’s title is perversely wrongheaded. If your kid has survived the application ordeal and has been offered a place at an elite university, don’t punish her for the irrationalities of a system she did nothing to create; by all means send her there! The economist Caroline Hoxby has shown that selective universities spend twenty times more on student instruction, support, and facilities than less selective ones, while their students pay for a much smaller fraction of it, thanks to gifts to the college. Because of these advantages, it’s the selective institutions that are the real bargains in the university marketplace. Holding qualifications constant, graduates of a selective university are more likely to graduate on time, will tend to find a more desirable spouse, and will earn 20 percent more than those of less selective universities—every year for the rest of their working lives. These advantages swamp any differences in tuition and other expenses, which in any case are often lower than those of less selective schools because of more generous need-based financial aid. The Ivy admissions sweepstakes may be irrational, but the parents and teenagers who clamber to win it are not. Any rethinking of elite university admissions must begin with an inkling of the goals of a university education. As the song says, if you don’t know where you’re going, any road will take you there. One contributor to the admissions mess is that so few of a university’s thought leaders can say anything coherent about what those goals are. (…) It’s easy to agree with him that “the first thing that college is for is to teach you to think,” but much harder to figure out what that means. Deresiewicz knows what it does not mean—“the analytical and rhetorical skills that are necessary for success in business and the professions”—but this belletristic disdain for the real world is unhelpful. 
The skills necessary for success in the professions include organizing one’s thoughts so that they may be communicated clearly to others, breaking a complex problem into its components, applying general principles to specific cases, discerning cause and effect, and negotiating tradeoffs between competing values. In what rarefied ivory chateau do these skills not count as “thinking”? In its place Deresiewicz says only that learning to think consists of “contemplating things from a distance,” with no hint as to what that contemplation should consist of or where it should lead. This leads to Deresiewicz’s second goal, “building a self,” which he explicates as follows: “it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul.” Perhaps I am emblematic of everything that is wrong with elite American education, but I have no idea how to get my students to build a self or become a soul. (…) I heartily agree with Deresiewicz that high-quality postsecondary education is a public good which should be accessible to any citizen who can profit from it. At the same time, there are reasons for students to distribute themselves among colleges with different emphases and degrees of academic rigor. People vary in their innate and acquired intelligence, their taste for abstraction, their familiarity with literate culture, their priorities in life, and their personality traits relevant to learning. I could not offer a course in brain science or linguistic theory to a representative sample of the college-age population without baffling many students at one end and boring an equal number at the other. Also, students learn as much from their peers as from their professors, and benefit from a cohort with which they can bat around ideas. 
Not least, a vibrant research institution must bring smarter undergraduates into the fold, to challenge received wisdom, inject energy and innovation, and replenish its senescing membership. All this is to say that there are good reasons to have selective universities. The question is, How well are the Ivies fulfilling their mandate? After three stints teaching at Harvard spanning almost four decades, I am repeatedly astounded by the answer. Like many observers of American universities, I used to believe the following story. Once upon a time Harvard was a finishing school for the plutocracy, where preppies and Kennedy scions earned gentleman’s Cs while playing football, singing in choral groups, and male-bonding at final clubs, while the blackballed Jews at CCNY founded left-wing magazines and slogged away in labs that prepared them for their Nobel prizes in science. Then came Sputnik, the ’60s, and the decline of genteel racism and anti-Semitism, and Harvard had to retool itself as a meritocracy, whose best-and-brightest gifts to America would include recombinant DNA, Wall Street quants, The Simpsons, Facebook, and the masthead of The New Republic. This story has a grain of truth in it: Hoxby has documented that the academic standards for admission to elite universities have risen over the decades. But entrenched cultures die hard, and the ghost of Oliver Barrett IV still haunts every segment of the Harvard pipeline. At the admissions end, it’s common knowledge that Harvard selects at most 10 percent (some say 5 percent) of its students on the basis of academic merit. At an orientation session for new faculty, we were told that Harvard “wants to train the future leaders of the world, not the future academics of the world,” and that “We want to read about our student in Newsweek 20 years hence” (prompting the woman next to me to mutter, “Like the Unabomber”). 
The rest are selected “holistically,” based also on participation in athletics, the arts, charity, activism, travel, and, we inferred (Not in front of the children!), race, donations, and legacy status (since anything can be hidden behind the holistic fig leaf). The lucky students who squeeze through this murky bottleneck find themselves in an institution that is single-mindedly and expensively dedicated to the pursuit of knowledge. It has an astonishing library system that pays through the nose for rare manuscripts, obscure tomes, and extortionately priced journals; exotic laboratories at the frontiers of neuroscience, regenerative medicine, cosmology, and other thrilling pursuits; and a professoriate with erudition in an astonishing range of topics, including many celebrity teachers and academic rock stars. The benefits of matching this intellectual empyrean with the world’s smartest students are obvious. So why should an ability to play the bassoon or chuck a lacrosse ball be given any weight in the selection process? The answer, ironically enough, makes the admissocrats and Deresiewicz strange bedfellows: the fear of selecting a class of zombies, sheep, and grinds. But as with much in the Ivies’ admission policies, little thought has been given to the consequences of acting on this assumption. Jerome Karabel has unearthed a damning paper trail showing that in the first half of the twentieth century, holistic admissions were explicitly engineered to cap the number of Jewish students. Ron Unz, in an exposé even more scathing than Deresiewicz’s, has assembled impressive circumstantial evidence that the same thing is happening today with Asians. Just as troublingly, why are elite universities, of all institutions, perpetuating the destructive stereotype that smart people are one-dimensional dweebs? 
It would be an occasion for hilarity if anyone suggested that Harvard pick its graduate students, faculty, or president for their prowess in athletics or music, yet these people are certainly no shallower than our undergraduates. In any case, the stereotype is provably false. Camilla Benbow and David Lubinski have tracked a large sample of precocious teenagers identified solely by high performance on the SAT, and found that when they grew up, they not only excelled in academia, technology, medicine, and business, but won outsize recognition for their novels, plays, poems, paintings, sculptures, and productions in dance, music, and theater. A comparison to a Harvard freshman class would be like a match between the Harlem Globetrotters and the Washington Generals. What about the rationalization that charitable extracurricular activities teach kids important lessons of moral engagement? There are reasons to be skeptical. A skilled professional I know had to turn down an important freelance assignment because of a recurring commitment to chauffeur her son to a resumé-building “social action” assignment required by his high school. This involved driving the boy for 45 minutes to a community center, cooling her heels while he sorted used clothing for charity, and driving him back—forgoing income which, judiciously donated, could have fed, clothed, and inoculated an African village. The dubious “lessons” of this forced labor as an overqualified ragpicker are that children are entitled to treat their mothers’ time as worth nothing, that you can make the world a better place by destroying economic value, and that the moral worth of an action should be measured by the conspicuousness of the sacrifice rather than the gain to the beneficiary. Knowing how our students are selected, I should not have been surprised when I discovered how they treat their educational windfall once they get here. 
A few weeks into every semester, I face a lecture hall that is half-empty, despite the fact that I am repeatedly voted a Harvard Yearbook Favorite Professor, that the lectures are not video-recorded, and that they are the only source of certain material that will be on the exam. I don’t take it personally; it’s common knowledge that Harvard students stay away from lectures in droves, burning a fifty-dollar bill from their parents’ wallets every time they do. Obviously they’re not slackers; the reason is that they are crazy-busy. Since they’re not punching a clock at Safeway or picking up kids at day-care, what could they be doing that is more important than learning in class? The answer is that they are consumed by the same kinds of extracurricular activities that got them here in the first place. Some of these activities, like writing for the campus newspaper, are clearly educational, but most would be classified in any other setting as recreation: sports, dance, improv comedy, and music, music, music (many students perform in more than one ensemble). The commitments can be draconian: a member of the crew might pull an oar four hours a day, seven days a week, and musical ensembles can be just as demanding. Many students have told me that the camaraderie, teamwork, and sense of accomplishment made these activities their most important experiences at Harvard. But it’s not clear why they could not have had the same experiences at Tailgate State, or, for that matter, the local YMCA, opening up places for less “well-rounded” students who could take better advantage of the libraries, labs, and lectures. The anti-intellectualism of Ivy League undergraduate education is by no means indigenous to the student culture. It’s reinforced by the administration, which treats academics as just one option in the college activity list. 
Though students are flooded with hortatory messages from deans and counselors, “Don’t cut class” is not among them, and professors are commonly discouraged from getting in the way of the students’ fun. Deans have asked me not to schedule a midterm on a big party day, and to make it easy for students to sell their textbooks before the ink is dry on their final exams. A failing grade is like a death sentence: just the first step in a mandatory appeal process. It’s not that students are unconditionally pampered. They may be disciplined by an administrative board with medieval standards of jurisprudence, pressured to sign a kindness pledge suitable for kindergarten, muzzled by speech codes that would not pass the giggle test if challenged on First Amendment grounds, and publicly shamed for private emails that express controversial opinions. The common denominator (belying any hope that an elite university education helps students develop a self) is that they are not treated as competent grown-ups, starting with the first law of adulthood: first attend to your priorities, then you get to play. My third surprise was what happens to Harvard students at the other end of the pipeline: they get snatched up by the big consulting and investment firms, helping to explain that 73 percent boost in their expected earnings. Why, I wondered, do these cutthroat institutions hire rowers and baritones who know diddly-squat about business just because they have a transcript with the word “Veritas” on it? Wouldn’t they get more value by hiring the best finance major from Ohio State? I asked some people familiar with this world to explain what seemed to me like a massive market failure. They responded candidly. First, an Ivy degree is treated as a certification of intelligence and self-discipline. Apparently adding a few Harvard students to a team raises its average intelligence and makes it more effective at solving problems. 
That, the employers feel, is more valuable than specific knowledge, which smart people can pick up quickly in any case. Second, a little education can go a long way. As one business-school professor put it, “I have observed many smart people who have little idea of how to logically think through a problem, who infer causation from a correlation, and who use anecdotes as evidence far beyond the predictability warranted. Most of the undergrads who go to the consulting firms did take a course in social science, and much of this basic logic can be obtained there.” More disconcertingly, I was told that Ivy League graduates are a prestige good: having a lot of them in your firm is like wearing a Rolex or driving a Bentley. Also, if something goes wrong, your keister is covered. As they used to say about computers, “No one ever got fired for buying IBM.” Is this any way to run a meritocracy? Ivy admissions policies force teenagers and their mothers into a potlatch of conspicuous leisure and virtue. The winners go to an exorbitant summer camp, most of them indifferent to the outstanding facilities of scholarship and research that are bundled with it. They can afford this insouciance because the piece of paper they leave with serves as a quarter-million-dollar IQ and Marshmallow test. The self-fulfilling aura of prestige ensures that companies will overlook better qualified graduates of store-brand schools. And the size of the jackpot means that it’s rational for families to play this irrational game. What would it take to fix this wasteful and unjust system? Let’s daydream for a moment. If only we had some way to divine the suitability of a student for an elite education, without ethnic bias, undeserved advantages to the wealthy, or pointless gaming of the system. If only we had some way to match jobs with candidates that was not distorted by the halo of prestige. 
A sample of behavior that could be gathered quickly and cheaply, assessed objectively, and double-checked for its ability to predict the qualities we value…. We do have this magic measuring stick, of course: it’s called standardized testing. I suspect that a major reason we slid into this madness and can’t seem to figure out how to get out of it is that the American intelligentsia has lost the ability to think straight about objective tests. After all, if the Ivies admitted the highest-scoring kids at one end, and companies hired the highest-scoring graduates across all universities at the other (with tests that tap knowledge and skill as well as aptitude), many of the perversities of the current system would vanish overnight. Other industrialized countries, lacking our squeamishness about testing, pick their elite students this way, as do our firms in high technology. And as Adrian Wooldridge pointed out in these pages two decades ago, test-based selection used to be the enlightened policy among liberals and progressives, since it can level a hereditary caste system by favoring the Jenny Cavilleris (poor and smart) over the Oliver Barretts (rich and stupid). If, for various reasons, a university didn’t want a freshman class composed solely of scary-smart kids, there are simple ways to shake up the mixture. Unz suggests that Ivies fill a certain fraction of the incoming class with the highest-scoring applicants, and select the remainder from among the qualified applicant pool by lottery. One can imagine various numerical tweaks, including ones that pull up the number of minorities or legacies to the extent that those goals can be publicly justified. Grades or class rank could also be folded into the calculation. Details aside, it’s hard to see how a simple, transparent, and objective formula would be worse than the eye-of-newt-wing-of-bat mysticism that jerks teenagers and their moms around and conceals unknown mischief. 
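Unz’s proposed rule is concrete enough to sketch in a few lines. The Python below is a minimal illustration of the merit-plus-lottery idea only: the 50 percent merit fraction, the qualifying score bar of 1200, and the SAT-style scores are all invented for the example, not drawn from Unz’s proposal or any university’s practice.

```python
import random

def admit(applicants, class_size, merit_fraction=0.5, qualifying_bar=1200, seed=0):
    """Sketch of a merit-plus-lottery admissions rule.

    Fill a fixed fraction of the class with the highest-scoring
    applicants, then fill the remaining seats by lottery from everyone
    else who clears a qualifying bar. `applicants` is a list of
    (name, score) pairs; all numerical parameters are illustrative.
    """
    rng = random.Random(seed)  # fixed seed so the lottery is reproducible
    ranked = sorted(applicants, key=lambda a: a[1], reverse=True)

    # Seats filled purely on academic merit.
    n_merit = int(class_size * merit_fraction)
    merit_admits = ranked[:n_merit]

    # Everyone else above the bar enters the lottery for the rest.
    pool = [a for a in ranked[n_merit:] if a[1] >= qualifying_bar]
    n_lottery = min(class_size - n_merit, len(pool))
    lottery_admits = rng.sample(pool, n_lottery)

    return merit_admits + lottery_admits
```

The transparency Pinker praises is visible in the code itself: the whole policy fits in a dozen lines, and “numerical tweaks” (a different merit fraction, a weighted lottery) would each be a one-line change.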
So why aren’t creative alternatives like this even on the table? A major reason is that popular writers like Stephen Jay Gould and Malcolm Gladwell, pushing a leftist or heart-above-head egalitarianism, have poisoned their readers against aptitude testing. They have insisted that the tests don’t predict anything, or that they do but only up to a limited point on the scale, or that they do but only because affluent parents can goose their children’s scores by buying them test-prep courses. (…) Paul Sackett and his collaborators have shown that SAT scores predict future university grades, holding all else constant, whereas parental SES does not. Matt McGue has shown, moreover, that adolescents’ test scores track the SES only of their biological parents, not (for adopted kids) of their adoptive parents, suggesting that the tracking reflects shared genes, not economic privilege. (…) After first denying that we have ever tried meritocracy, Deresiewicz concludes by saying that we have tried it, and now should try “democracy” instead, by which he seems to mean a world in which the distribution of incomes of Ivy League families would be identical to that of the country as a whole. But as long as the correlation between wealth and aptitude is not zero, that goal is neither possible nor desirable. Still, he’s right that the current system is harmful and unfair. What he could have said is that elite universities are nothing close to being meritocracies. We know that because they don’t admit most of their students on the basis of academic aptitude. And perhaps that’s what we should try next. Steven Pinker

In the spring of 2008, I did a daylong stint on the Yale admissions committee. We—that is, three admissions staff, a member of the college dean’s office, and me, the faculty representative—were going through submissions from eastern Pennsylvania. The applicants had been assigned a score from one to four, calculated from a string of figures and codes—SATs, GPA, class rank, numerical scores to which the letters of recommendation had been converted, special notations for legacies and diversity cases. The ones had already been admitted, and the threes and fours could get in only under special conditions—if they were a nationally ranked athlete, for instance, or a “DevA” (an applicant in the highest category of “development” cases, which means a child of very rich donors). Our task for the day was to adjudicate among the twos. Huge bowls of junk food were stationed at the side of the room to keep our energy up.

The junior officer in charge, a young man who looked to be about 30, presented each case, rat-a-tat-tat, in a blizzard of admissions jargon that I had to pick up on the fly. “Good rig”: the transcript exhibits a good degree of academic rigor. “Ed level 1”: parents have an educational level no higher than high school, indicating a genuine hardship case. “MUSD”: a musician in the highest category of promise. Kids who had five or six items on their list of extracurriculars—the “brag”—were already in trouble, because that wasn’t nearly enough. We listened, asked questions, dove into a letter or two, then voted up or down.

With so many accomplished applicants to choose from, we were looking for kids with something special, “PQs”—personal qualities—that were often revealed by the letters or essays. Kids who only had the numbers and the résumé were usually rejected: “no spark,” “not a team-builder,” “this is pretty much in the middle of the fairway for us.” One young person, who had piled up a truly insane quantity of extracurriculars and who submitted nine letters of recommendation, was felt to be “too intense.” On the other hand, the numbers and the résumé were clearly indispensable. I’d been told that successful applicants could either be “well-rounded” or “pointy”—outstanding in one particular way—but if they were pointy, they had to be really pointy: a musician whose audition tape had impressed the music department, a scientist who had won a national award.

“Super People,” the writer James Atlas has called them—the stereotypical ultra-high-achieving elite college students of today. A double major, a sport, a musical instrument, a couple of foreign languages, service work in distant corners of the globe, a few hobbies thrown in for good measure: They have mastered them all, and with a serene self-assurance that leaves adults and peers alike in awe. A friend who teaches at a top university once asked her class to memorize 30 lines of the eighteenth-century poet Alexander Pope. Nearly every single kid got every single line correct. It was a thing of wonder, she said, like watching thoroughbreds circle a track.

These enviable youngsters appear to be the winners in the race we have made of childhood. But the reality is very different, as I have witnessed in many of my own students and heard from the hundreds of young people whom I have spoken with on campuses or who have written to me over the last few years. Our system of elite education manufactures young people who are smart and talented and driven, yes, but also anxious, timid, and lost, with little intellectual curiosity and a stunted sense of purpose: trapped in a bubble of privilege, heading meekly in the same direction, great at what they’re doing but with no idea why they’re doing it.

When I speak of elite education, I mean prestigious institutions like Harvard or Stanford or Williams as well as the larger universe of second-tier selective schools, but I also mean everything that leads up to and away from them—the private and affluent public high schools; the ever-growing industry of tutors and consultants and test-prep courses; the admissions process itself, squatting like a dragon at the entrance to adulthood; the brand-name graduate schools and employment opportunities that come after the B.A.; and the parents and communities, largely upper-middle class, who push their children into the maw of this machine. In short, our entire system of elite education.

I should say that this subject is very personal for me. Like so many kids today, I went off to college like a sleepwalker. You chose the most prestigious place that let you in; up ahead were vaguely understood objectives: status, wealth—“success.” What it meant to actually get an education and why you might want one—all this was off the table. It was only after 24 years in the Ivy League—college and a Ph.D. at Columbia, ten years on the faculty at Yale—that I started to think about what this system does to kids and how they can escape from it, what it does to our society and how we can dismantle it.

A young woman from another school wrote me this about her boyfriend at Yale:

Before he started college, he spent most of his time reading and writing short stories. Three years later, he’s painfully insecure, worrying about things my public-educated friends don’t give a second thought to, like the stigma of eating lunch alone and whether he’s “networking” enough. No one but me knows he fakes being well-read by thumbing through the first and last chapters of any book he hears about and obsessively devouring reviews in lieu of the real thing. He does this not because he’s incurious, but because there’s a bigger social reward for being able to talk about books than for actually reading them.

I taught many wonderful young people during my years in the Ivy League—bright, thoughtful, creative kids whom it was a pleasure to talk with and learn from. But most of them seemed content to color within the lines that their education had marked out for them. Very few were passionate about ideas. Very few saw college as part of a larger project of intellectual discovery and development. Everyone dressed as if they were ready to be interviewed at a moment’s notice.

Look beneath the façade of seamless well-adjustment, and what you often find are toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation. A large-scale survey of college freshmen recently found that self-reports of emotional well-being have fallen to their lowest level in the study’s 25-year history.

So extreme are the admission standards now that kids who manage to get into elite colleges have, by definition, never experienced anything but success. The prospect of not being successful terrifies them, disorients them. The cost of falling short, even temporarily, becomes not merely practical, but existential. The result is a violent aversion to risk. You have no margin for error, so you avoid the possibility that you will ever make an error. Once, a student at Pomona told me that she’d love to have a chance to think about the things she’s studying, only she doesn’t have the time. I asked her if she had ever considered not trying to get an A in every class. She looked at me as if I had made an indecent suggestion.

There are exceptions, kids who insist, against all odds, on trying to get a real education. But their experience tends to make them feel like freaks. One student told me that a friend of hers had left Yale because she found the school “stifling to the parts of yourself that you’d call a soul.”

“Return on investment”: that’s the phrase you often hear today when people talk about college. What no one seems to ask is what the “return” is supposed to be. Is it just about earning more money? Is the only purpose of an education to enable you to get a job? What, in short, is college for?

The first thing that college is for is to teach you to think. That doesn’t simply mean developing the mental skills particular to individual disciplines. College is an opportunity to stand outside the world for a few years, between the orthodoxy of your family and the exigencies of career, and contemplate things from a distance.

Learning how to think is only the beginning, though. There’s something in particular you need to think about: building a self. The notion may sound strange. “We’ve taught them,” David Foster Wallace once said, “that a self is something you just have.” But it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul. The job of college is to assist you to begin to do that. Books, ideas, works of art and thought, the pressure of the minds around you that are looking for their own answers in their own ways.

College is not the only chance to learn to think, but it is the best. One thing is certain: If you haven’t started by the time you finish your B.A., there’s little likelihood you’ll do it later. That is why an undergraduate experience devoted exclusively to career preparation is four years largely wasted.

Elite schools like to boast that they teach their students how to think, but all they mean is that they train them in the analytic and rhetorical skills that are necessary for success in business and the professions. Everything is technocratic—the development of expertise—and everything is ultimately justified in technocratic terms.

Religious colleges—even obscure, regional schools that no one has ever heard of on the coasts—often do a much better job in that respect. What an indictment of the Ivy League and its peers: that colleges four levels down on the academic totem pole, enrolling students whose SAT scores are hundreds of points lower than theirs, deliver a better education, in the highest sense of the word.

At least the classes at elite schools are academically rigorous, demanding on their own terms, no? Not necessarily. In the sciences, usually; in other disciplines, not so much. There are exceptions, of course, but professors and students have largely entered into what one observer called a “nonaggression pact.” Students are regarded by the institution as “customers,” people to be pandered to instead of challenged. Professors are rewarded for research, so they want to spend as little time on their classes as they can. The profession’s whole incentive structure is biased against teaching, and the more prestigious the school, the stronger the bias is likely to be. The result is higher marks for shoddier work.

It is true that today’s young people appear to be more socially engaged than kids have been for several decades and that they are more apt to harbor creative or entrepreneurial impulses. But it is also true, at least at the most selective schools, that even if those aspirations make it out of college—a big “if”—they tend to be played out within the same narrow conception of what constitutes a valid life: affluence, credentials, prestige.

Experience itself has been reduced to instrumental function, via the college essay. From learning to commodify your experiences for the application, the next step has been to seek out experiences in order to have them to commodify. The New York Times reports that there is now a thriving sector devoted to producing essay-ready summers, but what strikes one is the superficiality of the activities involved: a month traveling around Italy studying the Renaissance, “a whole day” with a band of renegade artists. A whole day!

I’ve noticed something similar when it comes to service. Why is it that people feel the need to go to places like Guatemala to do their projects of rescue or documentation, instead of Milwaukee or Arkansas? When students do stay in the States, why is it that so many head for New Orleans? Perhaps it’s no surprise, when kids are trained to think of service as something they are ultimately doing for themselves—that is, for their résumés. “Do well by doing good,” goes the slogan. How about just doing good?

If there is one idea, above all, through which the concept of social responsibility is communicated at the most prestigious schools, it is “leadership.” “Harvard is for leaders,” goes the Cambridge cliché. To be a high-achieving student is to constantly be urged to think of yourself as a future leader of society. But what these institutions mean by leadership is nothing more than getting to the top. Making partner at a major law firm or becoming a chief executive, climbing the greasy pole of whatever hierarchy you decide to attach yourself to. I don’t think it occurs to the people in charge of elite colleges that the concept of leadership ought to have a higher meaning, or, really, any meaning.

The irony is that elite students are told that they can be whatever they want, but most of them end up choosing to be one of a few very similar things. As of 2010, about a third of graduates went into finance or consulting at a number of top schools, including Harvard, Princeton, and Cornell. Whole fields have disappeared from view: the clergy, the military, electoral politics, even academia itself, for the most part, including basic science. It’s considered glamorous to drop out of a selective college if you want to become the next Mark Zuckerberg, but ludicrous to stay in to become a social worker. “What Wall Street figured out,” as Ezra Klein has put it, “is that colleges are producing a large number of very smart, completely confused graduates. Kids who have ample mental horsepower, an incredible work ethic and no idea what to do next.”

For the most selective colleges, this system is working very well indeed. Application numbers continue to swell, endowments are robust, tuition hikes bring ritual complaints but no decline in business. Whether it is working for anyone else is a different question.

It almost feels ridiculous to have to insist that colleges like Harvard are bastions of privilege, where the rich send their children to learn to walk, talk, and think like the rich. Don’t we already know this? They aren’t called elite colleges for nothing. But apparently we like pretending otherwise. We live in a meritocracy, after all.

The sign of the system’s alleged fairness is the set of policies that travel under the banner of “diversity.” And that diversity does indeed represent nothing less than a social revolution. Princeton, which didn’t even admit its first woman graduate student until 1961—a year in which a grand total of one (no doubt very lonely) African American matriculated at its college—is now half female and only about half white. But diversity of sex and race has become a cover for increasing economic resegregation. Elite colleges are still living off the moral capital they earned in the 1960s, when they took the genuinely courageous step of dismantling the mechanisms of the WASP aristocracy.

The truth is that the meritocracy was never more than partial. Visit any elite campus across our great nation, and you can thrill to the heart-warming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals. Kids at schools like Stanford think that their environment is diverse if one comes from Missouri and another from Pakistan, or if one plays the cello and the other lacrosse. Never mind that all of their parents are doctors or bankers.

That doesn’t mean there aren’t a few exceptions, but that is all they are. In fact, the group that is most disadvantaged by our current admissions policies is working-class and rural whites, who are hardly present on selective campuses at all. The only way to think these places are diverse is if that’s all you’ve ever seen.

Let’s not kid ourselves: The college admissions game is not primarily about the lower and middle classes seeking to rise, or even about the upper-middle class attempting to maintain its position. It is about determining the exact hierarchy of status within the upper-middle class itself. In the affluent suburbs and well-heeled urban enclaves where this game is principally played, it is not about whether you go to an elite school. It’s about which one you go to. It is Penn versus Tufts, not Penn versus Penn State. It doesn’t matter that a bright young person can go to Ohio State, become a doctor, settle in Dayton, and make a very good living. Such an outcome is simply too horrible to contemplate.

This system is exacerbating inequality, retarding social mobility, perpetuating privilege, and creating an elite that is isolated from the society that it’s supposed to lead. The numbers are undeniable. In 1985, 46 percent of incoming freshmen at the 250 most selective colleges came from the top quarter of the income distribution. By 2000, it was 55 percent. As of 2006, only about 15 percent of students at the most competitive schools came from the bottom half. The more prestigious the school, the more unequal its student body is apt to be. And public institutions are not much better than private ones. As of 2004, 40 percent of first-year students at the most selective state campuses came from families with incomes of more than $100,000, up from 32 percent just five years earlier.

The major reason for the trend is clear. Not increasing tuition, though that is a factor, but the ever-growing cost of manufacturing children who are fit to compete in the college admissions game. The more hurdles there are, the more expensive it is to catapult your kid across them. Wealthy families start buying their children’s way into elite colleges almost from the moment they are born: music lessons, sports equipment, foreign travel (“enrichment” programs, to use the all-too-perfect term)—most important, of course, private-school tuition or the costs of living in a place with top-tier public schools. The SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely. Today, fewer than half of high-scoring students from low-income families even enroll at four-year schools.

The problem isn’t that there aren’t more qualified lower-income kids from which to choose. Elite private colleges will never allow their students’ economic profile to mirror that of society as a whole. They can’t afford to—they need a critical mass of full payers and they need to tend to their donor base—and it’s not even clear that they’d want to.

And so it is hardly a coincidence that income inequality is higher than it has been since before the Great Depression, or that social mobility is lower in the United States than in almost every other developed country. Elite colleges are not just powerless to reverse the movement toward a more unequal society; their policies actively promote it.

Is there anything that I can do, a lot of young people have written to ask me, to avoid becoming an out-of-touch, entitled little shit? I don’t have a satisfying answer, short of telling them to transfer to a public university. You cannot cogitate your way to sympathy with people of different backgrounds, still less to knowledge of them. You need to interact with them directly, and it has to be on an equal footing: not in the context of “service,” and not in the spirit of “making an effort,” either—swooping down on a member of the college support staff and offering to “buy them a coffee,” as a former Yalie once suggested, in order to “ask them about themselves.”

Instead of service, how about service work? That’ll really give you insight into other people. How about waiting tables so that you can see how hard it is, physically and mentally? You really aren’t as smart as everyone has been telling you; you’re only smarter in a certain way. There are smart people who do not go to a prestigious college, or to any college—often precisely for reasons of class. There are smart people who are not “smart.”

I am under no illusion that it doesn’t matter where you go to college. But there are options. There are still very good public universities in every region of the country. The education is often impersonal, but the student body is usually genuinely diverse in terms of socioeconomic background, with all of the invaluable experiential learning that implies.

U.S. News and World Report supplies the percentage of freshmen at each college who finished in the highest 10 percent of their high school class. Among the top 20 universities, the number is usually above 90 percent. I’d be wary of attending schools like that. Students determine the level of classroom discussion; they shape your values and expectations, for good and ill. It’s partly because of the students that I’d warn kids away from the Ivies and their ilk. Kids at less prestigious schools are apt to be more interesting, more curious, more open, and far less entitled and competitive.

If there is anywhere that college is still college—anywhere that teaching and the humanities are still accorded pride of place—it is the liberal arts college. Such places are small, which is not for everyone, and they’re often fairly isolated, which is also not for everyone. The best option of all may be the second-tier—not second-rate—colleges, like Reed, Kenyon, Wesleyan, Sewanee, Mount Holyoke, and others. Instead of trying to compete with Harvard and Yale, these schools have retained their allegiance to real educational values.

Not being an entitled little shit is an admirable goal. But in the end, the deeper issue is the situation that makes it so hard to be anything else. The time has come, not simply to reform that system top to bottom, but to plot our exit to another kind of society altogether.

The education system has to act to mitigate the class system, not reproduce it. Affirmative action should be based on class instead of race, a change that many have been advocating for years. Preferences for legacies and athletes ought to be discarded. SAT scores should be weighted to account for socioeconomic factors. Colleges should put an end to résumé-stuffing by imposing a limit on the number of extracurriculars that kids can list on their applications. They ought to place more value on the kind of service jobs that lower-income students often take in high school and that high achievers almost never do. They should refuse to be impressed by any opportunity that was enabled by parental wealth. Of course, they have to stop cooperating with U.S. News.

More broadly, they need to rethink their conception of merit. If schools are going to train a better class of leaders than the ones we have today, they’re going to have to ask themselves what kinds of qualities they need to promote. Selecting students by GPA or the number of extracurriculars more often benefits the faithful drudge than the original mind.

The changes must go deeper, though, than reforming the admissions process. That might address the problem of mediocrity, but it won’t address the greater one of inequality. The problem is the Ivy League itself. We have contracted the training of our leadership class to a set of private institutions. However much they claim to act for the common good, they will always place their interests first. The arrangement is great for the schools, but is Harvard’s desire for alumni donations a sufficient reason to perpetuate the class system?

I used to think that we needed to create a world where every child had an equal chance to get to the Ivy League. I’ve come to see that what we really need is to create one where you don’t have to go to the Ivy League, or any private college, to get a first-rate education.

High-quality public education, financed with public money, for the benefit of all: the exact commitment that drove the growth of public higher education in the postwar years. Everybody gets an equal chance to go as far as their hard work and talent will take them—you know, the American dream. Everyone who wants it gets to have the kind of mind-expanding, soul-enriching experience that a liberal arts education provides. We recognize that free, quality K–12 education is a right of citizenship. We also need to recognize—as we once did and as many countries still do—that the same is true of higher education. We have tried aristocracy. We have tried meritocracy. Now it’s time to try democracy.

See also:

The Trouble With Harvard
The Ivy League is broken and only standardized tests can fix it
Steven Pinker
The New Republic
September 5, 2014

The most-read article in the history of this magazine is not about war, politics, or great works of art. It’s about the admissions policies of a handful of elite universities, most prominently my employer, Harvard, which is figuratively and literally immolated on the cover.

It’s not surprising that William Deresiewicz’s “Don’t Send Your Kid to the Ivy League” has touched a nerve. Admission to the Ivies is increasingly seen as the bottleneck to a pipeline that feeds a trickle of young adults into the remaining lucrative sectors of our financialized, winner-take-all economy. And their capricious and opaque criteria have set off an arms race of credential mongering that is immiserating the teenagers and parents (in practice, mostly mothers) of the upper middle class.

Deresiewicz writes engagingly about the wacky ways of elite university admissions, and he deserves credit for opening a debate on policies which have been shrouded in Victorian daintiness and bureaucratic obfuscation. Unfortunately, his article is a poor foundation for diagnosing and treating the illness. Long on dogmatic assertion and short on objective analysis, the article is driven by a literarism which exalts bohemian authenticity over worldly success and analytical brainpower. And his grapeshot inflicts a lot of collateral damage while sparing the biggest pachyderms in the parlor.

We can begin with his defamation of the students of elite universities. Like countless graybeards before him, Deresiewicz complains that the kids today are just no good: they are stunted, meek, empty, incurious zombies; faithful drudges; excellent sheep; and, in a flourish he uses twice, “out-of-touch, entitled little shits.” I have spent my career interacting with these students, and do not recognize the targets of this purple invective. Nor does Deresiewicz present any reason to believe that the 18-year-olds of today’s Ivies are more callow or unsure of their lives than the 18-year-olds of yesterday’s Ivies, the non-Ivies, or the country at large.

The charges on which Deresiewicz indicts students are trumped-up. He waxes sarcastic that they try to get an A in every class (would he advise them to turn in shoddy work in his course, or in some other professor’s?); that they don’t read every page of every book they pick up, or of every book whose review they have read (confession: neither do I); that they seek affluence, success, and prestigious careers (better they should smoke weed and play video games on their parents’ couches?); that they “superficially” spend no more than “A whole day!” with renegade artists (and if they spent two days with them?).

The only mitigation that Deresiewicz allows his young defendants is that they suffer from “toxic levels of fear, anxiety, and depression, of emptiness and aimlessness and isolation.” But the survey he alludes to simply found that about half of today’s college students rate themselves “above average” in emotional health, compared to more than 60 percent in 1985. Perhaps we should be impressed that fewer students today are victims of the Lake Wobegon fallacy! More to the point, the data don’t show that Ivy League students are worse off than their non-Ivy peers, and if anything they point in the opposite direction: the students at private universities are more sanguine about their emotional health than those at the public universities and four-year colleges that Deresiewicz romanticizes.

It’s true that many off-brand institutions in the matchless American university system are bargains. The honors program of a 50,000-student campus is likely to have an aggregation of talent that rivals that of the Ivies. Liberal-arts colleges in the boondocks, with their paucity of non-academic diversions, can nurture a student culture that is more engaged with ideas and books. The PhD glut has sent brilliant scientists and humanists into every outpost of the academic archipelago. And in many fields the best programs are at lesser-known universities, which can nimbly expand into new intellectual frontiers while their Ivy League counterparts, stultified by tradition and cushioned by reputation, become backwaters.

Still, there are no grounds for the sweeping pronouncements about the virtues of non-Ivy students (“more interesting, more curious, more open, and far less entitled and competitive”) that Deresiewicz prestidigitates out of thin air. It’s these schools, after all, that are famous for their jocks, stoners, Bluto Blutarskys, gut-course-hunters, term-paper-downloaders, and majors in such intellectually challenging fields as communications, marketing, and sports management. In another use of the argument “If I say it, it’s true,” Deresiewicz decrees that obscure religious colleges “do a much better job” in teaching their students “how to think,” and that they “deliver a better education, in the highest sense of the word” than elite universities—and then, breathtakingly, elevates an assertion that was based on nothing but his say-so (and that is almost certainly false) into an “indictment of the Ivy League and his peers.”

But the biggest problem is that the advice in Deresiewicz’s title is perversely wrongheaded. If your kid has survived the application ordeal and has been offered a place at an elite university, don’t punish her for the irrationalities of a system she did nothing to create; by all means send her there! The economist Caroline Hoxby has shown that selective universities spend twenty times more on student instruction, support, and facilities than less selective ones, while their students pay for a much smaller fraction of it, thanks to gifts to the college. Because of these advantages, it’s the selective institutions that are the real bargains in the university marketplace. Holding qualifications constant, graduates of a selective university are more likely to graduate on time, will tend to find a more desirable spouse, and will earn 20 percent more than those of less selective universities—every year for the rest of their working lives. These advantages swamp any differences in tuition and other expenses, which in any case are often lower than those of less selective schools because of more generous need-based financial aid. The Ivy admissions sweepstakes may be irrational, but the parents and teenagers who clamber to win it are not.
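The arithmetic behind "these advantages swamp any differences in tuition" is easy to check with a toy calculation (all figures below are hypothetical round numbers for illustration, not Hoxby's data):

```python
# Toy back-of-the-envelope comparison; every figure is an illustrative assumption.
base_salary = 60_000     # hypothetical salary for a less-selective-school graduate
premium = 0.20           # the 20 percent earnings edge cited above
years = 40               # a working life
extra_tuition = 100_000  # hypothetical extra four-year cost (often lower after aid)

# A constant 20% premium on the base salary, summed over a career.
extra_earnings = base_salary * premium * years
print(extra_earnings)    # prints 480000.0 -- nearly five times the tuition gap
```

Even with deliberately conservative assumptions, the cumulative premium dwarfs any plausible difference in sticker price, which is the sense in which the sweepstakes is rational for families.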

Any rethinking of elite university admissions must begin with an inkling of the goals of a university education. As the song says, if you don’t know where you’re going, any road will take you there. One contributor to the admissions mess is that so few of a university’s thought leaders can say anything coherent about what those goals are. Deresiewicz’s fumbling attempt is typical.

It’s easy to agree with him that “the first thing that college is for is to teach you to think,” but much harder to figure out what that means. Deresiewicz knows what it does not mean—“the analytical and rhetorical skills that are necessary for success in business and the professions”—but this belletristic disdain for the real world is unhelpful. The skills necessary for success in the professions include organizing one’s thoughts so that they may be communicated clearly to others, breaking a complex problem into its components, applying general principles to specific cases, discerning cause and effect, and negotiating tradeoffs between competing values. In what rarefied ivory chateau do these skills not count as “thinking”? In its place Deresiewicz says only that learning to think consists of “contemplating things from a distance,” with no hint as to what that contemplation should consist of or where it should lead.

This leads to Deresiewicz’s second goal, “building a self,” which he explicates as follows: “it is only through the act of establishing communication between the mind and the heart, the mind and experience, that you become an individual, a unique being—a soul.” Perhaps I am emblematic of everything that is wrong with elite American education, but I have no idea how to get my students to build a self or become a soul. It isn’t taught in graduate school, and in the hundreds of faculty appointments and promotions I have participated in, we’ve never evaluated a candidate on how well he or she could accomplish it. I submit that if “building a self” is the goal of a university education, you’re going to be reading anguished articles about how the universities are failing at it for a long, long time.

I think we can be more specific. It seems to me that educated people should know something about the 13-billion-year prehistory of our species and the basic laws governing the physical and living world, including our bodies and brains. They should grasp the timeline of human history from the dawn of agriculture to the present. They should be exposed to the diversity of human cultures, and the major systems of belief and value with which they have made sense of their lives. They should know about the formative events in human history, including the blunders we can hope not to repeat. They should understand the principles behind democratic governance and the rule of law. They should know how to appreciate works of fiction and art as sources of aesthetic pleasure and as impetuses to reflect on the human condition.

On top of this knowledge, a liberal education should make certain habits of rationality second nature. Educated people should be able to express complex ideas in clear writing and speech. They should appreciate that objective knowledge is a precious commodity, and know how to distinguish vetted fact from superstition, rumor, and unexamined conventional wisdom. They should know how to reason logically and statistically, avoiding the fallacies and biases to which the untutored human mind is vulnerable. They should think causally rather than magically, and know what it takes to distinguish causation from correlation and coincidence. They should be acutely aware of human fallibility, most notably their own, and appreciate that people who disagree with them are not stupid or evil. Accordingly, they should appreciate the value of trying to change minds by persuasion rather than intimidation or demagoguery.

I believe (and believe I can persuade you) that the more deeply a society cultivates this knowledge and mindset, the more it will flourish. The conviction that they are teachable gets me out of bed in the morning. Laying the foundations in just four years is a formidable challenge. If on top of all this, students want to build a self, they can do it on their own time.

I heartily agree with Deresiewicz that high-quality postsecondary education is a public good which should be accessible to any citizen who can profit from it. At the same time, there are reasons for students to distribute themselves among colleges with different emphases and degrees of academic rigor. People vary in their innate and acquired intelligence, their taste for abstraction, their familiarity with literate culture, their priorities in life, and their personality traits relevant to learning. I could not offer a course in brain science or linguistic theory to a representative sample of the college-age population without baffling many students at one end and boring an equal number at the other. Also, students learn as much from their peers as from their professors, and benefit from a cohort with which they can bat around ideas. Not least, a vibrant research institution must bring smarter undergraduates into the fold, to challenge received wisdom, inject energy and innovation, and replenish its senescing membership.

All this is to say that there are good reasons to have selective universities. The question is, How well are the Ivies fulfilling their mandate? After three stints teaching at Harvard spanning almost four decades, I am repeatedly astounded by the answer.

Like many observers of American universities, I used to believe the following story. Once upon a time Harvard was a finishing school for the plutocracy, where preppies and Kennedy scions earned gentleman’s Cs while playing football, singing in choral groups, and male-bonding at final clubs, while the blackballed Jews at CCNY founded left-wing magazines and slogged away in labs that prepared them for their Nobel prizes in science. Then came Sputnik, the ’60s, and the decline of genteel racism and anti-Semitism, and Harvard had to retool itself as a meritocracy, whose best-and-brightest gifts to America would include recombinant DNA, Wall Street quants, The Simpsons, Facebook, and the masthead of The New Republic.

This story has a grain of truth in it: Hoxby has documented that the academic standards for admission to elite universities have risen over the decades. But entrenched cultures die hard, and the ghost of Oliver Barrett IV still haunts every segment of the Harvard pipeline.

At the admissions end, it’s common knowledge that Harvard selects at most 10 percent (some say 5 percent) of its students on the basis of academic merit. At an orientation session for new faculty, we were told that Harvard “wants to train the future leaders of the world, not the future academics of the world,” and that “We want to read about our student in Newsweek 20 years hence” (prompting the woman next to me to mutter, “Like the Unabomber”). The rest are selected “holistically,” based also on participation in athletics, the arts, charity, activism, travel, and, we inferred (Not in front of the children!), race, donations, and legacy status (since anything can be hidden behind the holistic fig leaf).

The lucky students who squeeze through this murky bottleneck find themselves in an institution that is single-mindedly and expensively dedicated to the pursuit of knowledge. It has an astonishing library system that pays through the nose for rare manuscripts, obscure tomes, and extortionately priced journals; exotic laboratories at the frontiers of neuroscience, regenerative medicine, cosmology, and other thrilling pursuits; and a professoriate with erudition in an astonishing range of topics, including many celebrity teachers and academic rock stars. The benefits of matching this intellectual empyrean with the world’s smartest students are obvious. So why should an ability to play the bassoon or chuck a lacrosse ball be given any weight in the selection process?

The answer, ironically enough, makes the admissocrats and Deresiewicz strange bedfellows: the fear of selecting a class of zombies, sheep, and grinds. But as with much in the Ivies’ admission policies, little thought has been given to the consequences of acting on this assumption. Jerome Karabel has unearthed a damning paper trail showing that in the first half of the twentieth century, holistic admissions were explicitly engineered to cap the number of Jewish students. Ron Unz, in an exposé even more scathing than Deresiewicz’s, has assembled impressive circumstantial evidence that the same thing is happening today with Asians.

Just as troublingly, why are elite universities, of all institutions, perpetuating the destructive stereotype that smart people are one-dimensional dweebs? It would be an occasion for hilarity if anyone suggested that Harvard pick its graduate students, faculty, or president for their prowess in athletics or music, yet these people are certainly no shallower than our undergraduates. In any case, the stereotype is provably false. Camilla Benbow and David Lubinski have tracked a large sample of precocious teenagers identified solely by high performance on the SAT, and found that when they grew up, they not only excelled in academia, technology, medicine, and business, but won outsize recognition for their novels, plays, poems, paintings, sculptures, and productions in dance, music, and theater. A comparison to a Harvard freshman class would be like a match between the Harlem Globetrotters and the Washington Generals.

What about the rationalization that charitable extracurricular activities teach kids important lessons of moral engagement? There are reasons to be skeptical. A skilled professional I know had to turn down an important freelance assignment because of a recurring commitment to chauffeur her son to a résumé-building “social action” assignment required by his high school. This involved driving the boy for 45 minutes to a community center, cooling her heels while he sorted used clothing for charity, and driving him back—forgoing income which, judiciously donated, could have fed, clothed, and inoculated an African village. The dubious “lessons” of this forced labor as an overqualified ragpicker are that children are entitled to treat their mothers’ time as worth nothing, that you can make the world a better place by destroying economic value, and that the moral worth of an action should be measured by the conspicuousness of the sacrifice rather than the gain to the beneficiary.

Knowing how our students are selected, I should not have been surprised when I discovered how they treat their educational windfall once they get here. A few weeks into every semester, I face a lecture hall that is half-empty, despite the fact that I am repeatedly voted a Harvard Yearbook Favorite Professor, that the lectures are not video-recorded, and that they are the only source of certain material that will be on the exam. I don’t take it personally; it’s common knowledge that Harvard students stay away from lectures in droves, burning a fifty-dollar bill from their parents’ wallets every time they do. Obviously they’re not slackers; the reason is that they are crazy-busy. Since they’re not punching a clock at Safeway or picking up kids at day-care, what could they be doing that is more important than learning in class? The answer is that they are consumed by the same kinds of extracurricular activities that got them here in the first place.

Some of these activities, like writing for the campus newspaper, are clearly educational, but most would be classified in any other setting as recreation: sports, dance, improv comedy, and music, music, music (many students perform in more than one ensemble). The commitments can be draconian: a member of the crew might pull an oar four hours a day, seven days a week, and musical ensembles can be just as demanding. Many students have told me that the camaraderie, teamwork, and sense of accomplishment made these activities their most important experiences at Harvard. But it’s not clear why they could not have had the same experiences at Tailgate State, or, for that matter, the local YMCA, opening up places for less “well-rounded” students who could take better advantage of the libraries, labs, and lectures.

The anti-intellectualism of Ivy League undergraduate education is by no means indigenous to the student culture. It’s reinforced by the administration, which treats academics as just one option in the college activity list. Though students are flooded with hortatory messages from deans and counselors, “Don’t cut class” is not among them, and professors are commonly discouraged from getting in the way of the students’ fun. Deans have asked me not to schedule a midterm on a big party day, and to make it easy for students to sell their textbooks before the ink is dry on their final exams. A failing grade is like a death sentence: just the first step in a mandatory appeal process.

It’s not that students are unconditionally pampered. They may be disciplined by an administrative board with medieval standards of jurisprudence, pressured to sign a kindness pledge suitable for kindergarten, muzzled by speech codes that would not pass the giggle test if challenged on First Amendment grounds, and publicly shamed for private emails that express controversial opinions. The common denominator (belying any hope that an elite university education helps students develop a self) is that they are not treated as competent grown-ups, starting with the first law of adulthood: first attend to your priorities, then you get to play.

My third surprise was what happens to Harvard students at the other end of the pipeline: they get snatched up by the big consulting and investment firms, helping to explain that 20 percent boost in their expected earnings. Why, I wondered, do these cutthroat institutions hire rowers and baritones who know diddly-squat about business just because they have a transcript with the word “Veritas” on it? Wouldn’t they get more value by hiring the best finance major from Ohio State? I asked some people familiar with this world to explain what seemed to me like a massive market failure. They responded candidly.

First, an Ivy degree is treated as a certification of intelligence and self-discipline. Apparently adding a few Harvard students to a team raises its average intelligence and makes it more effective at solving problems. That, the employers feel, is more valuable than specific knowledge, which smart people can pick up quickly in any case.

Second, a little education can go a long way. As one business-school professor put it, “I have observed many smart people who have little idea of how to logically think through a problem, who infer causation from a correlation, and who use anecdotes as evidence far beyond the predictability warranted. Most of the undergrads who go to the consulting firms did take a course in social science, and much of this basic logic can be obtained there.”

More disconcertingly, I was told that Ivy League graduates are a prestige good: having a lot of them in your firm is like wearing a Rolex or driving a Bentley. Also, if something goes wrong, your keister is covered. As they used to say about computers, “No one ever got fired for buying IBM.”

Is this any way to run a meritocracy? Ivy admissions policies force teenagers and their mothers into a potlatch of conspicuous leisure and virtue. The winners go to an exorbitant summer camp, most of them indifferent to the outstanding facilities of scholarship and research that are bundled with it. They can afford this insouciance because the piece of paper they leave with serves as a quarter-million-dollar IQ and Marshmallow test. The self-fulfilling aura of prestige ensures that companies will overlook better qualified graduates of store-brand schools. And the size of the jackpot means that it’s rational for families to play this irrational game.

What would it take to fix this wasteful and unjust system? Let’s daydream for a moment. If only we had some way to divine the suitability of a student for an elite education, without ethnic bias, undeserved advantages to the wealthy, or pointless gaming of the system. If only we had some way to match jobs with candidates that was not distorted by the halo of prestige. A sample of behavior that could be gathered quickly and cheaply, assessed objectively, and double-checked for its ability to predict the qualities we value….

We do have this magic measuring stick, of course: it’s called standardized testing. I suspect that a major reason we slid into this madness and can’t seem to figure out how to get out of it is that the American intelligentsia has lost the ability to think straight about objective tests. After all, if the Ivies admitted the highest-scoring kids at one end, and companies hired the highest-scoring graduates across all universities at the other (with tests that tap knowledge and skill as well as aptitude), many of the perversities of the current system would vanish overnight. Other industrialized countries, lacking our squeamishness about testing, pick their elite students this way, as do our firms in high technology. And as Adrian Wooldridge pointed out in these pages two decades ago, test-based selection used to be the enlightened policy among liberals and progressives, since it can level a hereditary caste system by favoring the Jenny Cavilleris (poor and smart) over the Oliver Barretts (rich and stupid).

If, for various reasons, a university didn’t want a freshman class composed solely of scary-smart kids, there are simple ways to shake up the mixture. Unz suggests that Ivies fill a certain fraction of the incoming class with the highest-scoring applicants, and select the remainder from among the qualified applicant pool by lottery. One can imagine various numerical tweaks, including ones that pull up the number of minorities or legacies to the extent that those goals can be publicly justified. Grades or class rank could also be folded into the calculation. Details aside, it’s hard to see how a simple, transparent, and objective formula would be worse than the eye-of-newt-wing-of-bat mysticism that jerks teenagers and their moms around and conceals unknown mischief.
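The fraction-plus-lottery scheme is simple enough to state as code. Here is a minimal sketch; the function name, the 50 percent merit fraction, and the score cutoff are illustrative assumptions, not Unz's actual proposal figures:

```python
import random

def admit(applicants, class_size, merit_fraction=0.5, cutoff=1400):
    """Toy admissions rule: top scorers outright, lottery for the rest.

    applicants: list of (name, score) pairs. All parameters are
    hypothetical knobs, not anyone's published numbers.
    """
    ranked = sorted(applicants, key=lambda a: a[1], reverse=True)
    n_merit = int(class_size * merit_fraction)
    merit = ranked[:n_merit]                    # highest scorers admitted outright
    pool = [a for a in ranked[n_merit:] if a[1] >= cutoff]  # qualified remainder
    n_lottery = min(class_size - n_merit, len(pool))
    lottery = random.sample(pool, n_lottery)    # uniform lottery over the pool
    return merit + lottery
```

The appeal of such a formula is exactly what the surrounding text says: it is transparent (anyone can audit the cutoff and the fraction), and the tweaks mentioned (minorities, legacies, grades) would enter as explicit, publicly justifiable terms rather than hidden holistic judgments.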

So why aren’t creative alternatives like this even on the table? A major reason is that popular writers like Stephen Jay Gould and Malcolm Gladwell, pushing a leftist or heart-above-head egalitarianism, have poisoned their readers against aptitude testing. They have insisted that the tests don’t predict anything, or that they do but only up to a limited point on the scale, or that they do but only because affluent parents can goose their children’s scores by buying them test-prep courses.

But all of these hypotheses have been empirically refuted. We have already seen that test scores, as far up the upper tail as you can go, predict a vast range of intellectual, practical, and artistic accomplishments. They’re not perfect, but intuitive judgments based on interviews and other subjective impressions have been shown to be far worse. Test preparation courses, notwithstanding their hard-sell ads, increase scores by a trifling seventh of a standard deviation (with most of the gains in the math component). As for Deresiewicz’s pronouncement that “SAT is supposed to measure aptitude, but what it actually measures is parental income, which it tracks quite closely,” this is bad social science. SAT correlates with parental income (more relevantly, socioeconomic status or SES), but that doesn’t mean it measures it; the correlation could simply mean that smarter parents have smarter kids who get higher SAT scores, and that smarter parents have more intellectually demanding and thus higher-paying jobs. Fortunately, SAT doesn’t track SES all that closely (only about 0.25 on a scale from -1 to 1), and this opens the statistical door to see what it really does measure. The answer is: aptitude. Paul Sackett and his collaborators have shown that SAT scores predict future university grades, holding all else constant, whereas parental SES does not. Matt McGue has shown, moreover, that adolescents’ test scores track the SES only of their biological parents, not (for adopted kids) of their adoptive parents, suggesting that the tracking reflects shared genes, not economic privilege.
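The confound in that last argument, that a test can correlate with parental SES while actually measuring aptitude, is easy to demonstrate with a toy simulation. The model below is a deliberately crude sketch with made-up coefficients, not the Sackett or McGue data: a single latent aptitude variable drives SES, SAT scores, and grades, and nothing else connects them.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent aptitude is the only causal driver in this toy model.
aptitude = rng.normal(size=n)
# Parents' SES partly reflects aptitude (hypothetical weight 0.3), partly luck.
ses = 0.3 * aptitude + rng.normal(size=n)
# SAT is a noisy measure of aptitude; it never "sees" SES directly.
sat = aptitude + 0.8 * rng.normal(size=n)
# Grades also reflect aptitude, with their own noise.
grades = aptitude + rng.normal(size=n)

# SAT still correlates modestly with SES -- near the ~0.25 cited above --
# even though SES has no causal effect on SAT in this model.
r_sat_ses = np.corrcoef(sat, ses)[0, 1]

# Regress grades on SAT and SES jointly: SAT carries nearly all the
# predictive weight; SES adds little once SAT is held constant.
X = np.column_stack([np.ones(n), sat, ses])
beta, *_ = np.linalg.lstsq(X, grades, rcond=None)
```

In this setup `r_sat_ses` comes out around 0.22 and the SAT coefficient is roughly five times the SES coefficient, which is the qualitative pattern Sackett reports: a modest SAT-SES correlation is fully compatible with the test measuring aptitude rather than parental income.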

Regardless of the role that you think aptitude testing should play in the admissions process, any discussion of meritocracy that pretends that aptitude does not exist or cannot be measured is not playing with a full deck. Deresiewicz writes as if any correlation between affluence and Ivy admissions is proof that we don’t have a true meritocracy. But that only follows if the more affluent students are without merit, and without a measure of aptitude that is independent of affluence, how could you ever tell? For the same reason, his conspiracy theory of the historical trend in which Ivy students have been coming from wealthier families—namely that the Ivies deliberately impose expensive requirements to weed out poorer families—is glib. Hoxby has shown that the historical trend was propelled by students’ no longer applying to the closest regional colleges but to the ones with the most similar student bodies anywhere in the country. The law of supply and demand pushed the top schools to raise their academic admissions standards; the correlation with parental income may just be a by-product.

After first denying that we have ever tried meritocracy, Deresiewicz concludes by saying that we have tried it, and now should try “democracy” instead, by which he seems to mean a world in which the distribution of incomes of Ivy League families would be identical to that of the country as a whole. But as long as the correlation between wealth and aptitude is not zero, that goal is neither possible nor desirable.

Still, he’s right that the current system is harmful and unfair. What he could have said is that elite universities are nothing close to being meritocracies. We know that because they don’t admit most of their students on the basis of academic aptitude. And perhaps that’s what we should try next.

I have been in school for more than 40 years. First preschool, kindergarten, elementary school, junior high, and high school. Then a bachelor’s degree at UC Berkeley, followed by a doctoral program at Princeton. The next step was what you could call my first “real” job—as an economics professor at George Mason University.

Thanks to tenure, I have a dream job for life. Personally, I have no reason to lash out at our system of higher education. Yet a lifetime of experience, plus a quarter century of reading and reflection, has convinced me that it is a big waste of time and money. When politicians vow to send more Americans to college, I can’t help gasping, “Why? You want us to waste even more?”

How, you may ask, can anyone call higher education wasteful in an age when its financial payoff is greater than ever? The earnings premium for college graduates has rocketed to 73 percent—that is, those with a bachelor’s degree earn, on average, 73 percent more than those who have only a high-school diploma, up from about 50 percent in the late 1970s. The key issue, however, isn’t whether college pays, but why. The simple, popular answer is that schools teach students useful job skills. But this dodges puzzling questions.

First and foremost: From kindergarten on, students spend thousands of hours studying subjects irrelevant to the modern labor market. Why do English classes focus on literature and poetry instead of business and technical writing? Why do advanced-math classes bother with proofs almost no student can follow? When will the typical student use history? Trigonometry? Art? Music? Physics? Latin? The class clown who snarks “What does this have to do with real life?” is onto something.

The disconnect between college curricula and the job market has a banal explanation: Educators teach what they know—and most have as little firsthand knowledge of the modern workplace as I do. Yet this merely complicates the puzzle. If schools aim to boost students’ future income by teaching job skills, why do they entrust students’ education to people so detached from the real world? Because, despite the chasm between what students learn and what workers do, academic success is a strong signal of worker productivity.

Suppose your law firm wants a summer associate. A law student with a doctorate in philosophy from Stanford applies. What do you infer? The applicant is probably brilliant, diligent, and willing to tolerate serious boredom. If you’re looking for that kind of worker—and what employer isn’t?—you’ll make an offer, knowing full well that nothing the philosopher learned at Stanford will be relevant to this job.

The labor market doesn’t pay you for the useless subjects you master; it pays you for the preexisting traits you signal by mastering them. This is not a fringe idea. Michael Spence, Kenneth Arrow, and Joseph Stiglitz—all Nobel laureates in economics—made seminal contributions to the theory of educational signaling. Every college student who does the least work required to get good grades silently endorses the theory. But signaling plays almost no role in public discourse or policy making. As a society, we continue to push ever larger numbers of students into ever higher levels of education. The main effect is not better jobs or greater skill levels, but a credentialist arms race.

Lest I be misinterpreted, I emphatically affirm that education confers some marketable skills, namely literacy and numeracy. Nonetheless, I believe that signaling accounts for at least half of college’s financial reward, and probably more.

Most of the salary payoff for college comes from crossing the graduation finish line. Suppose you drop out after a year. You’ll receive a salary bump compared with someone who’s attended no college, but it won’t be anywhere near 25 percent of the salary premium you’d get for a four-year degree. Similarly, the premium for sophomore year is nowhere near 50 percent of the return on a bachelor’s degree, and the premium for junior year is nowhere near 75 percent of that return. Indeed, in the average study, senior year of college brings more than twice the pay increase of freshman, sophomore, and junior years combined. Unless colleges delay job training until the very end, signaling is practically the only explanation. This in turn implies a mountain of wasted resources—time and money that would be better spent preparing students for the jobs they’re likely to do.
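The lopsided arithmetic of the paragraph above can be sketched with hypothetical numbers. The year-by-year bumps below are invented for illustration, chosen only so that they sum to the 73-point degree premium cited earlier while reproducing the qualitative pattern: senior year worth more than twice the first three years combined.

```python
# Hypothetical year-by-year salary bumps, in percentage points over a
# high-school diploma. These are illustrative numbers, not figures
# from any study; they are picked to sum to the 73-point premium and
# to match the sheepskin pattern described in the text.
bump = {1: 8, 2: 8, 3: 8, 4: 49}

total = sum(bump.values())                 # full four-year premium
first_three = bump[1] + bump[2] + bump[3]  # freshman through junior

print(f"Degree premium: {total} points")             # 73
print(f"Years 1-3 combined: {first_three} points")   # 24
print(f"Senior year alone: {bump[4]} points")        # 49
print(f"Senior year / years 1-3: {bump[4] / first_three:.2f}")
```

Note that under this pattern a dropout who finishes one year captures 8 of 73 points, nowhere near a quarter of the premium, which is the dropout penalty the text describes.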

The conventional view—that education pays because students learn—assumes that the typical student acquires, and retains, a lot of knowledge. She doesn’t. Teachers often lament summer learning loss: Students know less at the end of summer than they did at the beginning. But summer learning loss is only a special case of the problem of fade-out: Human beings have trouble retaining knowledge they rarely use. Of course, some college graduates use what they’ve learned and thus hold on to it—engineers and other quantitative types, for example, retain a lot of math. But when we measure what the average college graduate recalls years later, the results are discouraging, to say the least.

In 2003, the United States Department of Education gave about 18,000 Americans the National Assessment of Adult Literacy. The ignorance it revealed is mind-numbing. Fewer than a third of college graduates received a composite score of “proficient”—and about a fifth were at the “basic” or “below basic” level. You could blame the difficulty of the questions—until you read them. Plenty of college graduates couldn’t make sense of a table explaining how an employee’s annual health-insurance costs varied with income and family size, or summarize the work-experience requirements in a job ad, or even use a newspaper schedule to find when a television program ended. Tests of college graduates’ knowledge of history, civics, and science have had similarly dismal results.

Of course, college students aren’t supposed to just download facts; they’re supposed to learn how to think in real life. How do they fare on this count? The most focused study of education’s effect on applied reasoning, conducted by Harvard’s David Perkins in the mid-1980s, assessed students’ oral responses to questions designed to measure informal reasoning, such as “Would a proposed law in Massachusetts requiring a five-cent deposit on bottles and cans significantly reduce litter?” The benefit of college seemed to be zero: Fourth-year students did no better than first-year students.

Other evidence is equally discouraging. One researcher tested Arizona State University students’ ability to “apply statistical and methodological concepts to reasoning about everyday-life events.” In the researcher’s words:

Of the several hundred students tested, many of whom had taken more than six years of laboratory science … and advanced mathematics through calculus, almost none demonstrated even a semblance of acceptable methodological reasoning.

Those who believe that college is about learning how to learn should expect students who study science to absorb the scientific method, then habitually use it to analyze the world. This scarcely occurs.

College students do hone some kinds of reasoning that are specific to their major. One ambitious study at the University of Michigan tested natural-science, humanities, and psychology and other social-science majors on verbal reasoning, statistical reasoning, and conditional reasoning during the first semester of their first year. When the same students were retested the second semester of their fourth year, each group had sharply improved in precisely one area. Psychology and other social-science majors had become much better at statistical reasoning. Natural-science and humanities majors had become much better at conditional reasoning—analyzing “if … then” and “if and only if” problems. In the remaining areas, however, gains after three and a half years of college were modest or nonexistent. The takeaway: Psychology students use statistics, so they improve in statistics; chemistry students rarely encounter statistics, so they don’t improve in statistics. If all goes well, students learn what they study and practice.

Actually, that’s optimistic. Educational psychologists have discovered that much of our knowledge is “inert.” Students who excel on exams frequently fail to apply their knowledge to the real world. Take physics. As the Harvard psychologist Howard Gardner writes,

Students who receive honor grades in college-level physics courses are frequently unable to solve basic problems and questions encountered in a form slightly different from that on which they have been formally instructed and tested.

The same goes for students of biology, mathematics, statistics, and, I’m embarrassed to say, economics. I try to teach my students to connect lectures to the real world and daily life. My exams are designed to measure comprehension, not memorization. Yet in a good class, four test-takers out of 40 demonstrate true economic understanding.

Economists’ educational bean counting can come off as annoyingly narrow. Non-economists—also known as normal human beings—lean holistic: We can’t measure education’s social benefits solely with test scores or salary premiums. Instead we must ask ourselves what kind of society we want to live in—an educated one or an ignorant one?

Normal human beings make a solid point: We can and should investigate education’s broad social implications. When humanists consider my calculations of education’s returns, they assume I’m being a typical cynical economist, oblivious to the ideals so many educators hold dear. I am an economist and I am a cynic, but I’m not a typical cynical economist. I’m a cynical idealist. I embrace the ideal of transformative education. I believe wholeheartedly in the life of the mind. What I’m cynical about is people.

I’m cynical about students. The vast majority are philistines. I’m cynical about teachers. The vast majority are uninspiring. I’m cynical about “deciders”—the school officials who control what students study. The vast majority think they’ve done their job as long as students comply.

Those who search their memory will find noble exceptions to these sad rules. I have known plenty of eager students and passionate educators, and a few wise deciders. Still, my 40 years in the education industry leave no doubt that they are hopelessly outnumbered. Meritorious education survives but does not thrive.

Indeed, today’s college students are less willing than those of previous generations to do the bare minimum of showing up for class and temporarily learning whatever’s on the test. Fifty years ago, college was a full-time job. The typical student spent 40 hours a week in class or studying. Effort has since collapsed across the board. “Full time” college students now average 27 hours of academic work a week—including just 14 hours spent studying.

What are students doing with their extra free time? Having fun. As Richard Arum and Josipa Roksa frostily remark in their 2011 book, Academically Adrift,

If we presume that students are sleeping eight hours a night, which is a generous assumption given their tardiness and at times disheveled appearance in early morning classes, that leaves 85 hours a week for other activities.

Arum and Roksa cite a study finding that students at one typical college spent 13 hours a week studying, 12 hours “socializing with friends,” 11 hours “using computers for fun,” eight hours working for pay, six hours watching TV, six hours exercising, five hours on “hobbies,” and three hours on “other forms of entertainment.” Grade inflation completes the idyllic package by shielding students from negative feedback. The average GPA is now 3.2.

What does this mean for the individual student? Would I advise an academically well-prepared 18-year-old to skip college because she won’t learn much of value? Absolutely not. Studying irrelevancies for the next four years will impress future employers and raise her income potential. If she tried to leap straight into her first white-collar job, insisting, “I have the right stuff to graduate, I just choose not to,” employers wouldn’t believe her. To unilaterally curtail your education is to relegate yourself to a lower-quality pool of workers. For the individual, college pays.

This does not mean, however, that higher education paves the way to general prosperity or social justice. When we look at countries around the world, a year of education appears to raise an individual’s income by 8 to 11 percent. By contrast, increasing education across a country’s population by an average of one year per person raises the national income by only 1 to 3 percent. In other words, education enriches individuals much more than it enriches nations.

How is this possible? Credential inflation: As the average level of education rises, you need more education to convince employers you’re worthy of any specific job. One research team found that from the early 1970s through the mid‑1990s, the average education level within 500 occupational categories rose by 1.2 years. But most of the jobs didn’t change much over that span—there’s no reason, except credential inflation, why people should have needed more education to do them in 1995 than in 1975. What’s more, all American workers’ education rose by 1.5 years in that same span—which is to say that a great majority of the extra education workers received was deployed not to get better jobs, but to get jobs that had recently been held by people with less education.
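The figures in the paragraph above support a quick back-of-the-envelope decomposition. If we treat the within-occupation rise in schooling as pure credential inflation (the same jobs now demanding more education) and only the remainder as workers moving into genuinely more-educated jobs, the split comes out heavily in inflation's favor. This is a rough accounting sketch of the cited numbers, not the research team's own method.

```python
# Rough decomposition of the cited figures: 1.2 extra years of
# education within the same 500 occupational categories, against a
# 1.5-year rise for American workers overall. Treating the
# within-occupation rise as credential inflation is a simplifying
# assumption for illustration.
within_occupation_rise = 1.2  # avg years, same jobs, early '70s to mid '90s
overall_rise = 1.5            # avg years, all American workers

credential_share = within_occupation_rise / overall_rise
job_shift_share = 1 - credential_share

print(f"Absorbed by credential inflation: {credential_share:.0%}")   # 80%
print(f"Reflecting shifts to more-educated jobs: {job_shift_share:.0%}")  # 20%
```

On this crude accounting, roughly four-fifths of the extra schooling went to re-credentialing existing jobs, which is the "great majority" the text describes.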

As credentials proliferate, so do failed efforts to acquire them. Students can and do pay tuition, kill a year, and flunk their finals. Any respectable verdict on the value of education must account for these academic bankruptcies. Failure rates are high, particularly for students with low high-school grades and test scores; all told, about 60 percent of full-time college students fail to finish in four years. Simply put, the push for broader college education has steered too many students who aren’t cut out for academic success onto the college track.

The college-for-all mentality has fostered neglect of a realistic substitute: vocational education. It takes many guises—classroom training, apprenticeships and other types of on-the-job training, and straight-up work experience—but they have much in common. All vocational education teaches specific job skills, and all vocational education revolves around learning by doing, not learning by listening. Research, though a bit sparse, suggests that vocational education raises pay, reduces unemployment, and increases the rate of high-school completion.

Defenders of traditional education often appeal to the obscurity of the future. What’s the point of prepping students for the economy of 2018, when they’ll be employed in the economy of 2025 or 2050? But ignorance of the future is no reason to prepare students for occupations they almost surely won’t have—and if we know anything about the future of work, we know that the demand for authors, historians, political scientists, physicists, and mathematicians will stay low. It’s tempting to say that students on the college track can always turn to vocational education as a Plan B, but this ignores the disturbing possibility that after they crash, they’ll be too embittered to go back and learn a trade. The vast American underclass shows that this disturbing possibility is already our reality.

Education is so integral to modern life that we take it for granted. Young people have to leap through interminable academic hoops to secure their place in the adult world. My thesis, in a single sentence: Civilized societies revolve around education now, but there is a better—indeed, more civilized—way. If everyone had a college degree, the result would be not great jobs for all, but runaway credential inflation. Trying to spread success with education spreads education but not success.

This essay is adapted from Bryan Caplan’s book The Case Against Education. It appears in the January/February 2018 print edition with the headline “What’s College Good For?”
