Mary Ann Irwin, Secretary/Treasurer and Acting President of CA-AAUP and long-time contingent faculty member at Diablo Valley College in Pleasant Hill, California; California State University, East Bay; and elsewhere, has been appointed by AAUP President Rudy Fichtenbaum to the AAUP Committee on Contingency and the Profession. This standing committee of the AAUP works on a variety of topics of importance to higher education and faculty, with a special focus on issues related to contingent faculty appointments. Members of this committee work to improve conditions for contingent faculty members and to reverse the trend toward part-time and non-tenure-track appointments.

AAUP's One Faculty project is collecting examples of contract and handbook language that addresses job security, salary and benefits, academic freedom, and participation in governance as each topic affects contingent faculty. If you can help with this project, either for your own campus or by conducting some research on policies at other campuses, please contact Gwen Bradley, AAUP Director of the Department of External Relations, at gbradley@aaup.org.

The Coup That Failed: How the Near-Sacking of a University President Exposed the Fault Lines of American Higher Education

by Talbot Brewer

Reprinted from The Hedgehog Review, Summer 2014 (Volume 16, Issue 2). Talbot Brewer, a faculty fellow at the Institute for Advanced Studies in Culture, is professor of philosophy and chair of the department of philosophy at the University of Virginia. His most recent book is The Retrieval of Ethics.

During the same summer that saw a derecho visit vast destruction upon a wide swath of the nation, a different kind of tempest roiled the campus of one of America’s oldest, most distinguished universities. Of human making, this storm left no physical wreckage, and even the institutional damage it might have wrought was seemingly blunted. For a time, at least, it felt as though something vital had survived unscathed. Certainly, the mood was jubilant, even triumphant, on the late June day in 2012 when some two thousand faculty, students, and staff gathered on the main lawn of the University of Virginia, their eyes fixed expectantly on the door just behind the pillars of Thomas Jefferson’s Rotunda. The door opened, and once and future university president Teresa Sullivan emerged. The assembled crowd, of which I was part, burst into cheers, some of us chanting her name. Sullivan bathed for a moment in the jubilation. Then she took to the podium to give her restoration speech.

Two weeks earlier, the same president, a sociologist by training, had been stripped of her office in a coup orchestrated by Helen E. Dragas, rector of the university’s Board of Visitors. When news of Sullivan’s removal began to spread, and when it became clear that it had been done without consulting anyone actually working at the university (with the possible exception of the chief financial officer), the faculty went into full rebellion. The Faculty Senate—a body that ordinarily trundles along without a discernible sense of its own mission—promptly passed a resolution of no confidence in the Board of Visitors. The Senate was joined by a number of other faculty groups in calling for the reinstatement of President Sullivan and the resignation of Dragas and her collaborator, Vice Rector Mark Kington. The university grounds, ordinarily sleepy in the summer, were soon in an uproar. Reporters began digging into the motives for the coup.

Dragas and Kington explained their actions as being the result of “philosophical” differences with Sullivan, yet they declined to say which of the great questions of existence had divided them so irrevocably as to require her dismissal. Enterprising reporters from the student newspaper made use of the Freedom of Information Act to shed some light on this mysterious philosophical disagreement. It turned out that Dragas and Kington had come to believe that the rise of online learning would soon pose an existential threat to the university, and that it had to embrace the trend quickly or risk being left hopelessly behind. Sullivan had been reluctant to move in this direction with the boldness they thought necessary. She was threatened with imminent dismissal, and agreed under duress to step down.

At first blush, this does not sound like a philosophical disagreement.1 It sounds like an ordinary empirical disagreement about whether, and under what conditions, the university would be able to attract enough qualified undergraduates to sustain itself. One party to the conflict, President Sullivan, was less impressed than her adversaries by Harvard Business School professor Clayton Christensen’s widely discussed prediction that online instruction would prove to be a “disruptive innovation,” one that would pose a threat to the very existence of traditional suppliers—not by providing a better product but by providing an inferior substitute that was either vastly cheaper or more convenient. Christensen had insisted that traditional providers of higher education could survive this disruption only by “changing their DNA”—that is, fundamentally changing their mode of instruction, partly by using online instruction to lower costs and reach more students. Sullivan seems to have thought that this was alarmist and that no program of online instruction would soon convince parents to forgo the rite of passage into adulthood that we call “going to college.”

On the surface, then, the conflict between President Sullivan and the leaders of the Board of Visitors seems to have been a difference in market outlook, not a difference that could be termed philosophical, even in the loose and popular deployment of that term. Yet I believe there were important philosophical disagreements in the background. Discussions among members of the Board of Visitors touched not only on the importance of taking bold steps to deliver instruction via the Internet but also on the importance of taking bold steps to trim away departments with relatively few majors. German had been mentioned; so had classics. The guiding idea of the would-be reformers was that the university should be continuously reshaped to meet changes in student demand. Consumer sovereignty should be extended from the problem of determining which products should be displayed on the shelves of which stores, to the problem of determining the contents of the proper education of a young adult.

Extending Consumer Sovereignty

This extension of consumer sovereignty is objectionable, in the first instance, because college-age youths cannot be presumed to know in advance what it would behoove them to know; more accurately, they are in need of education concerning what sort of education they need. But there is another objection to this invocation of consumer sovereignty, one rooted in the very identity of liberal arts colleges and universities. Perhaps I can explain this second objection by recounting a recent phone conversation with a friend who for years taught Spanish literature at Mary Baldwin College, a small liberal arts school for women in the town of Staunton, Virginia, about a half hour’s drive from the University of Virginia. Maybe it’s a case of local contagion, or maybe there is a nationwide trend here that is flying under the radar of the national press, but there has been a standoff between Mary Baldwin’s board of trustees and its faculty that is strikingly reminiscent of the standoff at Virginia. The board is imposing major changes on the college, against the resistance of the faculty, and the faculty has contemplated a vote of no confidence in the president and board. The French and Spanish majors are being phased out, and nine other departments (“the expendables,” as the faculty calls them), covering core liberal arts disciplines such as art history, music, math, physics, chemistry, religion, and philosophy, are being considered for closing or consolidation. Meanwhile, resources are being shifted to a new set of undergraduate majors in career-specific fields such as criminal justice, marketing, human resource management, social work, public health, special education, and (my favorite) coaching and exercise management.

The trustees seem to believe that these changes are needed to ensure Mary Baldwin’s survival. This makes me wonder how exactly they conceive of the college that has been entrusted to them, if they believe that an alteration of this kind could count as its survival. It obviously would not have counted as the survival of, say, Plato’s Academy if, at some point in the waning of Athens’s golden cultural era it had taken up the training of military leaders, or merchants, or rhetoricians. Nor would it count as the survival of a Franciscan monastery if it responded to a decline in religious enthusiasm by filling its vacant rooms with, say, vacationing nudists. What is a liberal arts college, if it can rededicate its facilities to career training for marketers, police officers, and coaches without losing its soul? How could such a change be counted as securing the college’s survival rather than managing its demise? One gets the impression that Mary Baldwin’s trustees understand themselves to have been charged with the mission of efficiently using a certain number of physical resources—dormitories, classrooms, and the like—for a course of post-secondary pedagogy sufficient to justify the conferral of a bachelor’s degree… in something or other. Commitment to the liberal arts evidently is not deemed essential to the institutional mission.

I think I know the provenance of this picture of proper institutional governance. It has its home in the business corporation, whose one essential purpose is to find a profitable market niche, and whose products and services are to be changed when necessary to fulfill this fixed purpose with optimal efficiency. It would be fiduciary irresponsibility for the board of a publicly held company to stick doggedly with the production and marketing of a particular product even in the face of clear indications that the market for it was declining and that some other product could be made and sold at a greater profit. Consumer sovereignty converges here with responsible board governance: One cannot responsibly insist on a vision of what the world needs that is out of fashion with most consumers.

Plotting the Ouster

If one imported this picture of responsible governance into the trusteeship of colleges and universities, one might think that this would at least increase the chances of institutional survival. What I’m suggesting is that this would be a mistake. Liberal arts colleges and research universities are vulnerable to finding themselves in situations in which they must choose between filling their corridors with instructional activities that pay the bills yet secure the mere semblance of survival, or struggling to sustain the liberal arts mission even in the face of serious uncertainty about its financial viability. It is the sort of institution that can remain financially solvent in a faithless or faithful way, and whose path to financial insolvency can be noble or ignoble. The business corporation faces no such choice: If it remains profitable, it remains true to and successful at its identity-conferring mission.

It matters to the little drama we lived through at Virginia that the fate of our university had been entrusted to a board composed almost entirely of business executives and without a single career educator. Dragas herself is the CEO of a large real-estate development firm founded by her father. The two board members who worked with her most closely in plotting Sullivan’s ouster were real-estate developer Hunter Craig and Vice Rector Kington, cofounder of Columbia Capital and managing director of X-10 Management Corporation. These three were apparently deemed qualified to run the university because of their success in the world of business. They plotted Sullivan’s dismissal in cooperation with a small group of influential Virginia alumni who have made fortunes on Wall Street, including Jeffrey C. Walker, the former managing partner of J. P. Morgan Partners and chairman of CCMP Capital Advisors; Jeffrey Nuechterlein, founder and managing partner of Isis Capital; and former Goldman Sachs partner Peter Kiernan.

Winning the Battle…

One might expect this group to have some qualms about its capacity to plot the future of an institution whose history, structure, and purpose differ so greatly from those of a Wall Street investment house. Yet these individuals’ e-mail communications suggest an unshakable confidence in their own competence and a smug condescension toward the judgment of career academics like Sullivan. Nuechterlein was moved to support the firing partly because he had asked Sullivan about her plans to counter the threat posed by online courses and “was not impressed” with her “rather pedestrian answer.” Kiernan faulted Sullivan’s approach to funding challenges and online education for its lack of “strategic dynamism”—a catchphrase in the business world for leadership that emphasizes bold risk-taking and nimble responsiveness to changing circumstances. Author of a book called Becoming China’s Bitch: And Nine More Catastrophes We Must Avoid Right Now, Kiernan is not shy about presenting himself as a seer and rearranger of the future, one who has mastered the skill Machiavelli thought crucial to a leader—that of mastering Lady Fortuna with bold action. He proved somewhat less skilled at foreseeing the perils of abruptly removing the University of Virginia’s first female president from office without consulting a single member of the faculty or, by all appearances, anyone with significant experience on the academic side of any institution of higher education. His active participation in Sullivan’s ouster led to his resignation under duress from the board of trustees of the university’s business school foundation just four days after he had written an email telling his fellow board members, “Trust me, Helen [Dragas] has things well in hand.”

…But Not the War

She didn’t. To our great surprise we won the battle, but certainly not the war. It continues to unfold, now in the form of a struggle over a strategic plan that will shape the university’s transformation during the next decade or two. Perhaps the most crucial element of this plan is a set of new pan-university research centers. Thus far the university has announced the identity of only one of these centers—a data science institute—and it is not yet clear whether this entity’s primary focus will be academic research or training for jobs in marketing and national security. While the jury is still out on these changes, it is an ominous sign that, partly at Dragas’s insistence, the Board of Visitors will be making piecemeal decisions about budgetary allocations for each element of the strategic plan—an approach that will permit ten businesspersons, two lawyers, two medical doctors, one lobbyist, one Republican political operative, and one former university president to micromanage the university’s evolving academic curriculum and research priorities.

I’ve told the story of the battle in order to explain the basis for my current understanding of the nature and stakes of the conflict and to provide what I’m afraid may be a preview of similar events coming soon to campuses near you. If I may try my hand at Kiernan-style prognostication, I think that what we are witnessing is the sharpening of a long-standing conflict between academia and an ascendant Zeitgeist that affirms material productivity and economic competitiveness over all other human ends, and that celebrates the all-purpose managerial acumen of the corporate leader. In my lifetime, we’ve gone from dismissing corporate muckety-mucks as conformists in gray flannel suits to counting business success as the reliable sign of the ability to lead any enterprise at all, right on up to the Free World.
(Here I have in mind Mitt Romney’s continuous invocation of his experience in corporate restructuring as proof of his fitness for the presidency.) One by one, institutions that were not regarded as businesses have been remade in accordance with the techniques and priorities of the business manager. This refashioning of our institutions is now in full swing at academic institutions, and it poses a grave threat to their survival.

It is not just in moments of dramatic conflict, such as those we lived through at Virginia, that we feel the encroachment of this commercial Zeitgeist in the academy. We feel it daily, in the slow-motion transfer of power from the faculty to administrators who usually lack a higher degree in an academic field. These managers can exercise power only if the activity they manage can be conceived as something whose attainment they are in a position to ascertain and measure firsthand, without relying on the testimony of the trained professionals whose activities they are trying to regulate. Hence, their rise tends to be accompanied by a change in institutional aims and ambitions—a change promoted under the abstract name of accountability that in fact alters what we professors are accountable for, in a metric that can be appreciated from a vantage point external to our fields. This is the source of the increase in paperwork and the shrinking of prerogative the professoriate has experienced in recent decades. Here is the point of entry for the accursed paperwork requirements that at the University of Virginia go under the name of learning outcome assessments, and which have rightly been called “a disciplinary instrument masquerading as a pedagogic one.”2

We’ve seen very similar patterns of institutional change in other professions, including social work, elementary and secondary education, and medical care. All of these fields have seen a rise in managerial oversight of trained professionals, with dramatic expansions of reporting requirements and equally dramatic restrictions on the room for case-specific professional discretion. Yet unlike members of these other professions, college and university professors are not mere innocent bystanders to the rising power of the all-purpose manager. They (or at least their institutions) have greatly abetted it by profiting from new master’s-level programs in public and business management.

Colleges have always derived a large part of their market value from their capacity to provide students with a large boost in the contest for positions of prestige and high remuneration. One plausible explanation of the rise of master’s programs in management is that it is a strategic response to the overproduction of bachelor’s degree graduates relative to the supply of truly advantageous career starting points. The value of the bachelor’s has been bolstered, and a new revenue stream created, by the introduction of management degrees that require the degree yet are kept in much shorter supply. There is, then, a kind of poetic justice in the remaking of academia by the sort of hyperactive and disciplinary business-style management that it has helped to unleash on other professions. Not that this makes me like it one bit more.

Busy-ness School?

The words scholar and school come to us from the Greek scholé, which can be translated at least roughly as “leisure,” in the sense of time free from the need to labor. Aristotle regarded such freedom as the precondition of the most valuable sorts of thought and action, and in particular for the sort of thought that is an end in itself. On this point he was not striking out on his own; he was giving voice to a prevailing Greek opinion that manifests itself in the etymological path connecting scholé to school and scholar. According to this characteristically Greek view, leisure opens the possibility of genuinely free thought—thought born of wonder and free to unfold in accordance with its own internal demands, rather than thought born of consciousness of lack or need. To be sure, scholé opens the possibility of other autotelic activities—that is, activities that are their own end and are free to unfold in accordance with the demands of this internal telos rather than activities whose point lies in some conceptually separate state of affairs they are calculated to produce. Yet the term evolved in such a way as to suggest a specifically Greek taste for free thought itself, coming first to mean “discussion” and later to designate the philosophical schools that regarded themselves as custodians of the best and most elevated human discussions.

This broadly Greek understanding of the opposition between leisure and leisure-lessness was taken up into Latin thought by such figures as Cicero and Seneca with the term otium and its contrary, negotium. This latter Latin term itself traversed a long etymological path, one of whose endpoints is the contemporary Spanish word negocio, which means business. A somewhat less direct affinity between commercial life and the absence of leisure is embedded in the word’s English-language equivalent, business—though ordinarily it spills out of the mouth too quickly for its etymology to register in the ear. I wonder if business school might not lose some of its appeal if its name were enunciated more deliberately. Wouldn’t people think twice before signing up for a school of busy-ness?

Maybe they wouldn’t. But I hope at least that you can now hear the lingering oxymoron in that phrase. A school of business. A scholé of the negation of scholé. An institution devoted to discussion and thought unfolding under its own internal demands, yet offering training for the sort of life that has no place for such thought—the sort that places thought in service of need. Indeed, the contrast is rather starker these days, since business busies itself not merely with the navigation of need but with the creation and intensification of felt need, hence with continuous amplification of the realm of human life in which thought takes direction from something alien to it. The leaders of last summer’s coup at the University of Virginia are practitioners of this mode of thought. Indeed, three of the four key organizers of the would-be ouster are closely associated with the university’s own contradiction in terms, the Darden Business School: Dragas and Kington are Darden graduates, and Kiernan chaired Darden’s foundation board until his forced resignation.

The turbulence in Charlottesville, then, can be understood as a clash between scholarship in the ancient sense—which is to say thought unfolding in freedom, thought that does not take direction from anything alien to itself—and the contrary forms of thought that are appropriate when basic needs deprive human beings of the opportunity for more valuable uses of their defining mental capacities. This conflict echoes the tension we find in the ongoing contest over the soul of Mary Baldwin College. Historically, Mary Baldwin has identified itself as a liberal arts college, and a large portion of its faculty remains attached to that identity. The liberal arts have traditionally taken their identity from a contrast with the servile arts, which is to say, the arts needed for efficiently navigating the condition of captivity to need, the condition of ascholia. The liberal arts are those it makes sense to develop and exercise when one is lucky enough to have a sizable amount of time free from the compulsion of genuine material need. The purpose of the servile arts is to keep oneself alive and healthy. The purpose of the liberal arts is to engage in activities that are worthwhile in themselves, activities that can give point to remaining alive and healthy. When a college retreats from the liberal to the servile arts, it announces to its students that times are too lean to permit four years of indulgence in something capable of giving point to remaining alive and healthy, and that they must concentrate on the task of enhancing their material means. When it includes marketing among the servile arts it recommends to those who come to its doors seeking education, it departs even more decisively from its prior mission: It announces that times call for the creation and intensification of new needs, needs that will redirect the minds of one’s fellow human beings from the liberal arts even under conditions that might otherwise conduce to the development and exercise of those arts.

Like many of my fellow professors, I gave serious thought during the upheaval to the possibility of publishing a defense of what we do at our university. I was particularly eager to defend the humanities, since the limited information that had been published concerning the intentions of the Board of Visitors suggested that they had keener doubts about the justifiability of humanities departments than about the justifiability of programs in the currently favored STEM programs—science, technology, engineering, and math—or in such areas of social science as politics and economics, or in the newcomer to our campus, “leadership studies.” I produced rough versions of possible op-eds but did not try to publish anything.

What paralyzed me was that my attempted defenses of the humanities seemed to fall into two categories: those that might conceivably help our cause but were not heartfelt, and those that were downright impolitic. These latter efforts were indeed doubly impolitic: They traced the value of the humanities to a vision of the human good that ran against the political currents of our place and time, and they pointed toward a serious indictment of the form actually taken by the humanities in my own university and in colleges and universities around the country. They were, in short, politically ineffective defenses of an ideal that, if taken seriously, would provide fresh grounds for attacking us.

I thought that in the long term it might be salutary to consider this ideal, and the novel objections to which it gives rise, but I did not want to publish my thoughts about these matters while the battle was in the public eye. I was even less eager to write something politically savvy that did not faithfully express my passion for the humanities. It seemed wrong to seek to preserve social space for philosophy with the tools of philosophy’s traditional nemesis, which is to say, with sophistry. So I decided to wait until a less fraught and less public occasion to attempt to articulate the grounds of my devotion to the humanities in general, and to philosophical reflection in particular.

Strategies for Defending the Humanities

When I look at what others have written in defense of the humanities, the arguments seem to follow three strategies. The first is to bring out the pecuniary benefits of the humanities, whether in terms of individual career advantage or of national economic competitiveness. Distasteful as I find this approach, I don’t mean to say there is nothing to it. I recently traveled to Peking University to trade ideas about higher education with Chinese educators, and it seems that China’s leaders have concluded that they should invest in the humanities precisely because of the creativity that such an education incites, and the contribution this sort of creativity makes to market innovation. The American Association of Colleges and Universities has been trying to promote liberal arts education on precisely this ground: “In an economy fueled by innovation, the capabilities developed through a liberal education have become America's most valuable economic asset.”3 Similar observations have prompted Carol T. Christ, president of Smith College, to argue that “it would be a mistake to find irrelevant a system that has proven so disproportionally successful that its methods are being adopted by some of America’s strongest economic competitors.”4 Yet whatever the truth about the aggregate economic benefits of widespread training in the humanities, the argument falls flat when addressed to the individual students who are entering our colleges and universities, and whose choices of classes and majors play so important a role in determining what we end up teaching. The employment and earnings prospects of humanities majors are, in truth, quite dim when compared to those of students in other majors.
Recent history majors have an unemployment rate more than one-third higher than that of engineering or business majors, and humanities majors who do manage to get jobs earn an average starting salary of only $37,000, compared to about $62,000 for engineering majors. While I’ve heard it said that in the long run humanities majors end up earning at least as much as engineers, the data do not bear out this rumor. In fact, engineers continue to out-earn individuals with humanities degrees by 30 percent in the mid-career years. So pecuniary considerations do not seem to count in favor of majoring in the humanities, and there are indications that students may be turning away from the humanities for precisely this reason. Eighty-eight percent of first-year college and university students now cite “getting a better job” as the top reason for pursuing their studies, up from 71 percent prior to the economic downturn, and only about one in twelve now chooses to major in the humanities—less than half the portion that made this choice in 1967.5

These economic considerations have created a political backlash against public subsidies for humanities departments at state universities. Florida is currently considering higher tuitions for humanities majors than for students in “strategic” majors because the former group’s studies supposedly contribute less to the state’s economic health. In the words of Florida governor Rick Scott, “If I’m going to take money from a citizen to put into education, then I’m going to take that money to create jobs. Is it a vital interest of the state to have more anthropologists? I don’t think so.” North Carolina governor Pat McCrory seems to be thinking along similar lines: “If you want to take gender studies that’s fine. Go to a private school and take it. But I don’t want to subsidize that if that’s not going to get someone a job.”6

Those who attempt to defend the humanities in pecuniary terms are not only missing the point of the activity they seek to defend; they are lending credence to the threatening views of political leaders such as Scott and McCrory by conceding one of their most contestable premises—namely, that subsidizing the education of fellow citizens is justifiable only if it conduces to career success or economic growth. This is precisely the pattern one sees in President Barack Obama’s public pronouncements on education. He has tied the justifiability of education at all levels to economic competitiveness, and has praised the educational systems of Singapore and other East Asian nations for emphasizing science and technology, thereby “spending less time teaching things that don’t matter, and more time teaching things that do.”7

One reason I think Governors Scott and McCrory are mistaken to suggest that economic growth is the only public good that can justify investments in education is that I do not believe continued economic growth, in anything like its recent historical form, to be good at all. Here we come to what is arguably the crux of the Zeitgeist that surrounds us, and what makes it so difficult to put forward an impassioned defense of the humanities that is not impolitic. While more than a glancing look at this topic is beyond the purview of the present article, a good place to begin is a little essay published in 1930 by John Maynard Keynes under the title “Economic Possibilities for Our Grandchildren.” Keynes projected that increased worker productivity would soon make it possible to sustain a decent standard of living for all while reducing the average work week to fifteen hours. This would bring human beings face to face with what Keynes regarded as their real and permanent problem: what to do with genuine freedom—that is, with scholé. A proper answer would require the arts of scholé, that is, the liberal arts—identifying, refining, and pursuing activities that are genuinely valuable in themselves. Keynes was deeply pessimistic about the human capacity to meet this challenge. Looking around him, he observed that the wives of the wealthy were already faced with it and were failing badly. These women, he wrote, “cannot find it sufficiently amusing, when deprived of the spur of economic necessity, to cook and clean and mend, yet are quite unable to find anything more amusing.”8

Recent history suggests that Keynes’s pessimism was well founded. We seem inclined to evade our “permanent problem” even in conditions of remarkable prosperity. As Boston College sociologist Juliet Schor observes, the material consumption rates achieved in 1948 could have been sustained in the early 1990s even if every single worker had taken every other year off.9 Yet about the same time, in 1988, the average American family contributed sixteen more weeks of full-time work to the formal economy than in 1967.10 This is not just a reflection of the entry of more family members into the work force. The trend holds good at the individual level as well as the family level. The average US worker put in 148 more hours per year in 1996 than in 1973.11 Nor does this seem to be a function of economic need. The proportion of US workers putting in more than forty-nine hours per week grew from 13 percent in 1976 to 19 percent in 1998, while the proportion of managers working that many hours grew from 40 percent to 45 percent.12 This is the flip side of the vast increase in consumption experienced in recent decades in the United States and Western Europe. We are working more and we are spending more—not in the name of mere survival or basic creature comforts but as our first attempt to answer what Keynes called the permanent question of humankind. We have effectively refused the offer of increased free time in exchange for increases in paid work and consumption (not that the two options are available individual by individual; the refusal has been for the most part a collective one—a matter of political paths not taken). Indeed, in the United States, our consumption patterns have become so lavish that they could be sustainably enjoyed by all of our fellow human beings only if we could somehow get our hands on five more planets that were roughly as well endowed with natural resources as the one we've got. 
Given this, our de facto answer to Keynes's permanent question is not one that humankind as a whole could adopt on a sustainable basis.

This seems to me to ground both a practical and a moral demand to rethink our answer in a vastly less production-oriented, and consumption-oriented, direction—a demand, in other words, to explore and develop the art of liberty. I envision the humanities as central among the liberal arts, in the sense that they are themselves a fecund source of intrinsically valuable activities, and they deepen virtually all of the activities of those who permit their psyches to be reshaped by sustained engagement in them. They deepen friendships; they deepen neighborly social relations; they deepen loves and marriages and parent-child relations, walks in the woods, idle musings, creative and expressive activity, and contemplation of the creative and expressive products of others. The moment has come when we can afford to democratize this life-enhancing form of education. If we opt instead to remake ourselves as a kind of commercial Sparta, whose educational system is geared primarily to the enhancement of economic productivity, we will leave future generations with a pillaged natural environment and a badly degraded cultural environment.

Yet if we do wish to cultivate a deeper public appreciation of the humanities, we will face some impressive obstacles. We will have to counteract the effects of a pervasive form of socialization by commercial enterprises—one that represents the largest and best-funded program of proselytism ever mustered in the history of humankind. The telling novelty of this form of proselytism is that it is automatic: It can go forward without a single true believer in the wisdom of the consumerist vision of the good on which its many and varied communications overlap. This tuition-free, corporate-sponsored schooling begins long before the first day of kindergarten and does not go out of session until we die. The average six-year-old child in the United States sees 40,000 commercial messages per year and can name 200 commercial brands. Nor does this open-air school have to limp along on bake sales. Global expenditures on advertising totaled about $650 billion in 2003, making advertising the world’s second biggest industry (after weapons).13

A Bidding War for Attention

The advertising industry can be conceived of as a continuous bidding war for the precious commodity of human attention. Its messages surely bend prevailing evaluative sensibilities in an anti-contemplative direction. But they have other effects as well. First, they produce an environment of attentional overload, one that inures us to all but the loudest and most ingratiating focal objects. Second, they bathe us in messages whose content is expressly devised to be manipulative, and this cannot help but leave its mark on the prevailing understanding of the normal and proper use of language.

I was thinking about this recently while flying to San Francisco for a philosophy conference. I was hungry. On a flight like this you used to get a meal. Now United Airlines has instituted what it calls a “Choice Menu”: You have a choice of several different meals, or no meal at all. (One of these choices—the last one—is still free.) The item I end up choosing is called Tapas. It turns out to be a box of seven or eight carefully packaged samples of appetizers. What is most notable about the enclosed offerings is not what’s in the packages but the packages themselves. If you lifted all the copy from these packages and wrote it on a legal pad, you’d have several pages of text. Interspersed with lists of ingredients whose nutritional value you would need a food chemist to assess, you’d find a recurring tale. It begins with some particular person—someone with an ordinary name like Tom or Suzie or Stacy. This person had a great passion for making whatever is in the package. But she never had a thought in the world about going into business: She made her treats from love, for a small circle of family and neighbors. Then word got around about these love-drenched wonders, and friends and family members convinced her that she ought to share the love by creating a tiny little company. The purpose was not in the least pecuniary. You yourself, here in the plane, flying over the United States looking down at the farms below—you are practically being invited right into the living room to share the love of this good soul, whose name you now know. These pita chips in the Tapas box were made with love and care because “That’s the Stacy’s Way.” And so on. You begin to realize that what you’ve purchased in the Tapas box is really eight advertisements with nibbles in them, and you wonder if in fact United is paid by these loving souls, or the companies they’ve created, to include their little ad-wrapped samples in the Tapas box. 
For the real purpose of “Tapas” seems to be to sell larger versions of the packages found in the box—packages that are available in the grocery stores 30,000 feet below. Then you take your napkin out to wipe your mouth, and you see that it too is an advertisement, in this case for the United app for iPhones. It’s not an ordinary nap; it’s an app ad nap.

We have devised a world in which mercenary words and images press upon us. Wherever our eyes are likely to alight, someone is willing to pay to place images and text there. They elbow us out of the quiet repose, the scholé, that contemplation requires. We adapt, and teach our children to adapt, to a contested and interested domain of image and language where the interesting is continuously revealed to be a mere effect, produced by someone’s interest in interesting us. And when amid this clamor of manipulative messages there suddenly appears something quite different, something called literature, or art, or philosophy, it is not easy to adjust our cynical and self-protective posture so as to open ourselves to this newcomer. The attentional environment has not encouraged the traits required for proper engagement with philosophy and other humanities disciplines: the habit of sustained attention and of patience and generosity in interpretation; the openness to finding camaraderie and illumination from others in the more treacherous passages of human life; the expressive conscience that cannot rest until it lights upon exactly the right words for one’s own incipient thoughts, words that have nothing to do with manipulation. It is, then, grotesque to suggest that the humanities must be justified or condemned on the basis of their contribution to the vitality of the commercial sphere. The vitality of the commercial sphere, as currently constituted, poses an existential threat to the repose of mind, the scholé, that the humanities require if they are to flourish.

Cultivating Citizens

If the humanities are not to be justified in terms of economic productivity, perhaps they can be justified by their contribution to the life of the polity. This is the second of the three strategies one finds in the literature on the topic, and it has some extremely able proponents, including Martha Nussbaum. Nussbaum is drawn to this justificatory strategy partly because she thinks that a more direct insistence on the value of public funding for the humanities, or of providing an education in the humanities to all youths, would offend against a properly liberal commitment to pluralism concerning the human good. Nussbaum writes,

“[Modern democracies] are societies in which the meaning and ultimate goals of human life are topics of reasonable disagreement among citizens who hold many different religious and secular views, and these citizens will naturally differ about how far various types of humanistic education serve their own particular goals. What we can agree about is that young people all over the world, in any nation lucky enough to be democratic, need to grow up to be participants in a form of government in which people inform themselves about crucial issues they will address as voters and, sometimes, as elected or appointed officials.”14

She argues that the humanities cultivate the analytical capacities, historical and intercultural understanding, and mutual respect and concern needed for proper participation in the difficult activity of democratic self-rule.

Like the economic argument, this rationale appeals to a politically potent value. However, unlike the economic argument, it ties the humanities to a value they might reasonably be thought to promote, at least to some degree. Still, I think there are two serious difficulties with it. The first is that few democracies actually provide room for the sort of active citizenship on which Nussbaum’s argument depends. The actual practice of democracy in most Western democratic nations is well captured by Joseph Schumpeter’s uninspiring definition of democracy as that institutional arrangement for political decision-making in which the power to make decisions is acquired by means of a competitive struggle, among political elites, for the people’s vote.15 The role of most citizens in this institutional arrangement is sharply limited: to vote at regular intervals for the elites they prefer. If we ask whether a particular citizen ought to invest in a long and expensive course of education for the sole reason that this will make her votes better informed, more comprehending, and more respectful and empathetic, it seems clear that the answer will be “no.”

It may be true that it behooves even a Schumpeterian polity to promote thoughtful and competent voting as part of the solution to a collective problem that makes political ignorance individually rational. Yet it seems that the problem at hand is deeper and more systematic than a mere decision-theoretic paradox. There are forms of democratic self-rule that really would provide a proper forum for most citizens to exercise a rich historical and cultural understanding and a well-developed mutual concern. But these forms seem to be workable only on a very small scale. The Madisonian corrective to the dangers of popular rule is to make the so-called republic vast, with a citizenry too numerous, dispersed, and varied for coherent deliberation or coordinated action. If this corrective works at all, it works by leaving most citizens with no effective means of civic engagement. The successful grassroots activism seen so recently in the micro-polis of the University of Virginia is hardly imaginable on a national scale. Contemporary polities could, of course, be radically downsized to permit a more active and engaged form of citizenship. Perhaps they should be. But unless and until this restructuring is actually carried out, to cultivate a capacity and taste for active political engagement is to prepare the citizenry for unactionable discontent.

A second problem with Nussbaum’s argument is that there is at most a very tenuous connection between immersing oneself in the humanities and becoming a good citizen. If one wants to encourage youths to engage in politics in a respectful and mutually comprehending spirit, it might help to have them read John Locke’s Second Treatise of Civil Government and study The Federalist, but is it really necessary, or even greatly helpful, to have them read Madame Bovary, To the Lighthouse, The Brothers Karamazov, or Remembrance of Things Past? Would it be greatly helpful to study aboriginal sculpture, Renaissance painting, or Baroque music? To read Kant’s First Critique, Aristotle’s Metaphysics, or Kierkegaard’s Either/Or? I doubt it. Even if one were to accept the premise that post-secondary education should be sculpted so as to optimally promote the virtues of engaged and compassionate citizenship, these works would hardly suggest themselves as the surest way to attain this effect.

Here’s what I think is driving Nussbaum’s argument. The fixed point is not the premise that we must provide whatever education will most effectively conduce to the vitality of our democracy. The fixed point is the eventual conclusion—that the humanities are valuable and must be defended against impending threats. The obvious line of defense is to articulate the value of the humanities as one experiences them. After all, it is precisely one’s lively sense of this value that inclines one to defend the humanities in the first place. But this direct argument is deemed inadmissible on liberal neutralitarian grounds—that is, because it turns on claims about the human good about which citizens can reasonably disagree. The main opportunities for exercising the sort of understanding inspired by close reading of Marcel Proust and James Joyce seem to lie not in the political forum but at the café, or over the dinner table, or perhaps in the bedroom (where it greatly multiplies the menu of available pathologies!). Despite Nussbaum’s notable gifts as a thinker and writer, and her obvious love of literature and philosophy, her argument wobbles under scrutiny.

I do not believe that liberalism calls for strict neutralitarian limits on the state’s concern with the acculturation of successive generations of citizens. Indeed, it is precisely this stance of principled abstention that has effectively ceded the task of shaping future citizens to the highest bidders—namely, corporations in their capacity as advertisers—and that has therefore helped to make the background culture so hostile to the humanities. Against the backdrop of today’s communications technology, and especially in the context of what is less a democracy than a corporatocracy, neutralitarianism has the non-neutral effect of promoting a consumerist and careerist monoculture—something that the liberal lover of plural experiments in living ought to denounce unequivocally. So I think that Nussbaum’s argument turns on a picture of the proprieties of public debate whose real-world effects are illiberal, baleful, and especially bad for the humanities. The neutralitarian liberal is permitted to take an official interest in the character and the evaluative outlook of the citizenry only to the extent that this is necessary for fundamentally important political purposes such as the stabilization of liberal rights and the proper functioning of democratic institutions. Yet the real value of the humanities cannot shine through under these constraints.

None of this means that Nussbaum’s argument won’t work, in the practical and political sense of “work.” No competent politician would confess to a Schumpeterian vision of politics. Telling the people to their face that they are not in charge would be a serious misstep in the Schumpeterian contest for the popular vote. Actual democracies can be relied on to pretend that they are governments “of the people, by the people, for the people,” even when the people have relatively little to say. Given the political power of this pretense, and given public uncertainty about just what it means to be a good citizen, Nussbaum’s strategy for defending the humanities is perhaps as wise as any. In any case, I have no more promising strategy to offer. I’ve been up front about my intention to advance an impolitic strategy for defending the humanities. I turn to it now.

If I ask myself why I recoil from the arguments canvassed above, it’s because they so thoroughly miss the appeal of the form of thought and life that I seek to share with my students. For me, its appeal has nothing to do with preparing my students for any preordained social role, whether in the market or the forum. I feel I have something especially valuable to offer those who recoil from whatever satisfactions may be available in the cubicle of an accounting firm or in politics as currently practiced. That these satisfactions are not enough to enable one to live a fully human life, to participate fully in the most meaningful currents of the human project on earth, that these are at best means and at worst impediments—this much seems to me to be an essential starting point for recognizing the value of the humanities. To tie the worth of what I do to any putative role in instilling “marketable” skills in the young people I teach would be to betray the impulse that drew me down the path to this profession, and to turn my back on the special bonds of friendship I think I’ve established with those few students I believe I’ve really “reached.”

I am not alone in refusing to locate the value of the humanities in the useful preparation they provide for excelling in future roles or professions. Stanley Fish, for instance, insists that the humanities are their own reward, that they can be justified only in terms of the special pleasure they afford their initiates, and that humanities professors should be pleased to admit their uselessness, since this is tantamount to insisting on the autonomous and intrinsic value of their pursuits.16 This sort of stance is obvious fodder for the reformist agenda of the likes of the evidently unchastened Helen Dragas, who in 2013 scolded President Sullivan and her faculty supporters for failing to appreciate that the University of Virginia is “not an academic playground.”17

So Fish’s view certainly merits being labeled impolitic. There is a good deal right about it as well. The humanities really can be pleasurable, they really are intrinsically rewarding, and it would be a serious mistake to turn to them solely because of their usefulness. Yet in my view, Fish takes insufficient care to distinguish between the truth that the humanities are not to be used and the falsehood—I want to say, the slander—that they are useless. To say they are useless is to say they bring nothing of value to our lives beyond the transient pleasure of engaging in them. But this is surely wrong. The humanities are, more accurately, a gateway to and instigator of a lifelong activity of free self-cultivation. The changes they provoke in us are not always for the happier, or the more remunerative, or the more civically engaged, but when things go passably well, these changes are for the deeper, the more reflective, and the more thoughtful. The humanities connect our lives with a human vocation that is different in kind from, and potentially more meaningful than, commerce or politics (though in the end the lines between these spheres can break down, and commercial and political activities can themselves be infused with, and made more meaningful by, the extra measure of understanding we might hope to cultivate by our engagement with history, literature, the visual arts, philosophy, and the like).

A Polis Apart

I want to flesh out my argument by focusing on the case of philosophy, and by looking in particular at the moment in which Western philosophy comes to self-consciousness as an alternative to the Athenian practice of providing youths with training in rhetoric. The opposition between these two visions of pedagogy is dramatized in Plato’s Gorgias, which portrays a dialogue between Plato’s teacher, Socrates, and a group of teachers of rhetoric. Like any fundamental inquiry into pedagogy, Socrates’s exploration turns on a conception of the good human life and the good human being. Socrates grounds this conception in a distinctive and valuable capacity that distinguishes us from the other animals. The lives of other animals are fixed to a very great degree by instinct. We humans have a capacity that the Greeks called logos—a capacity for speech and thought; a capacity to take hold of words and exchange them with our fellow human beings in an attempt not merely to understand our world but also to give a distinctive, non-instinctual shape to ourselves and our communities.

The rhetorician (at least as portrayed by Plato) thinks that logos is best used as a tool for persuading people to believe whatever one happens to want them to believe. Its use, then, is guided not by truth but by the exigencies of power. Since one can’t completely ignore truth and still be persuasive, the application of logos is guided in many cases by the ring of truth, by plausibility. The sort of philosophical thought practiced by Socrates is fundamentally different. Socrates’s first rule of dialogue was Say exactly what you think. Not what you think will impress your fellow students, or fellow citizens, or what you think your teacher or your employer or anyone else wants to hear. Speak your mind. Make your thoughts answerable to the phenomena under discussion, and find words that faithfully capture your vision of them. When you speak your mind in sustained and careful conversation about important topics, while refusing the impulse to find rationalizations when serious objections are raised, you put yourself into play. You draw yourself out, make your stance in the world more self-conscious, more determinate, more refined. For Socrates, this is what the defining human capacity of logos is for, this is its telos: to develop ourselves in freedom from want, articulating and refining our views—both individual and communal—about the world we live in and about what sort of life we are to pursue in it. To achieve this, he thought we would need sincere, persistent, thoughtful, and compassionate conversational partners. And we would need scholé—which is to say, free time. Lots of time. Not, say, a four-year stay at college, but a lifetime.

In the concluding section of another of Plato’s dialogues, the Phaedrus, Socrates argues that the written word cannot itself capture and deliver what needs to be understood; at best, it can incite readers to turn toward the phenomena themselves and secure understanding through a more immediate apprehension of them. Further, the written word is not ideally suited to play even this indirect role, since its author is not there to respond to successive attempts, on the part of the reader, to attain a firsthand discernment of the phenomena that inspire it. What Platonic philosophy hopes to deliver to students is no more amenable to summary statement in a treatise or textbook than what Freudian psychotherapy hopes to deliver to patients. The quest for understanding is irreducibly idiosyncratic, because the sources of blindness and delusion are irreducibly idiosyncratic. If the reader cannot speak to the author, the possibility of useful communication is greatly reduced. If this is right, then the spoken word taken in itself—delivered, say, in the form of a lecture rather than in the course of a conversation—is no better a vehicle for philosophical enlightenment than the written word.

Philosophy, in short, lives in conversation. The student must be called on to speak, and to do so sincerely rather than strategically—e.g., with an eye to a grade. This is what puts the student himself or herself into play. If this does not happen, then philosophy does not happen. Thus, philosophy does not happen in the passive uptake of lectures—whether they are delivered in a large lecture hall or in a Massive Open Online Course (or MOOC, as such courses are quickly coming to be known). If university-style philosophy is in danger of being replaced by the “disruptive innovation” of online education, perhaps this is because university philosophy classes have assumed a deficient form, one suited to fields whose findings can be mastered in passive uptake.

The task of the philosophy professor is to enact philosophical thinking in conversation with students. Since one can never know what students might say, there is no sure recipe for making things come off well, and no determining in advance where exactly the conversation will lead. When this activity is remade so as to efficiently produce some pre-envisioned outcome whose attainment can be certified by those who are not party to the conversation, it is not thereby improved; it is annulled. This is why learning outcome assessments are so desperately out of place in the humanities classroom. They obviate the improvisatory conversational exploration the humanities require if they are not to be replaced by something else.18

The ideal teacher of philosophy is not someone whose opinions are to be accepted, but someone whose form of thought is worth emulating. The Socrates we know is a dramatic persona Plato puts forward as worthy of emulation. I believe that such emulation consists in serious-minded lifelong engagement (engagement “unto death,” as Plato wishes to make clear in the Phaedo and Crito) in the activity of self-articulation, which is to say, the activity of bringing oneself into a more determinate form by bringing oneself into words. Here “articulation” is meant in both of its common senses: We give a more fine-toothed and determinate shape to our views about important matters (i.e., give them greater articulation) by bringing them into the space of words (i.e., by articulating them).19 This activity requires fidelity to our actual outlook, but it also alters that outlook by finding words for it that we are prepared to live by, hence setting the stage for another, more adequate articulation. If this is philosophy, then philosophy is continuous with the sort of self-formative activity all human beings go through again and again in the course of their lives, provided they live with even a modicum of deliberateness.

But this is as it should be. Philosophy is not a recherché topic through which certain bookish human beings cultivate an optional or adventitious expertise. It is the intensification and refinement of an inescapable human task—the task of “being-underway towards what is to be uncovered” (to borrow Martin Heidegger’s phrase for the Greek notion of the fundamental posture of the human being).20 Our engagement in this task is intensified when we hear the appeal of words emerging from the far horizon of our understanding—words that seem like the thing to say, though we have not yet entirely appreciated their meaning. Such words call us to an effort to understand them, to uncover what we find appealing about them. We relate to such words as the instruments of our own becoming.

The intensification of this form of living needn’t end at graduation, and it needn’t be restricted to book groups or evenings at the theater. It is a daily possibility that can infuse the daily activities, including the daily work, of almost any way of life. In a sense, then, the humanities can be said to be useful for any career whatsoever, but the use lies not in increasing one’s capacity to secure the characteristic ends of that career but in deepening one’s experience of its characteristic activities.

It may appear that this essay has drifted from a wide discussion of the humanities to a narrower discussion of Platonic philosophy. Have I changed the subject, offering an impolitic defense only of the latter and leaving the rest of the humanities to find some other way of shooting themselves in the foot? I don’t think so. It seems to me that the self-formative and culture-formative mode of thought whose value I’ve defended is present in nearly all fields of the humanities. It should be kept in mind, though, that the category of the humanities has haphazard boundaries. It is not clear to me, for instance, why logic is in and mathematics out, or why history is in and cultural anthropology out. Perhaps there are fields within the humanities that do not encourage the self-cultivating and culture-shaping use of thought to which I’ve called attention, and fields beyond the humanities that do. If so, this might be a ground for revisiting the traditional understanding of the demarcation, but that would be an argument for another day. What matters for now is that I believe myself to have rested my case for a form of thought that is found not only in philosophy but also in classics, in history, and in the circle of creative expression and critical response that characterizes the fine arts and the academic study of drama, poetry, literature, and the visual arts.

It’s said that college is not the real world, and in a sense I’m happy to affirm that. But I don’t see it as mere preparation for the things of real substance and value—that’s not the mode of its remove from reality. I see it instead as a kind of polis apart, with a few permanent members and an ever-changing citizenry of youths. What happens in this polis, when it’s in good working order, is a kind of intensification of a form of reflective self-cultivation that can and ought to be a continuous life activity. It is the stuff of a good life, not some mere instrumental means. It can be intertwined with, and can deepen, almost any subsequent life activity (including many forms of work and political engagement). This parallel polis provides an important counterweight to the culture-shaping effects that arise from the melding of corporate capitalism and contemporary communications technology. Because the academy encourages an open-ended form of self-cultivation, and because it provides an important counterweight to an outlook on value that threatens to render us a monoculture, it can be defended in the name of liberal pluralism, and the liberal should not adopt standards of public argument that prevent us from bringing the value of the academy into view. It would be a devastating loss if we remade this parallel polis in accordance with the guiding values of the corporation. This is not at all to say that we have no need to remake this parallel polis. But we ought to remake it in the image of its best self.

Endnotes

My colleague Mitch Green pointed this out in a memorable speech to students and faculty from the steps of the Rotunda. He has since left for the University of Connecticut, which made a concerted effort to poach University of Virginia faculty during the leadership crisis.

I owe this idea of self-articulation to Charles Taylor, who sets it out in “Responsibility for Self,” in The Identities of Persons, ed. Amelie Rorty (Berkeley and Los Angeles: University of California Press, 1976), 281–300.