
Want to up your citation stats? Try changing your name – but make sure it starts with an “A,” “B,” or “C.” That’s what a new paper in Economic Inquiry suggests (an abstract is available here). The study, by Wei Huang, a Ph.D. candidate in economics at Harvard University, says that researchers whose last names begin with A, B, or C who are listed first as authors in articles in a variety of science journals receive, on average, one to two more citations than their peers whose names start with X, Y, or Z.

The effect is most evident when reference lists are long. The effect is not evident in self-citations. Researchers whose names begin with the letters D-W fall somewhere in the middle in numbers of citations. Huang calls the effect modest but “salient” and attributes it in part to the fact that authors are listed alphabetically in many reference lists. Huang says the findings raise questions about the validity of citation indexes, in that quantity may not be as reliable an indicator of quality as many believe it is.

Many a thick academic tome turns out to be a journal article wearing a fat suit. So all due credit to Anna M. Young, whose Prophets, Gurus, and Pundits: Rhetorical Styles and Public Engagement was published by Southern Illinois University Press this year. Her premise is sound; her line of argument looks promising; and she gets right to work without the rigmarole associated with what someone once described as the scholarly, “Yes, I read that one too” tic.

Indeed, several quite good papers could be written exploring the implicit or underdeveloped aspects of her approach to the role and the rhetoric of the public intellectual. Young is an associate professor of communication at Pacific Lutheran University, in Tacoma, Washington. Much of the book is extremely contemporary in emphasis (to a fault, really, just to get my complaint about it out front here). But the issue it explores goes back at least to ancient Rome -- quite a while before C. Wright Mills got around to coining the expression “public intellectual” in 1958, in any case.

The matter in question emerges in Cicero’s dialogue De Oratore, where Young finds discussed a basic problem in public life, then and now. Cicero, or his stand-in character anyway, states that for someone who wants to contribute to the public discussion of important matters, “knowledge of a vast number of things is necessary, without which volubility of words is empty and ridiculous.”

On the other hand -- as Cicero has a different character point out -- mere possession of learning, however deep and wide, is no guarantee of being able to communicate that learning to others. (The point will not be lost on those of you surreptitiously reading this column on your mobile phones at a conference.)

Nobody “can be eloquent on a subject that he does not understand,” says Cicero. Yet even “if he understands a subject ever so well, but is ignorant of how to form and polish his speech, he cannot express himself eloquently about what he does understand.”

And so what is required is the supplementary form of knowledge called rhetoric. The field had its detractors well before Cicero came along. But rhetoric as defined by Aristotle referred not to elegant and flowery bullshit but rather to the art of making cogent and persuasive arguments.

Rhetoric taught how to convey information, ideas, and attitudes by selecting the right words, in the right order, to deliver in a manner appropriate to a particular audience -- thereby convincing it of an argument, generally as a step toward moving it to take a given action or come to a certain judgment or decision. The ancient treatises contain not a little of what would later count as psychology and sociology, and modern rhetorical theory extends its interdisciplinary mandate beyond the study of speech, into all other forms of media. But in its applied form, rhetoric continues to be a skill of skills – the art of using and coordinating a number of registers of communication at the same time: determining the vocabulary, gestures, tone and volume of voice, and so on best-suited to message and audience.

When the expression “public intellectual” was revived by Russell Jacoby in the late 1980s, it served in large part to express unhappiness with the rhetorical obtuseness of academics, particularly in the humanities and social sciences. The frustration was not usually expressed quite that way. It instead took the form of a complaint that intellectuals were selling their birthright as engaged social and cultural critics in exchange for the mess of pottage known as tenure. It left them stuck in niches of hyperspecialized expertise. There they cultivated insular concerns and leaden prose styles, as well as inexplicable delusions of political relevance.

The public intellectual was a negation of all of this. He or she was a free-range generalist who wrote accessibly, and could sometimes be heard on National Public Radio. In select cases the public intellectual was known to Charlie Rose by first name.

I use the past tense here but would prefer to give the term a subscript: The public intellectual model ca. 1990 was understood to operate largely or even entirely outside academe, but that changed over the following decade, as the most prominent examples of the public intellectual tended to be full-time professors, such as Cornel West and Martha Nussbaum, or at least to teach occasionally, like Judge Richard Posner, a senior lecturer in law at the University of Chicago.

And while the category continues to be defined to some degree by contrast with certain tried-and-true caricatures of academic sensibility, the 2014 model of the public intellectual can hardly be said to have resisted the blandishments of academe. The danger of succumbing to the desire for tenure is hardly the issue it once might have seemed.

Professor Young’s guiding insight is that public intellectuals might well reward study through rhetorical analysis -- with particular emphasis on aspects that would tend to be missed otherwise. They come together under the heading “style.” She does not mean the diction and syntax of their sentences, whether written or spoken, but rather style of demeanor, comportment, and personality (or what’s publicly visible of it).

Style in Young’s account includes what might be called discursive tact. Among other things, it includes the gift of knowing how and when to stop talking, of listening to another person’s questions attentively enough to clarify them, and even of answering them. The author also discusses the “physiological style” of various public intellectuals – an unfortunate coinage (my first guess was that it had something to do with metabolism) that refers mostly to how they dress.

A public intellectual, then, has mastered the elements of style that the “traditional intellectual” (meaning, for the most part, the professorial sort) typically does not. The public perceives the academic “to be a failure of rhetorical style in reaching the public. He is dressed inappropriately. She carries herself strangely. He describes ideas in ways we cannot understand. She holds the floor too long and seems to find herself very self-important.” (That last sentence is problematic in that a besetting vice of the self-important is that they do not find themselves self-important; if they did, they’d probably dial it down a bit.)

Now, generations of satirical novels about university life have made clear that the very things Young regards as lapses of style are, in fact, perfectly sensible and effective rhetorical moves on their own terms. (The professor who wears the same argyle sweater year-round has at least persuaded you that he would rather think about the possible influence of the Scottish Enlightenment on The Federalist Papers than about the sweater’s admittedly large holes.)

But she longs for a more inclusive and democratic mode of engagement of scholarship with the public – and of the public with ideas and information it needs. To that end, Young identifies a number of public-intellectual character types that seem to her exemplary and effective. “At different times,” she writes, “and in different cultural milieus, different rhetorical styles emerge as particularly relevant, powerful, and persuasive.” And by Young’s count, six of them prevail in America at present: Prophet, Guru, Sustainer, Pundit, Narrator, and Scientist.

“The Prophet is called by a higher power at a time of crisis to judge sinners in the community and outline a path of redemption. The Guru is the teacher who gains a following of disciples and leads them to enlightenment. The Sustainer innovates products and processes that sustain natural, social, and political environments. The Pundit is a subject expert who discusses the issues of the day in a more superficial way via the mass media. The Narrator weaves experiences with context, creating relationships between event and communities and offering a form of evidence that flies below the radar in order to provide access to information.” Finally, the Scientist “rhetorically constructs his or her project as one that answers questions that have plagued humankind since the beginnings….”

The list is presumably not meant to be exhaustive, but Young finds examples of people working successfully in each mode. Next week we'll take a look at what the schema implies -- and at the grounds for thinking of each style as successful.

Few people appear happy with the state of shared governance at American colleges and universities. Faculty members complain that they are being disempowered by administrators and trustees who are creating an increasingly "corporatized" academic environment and who are more concerned with budgets than with quality. Administrators lament the extent to which faculties seem oblivious to the fiscal realities threatening the status quo and to the need for significant or even radical change. Trustees struggle to find the appropriate balance between too much and too little involvement in the activities of both faculty members and administrators. And legislators seem baffled by the whole system -- though in my experience bafflement is actually one of the less dangerous states in which legislators might find themselves. It is when they think they understand things that I get worried.

I am inclined to believe that many of these concerns are overblown. Bad actors and bad decisions are unavoidable in any large and diverse system, but these still seem to me the exception and not the rule. Most faculty members and most administrators appear to me to want what they have always wanted: to create vibrant and supportive environments within which the work of teaching and learning can be carried out at a high level. Achieving this goal has become increasingly difficult as the economic model of higher education has come under more intense stress, and this has cast the long-present messiness of the shared governance model into sharper relief. It has never been easy to maintain equilibrium within such a complex system of institutional decision-making, but today the stakes seem higher and the cost of missteps or inaction much greater.

The interesting question is not whether the shared governance model is irrevocably broken, but whether it can be improved. I believe that it can and that improvement must begin with a better understanding of where the fault lines in the current system lie.

Imagine that two people are charged with the completion of two tasks. They can choose to "share" this responsibility in a couple of different ways: each can be assigned to the completion of one task, or both can work on both tasks together. Depending on the nature of the tasks — and the people — one or another of these approaches may be the more effective.

Shared governance at most colleges has evolved into a model that more closely resembles the first than the second of these approaches. It is common to find the faculty charged with the design, oversight, and teaching of the curriculum, with some minimal level of input from administrators. Virtually all other matters — co-curricular programming, student life, and, above all else, decisions about the spending of institutional dollars — are chiefly the purview of administrators, with some minimal level of input from faculty. We have, that is, a system of sharing through division more than a system of sharing through deep collaboration.

There is no denying that in some respects this division makes a good deal of sense: faculty members are the ones best-situated by training and position to make decisions about academic matters, and administrators in areas like student life and finance have both training and relevant experience that most faculty members lack. But there is also no denying that the absence of extensive collaboration between those charged with designing a complex and expensive service and those charged with creating a sustainable economic model for that service, especially when there are serious questions about its sustainability, is far from optimal. At the risk of appearing to trivialize our important enterprise — and we do take ourselves pretty seriously — I would liken many colleges to restaurants at which the chefs decide upon the menu items to be offered, the managers work on the business model, and neither group does much consulting with the other. Such an establishment is not likely to survive for very long.

While this particular system of what I would call divided governance has long been in place on college campuses, there is reason to believe that the division has in recent years gotten sharper and more absolute. Faculty members at selective institutions, both large and small, have over time been expected to become more fully and continuously engaged in scholarly activities and therefore have become less likely to carve out time to learn the ins and outs of college governance. Disciplinary training has become increasingly specialized, leaving faculty members less able or less inclined to think in institutional rather than departmental terms. Many more college leaders, especially at the presidential level, are now career administrators or are even being drawn from outside academe, and while this may or may not prepare them to be highly effective presidents, it certainly leaves the faculty concerned, with good reason, about their preparedness to speak to matters of the curriculum.

There is more. Academic administration has itself become both more professionalized and more specialized, as evidenced by the proliferation of graduate programs in the field and of "how to" seminars, conferences, and books for current or aspiring deans and presidents. Though some faculty members do persist in perceiving administrators as failed professors or, in the words of Professor Rob Jenkins, as "managers who just might be more concerned with the bottom line than with educational quality," the simple truth is that running a college has become an increasingly complex job that, like most such jobs, requires preparation, experience, and ongoing study, and that it is hard to do well in one's spare time.

As a faculty member I spent my time studying Dickens and honing my skills in the classroom. As a college president I spend my time learning about everything from admissions yield models to bond ratings to Title IX requirements and honing my skills in leadership. I have found both kinds of work to be demanding and rewarding, but would be incapable of doing the two simultaneously, at least at the level of excellence to which one should aspire.

In short, the days of faculty participating as a matter of course in admissions decisions or of presidents being drawn regularly from the ranks of the faculty at their own institutions are over.

Shared governance is not going away, nor is it clear, given the nature of college communities, that there is a preferable alternative. There are, however, a number of steps that can be taken to optimize the beneficial and minimize the deleterious effects of the system. The most important of these might be an attitudinal shift, on the part of both faculty members and administrators, away from a Manichean view of the academic world and toward a view more nuanced and accurate. So long as faculty members see administrators chiefly as "managers ... more concerned with the bottom line than with educational quality" — or worse — and so long as administrators see most faculty members as utterly indifferent to something as important as "the bottom line," the sharing of governance between the two groups will be fraught with conflict and distrust. I say this knowing that such changes in attitude are much easier to describe than to bring about but also out of a deep conviction that no change would do more to improve the quality of decision-making. I also believe that in this conversation, as in so many others, the loudest and most influential voices come from the extremes and that the majority of faculty members and administrators are not so far apart in priorities as the academic press would suggest.

My other recommendations are more concrete.

1. Rely more heavily for important decisions on representative rather than direct democracy.

All-faculty meetings are simply the wrong place to make decisions that have a serious strategic or financial impact on an institution. There is neither the time nor the base of information nor, at most colleges, the appropriate atmosphere necessary for careful and informed deliberation. Better outcomes are likely to come from elected faculty committees whose members have the time and willingness to study complex issues. These committees should be more fully empowered to make decisions and not just to offer recommendations to the full faculty. At most colleges, the one piece of deeply consequential business that is carried out wholly by an elected committee is the tenure and promotion process. Not surprisingly, it is also the piece of business that typically gets done most carefully and effectively.

2. Be sure that there is at least one body on campus whose members include both administrative leaders and elected faculty representatives and whose charge is to consider, in confidence, matters of strategic importance that cut across all areas of operations.

The remedy for the current weaknesses in shared governance lies not simply in taking authority away from the "full faculty." It also lies in providing more information to, and consulting more extensively with, the faculty in the form of their elected representatives — not only about curricular matters, but about all matters that affect their ability to carry out their work. This will not happen consistently unless it is built into the regular governance structure of the institution.

3. Include an elected faculty representative on the president's senior staff.

It is time that we stopped pretending that the faculty view the provost or the academic dean as "one of them" and therefore as their voice at the table during discussions among administrative leaders. The moment the provost becomes provost, she or he is viewed chiefly as an administrator, even if that individual is broadly respected and, indeed, even if that individual was drawn directly from the faculty ranks (a move that is becoming less common). The substantive and symbolic benefits of having an elected faculty voice in the room far outweigh the risks and drawbacks. This would not be a full-time position but rather the equivalent of chairing an important committee, since it is essential that this representative continue to be seen chiefly as a member of the faculty.

4. Provide as many opportunities as possible for faculty members who are interested in college governance to learn about all aspects of the college.

It has been my experience that those faculty members who are the finest teachers and most active scholars are only infrequently interested in administrative careers but are often interested in leadership more broadly understood. Such faculty members can best contribute to shared governance if they are as informed as possible about the operations, challenges, and strategic priorities of the institution. Administrators should be prepared to share with interested faculty members, honestly and fully, all pieces of information other than those that are, for one reason or another, necessarily confidential. They should offer seminars on areas such as financial operations, admissions, and fund-raising. Transparency and training do not eliminate disagreement, but over time they establish trust.

The most difficult of these changes to make is clearly the first. Almost never does a group with power relinquish it voluntarily, yet that is precisely what I am calling for in this instance: that is, for the full faculty to vote to relinquish some of its decision-making authority (any form of forced disempowerment would in my view be disastrous). The only chance for such a change to be approved would be, at the same time, to empower the faculty with more say, through elected representatives, in the decisions about which they presently have almost no say at all. Such a step might begin to move us at least minimally away from divided governance and toward a system in which tasks of great importance are more genuinely and regularly shared.

A new study on the emotional lives of adjunct professors that appears to be the first of its kind says contingent faculty members are at risk for stress, depression, and anxiety due to their working conditions. The paper, written by Gretchen M. Reevy, a lecturer in psychology at California State University at East Bay, and Grace Deason, an assistant professor of psychology at the University of Wisconsin at La Crosse, was published in Frontiers in Psychology and is available in full here.

Ask anyone professing the humanities today and you come to understand that a medieval dimness looms. If this is the end-times for the ice sheets at our poles — and it is — many of us also understand that the melt can be found closer to home, in the elimination of language and classics departments, for instance, and in the philistinism represented by governors such as Rick Scott of Florida and Patrick McCrory of North Carolina, who apparently see in the humanities a waste of time and taxpayer subsidies. In the name of efficiency and job creation, according to their logic, taxpayers can no longer afford to support bleary-eyed poets, Latin history radicals, and brie-nibbling Francophiles.

That there is a general and widespread acceptance in the United States that what is good for corporate America is good for the country is perhaps inarguable, and this is why men like Governors Scott and McCrory are dangerous. They merely invoke a longstanding and not-so-ugly stereotype: the pointy-headed humanist whose work, if you can call it that, is irrelevant. Among the many easy targets, English departments and their ilk are convenient and mostly defenseless. Few will rise to rush the barricades with us, least of all the hard-headed realists who understand the difficulties of running a business, which is what the university is, anyway.

I wish, therefore, to propose a solution that will save money, save the humanities, and perhaps make the world a better place: Close the business schools.

The Market Argument

We are told that something called “the market” is responsible for the great disparities in pay between humanities professors and business professors. To a humanist, however, this market is the great mystifier; we find no evidence of an “invisible hand” that magically allocates resources within the university. The market argument for pay differentials between business professors and historians (average pay in 2014 for full professors at all institutions: $123,233 and $86,636, respectively, a difference of almost 30 percent; at research institutions, $160,705 and $102,981, a difference of 36 percent), for instance, fails to convince that a market is operating. This is because administrators and trustees who set salaries based upon what the market can bear, or what it calls for, or what it demands, are actually subsidizing those of us who are manifestly out of the market.
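For readers who want to check the percentages quoted above, the gap is expressed as a share of the higher (business) salary. A minimal sketch of the arithmetic, using the 2014 figures as given in the column:

```python
# Check of the pay-differential arithmetic quoted above (2014 average pay
# for full professors; figures as reported in the column).
def pct_gap(higher: float, lower: float) -> float:
    """Percentage by which `lower` falls short of `higher`."""
    return (higher - lower) / higher * 100

all_institutions = pct_gap(123_233, 86_636)   # historians vs. business professors
research = pct_gap(160_705, 102_981)          # same comparison at research institutions

print(f"all institutions: {all_institutions:.1f}%")  # ~29.7 -> "almost 30 percent"
print(f"research:         {research:.1f}%")          # ~35.9 -> "36 percent"
```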

Your average finance professor, for instance, is not a part of this market; indeed, she is a member of the artificial market created by colleges and universities themselves, the same institutions that tout the importance of critical thinking and of creating the well-rounded individual whose liberal arts study will ostensibly make her into a productive member of our democracy. But the administrators who buy the argument that the market allocates upward of 20, 30, or 40 percent more for the business professor than it does for her colleague in the humanities have failed to be the example they tout: they are not thinking.

The higher education market for business professors and legal scholars, for instance, is one in which the professor is paid as if she took her services and sold them on what is commonly called the market. Which is where she, and her talents, manifestly are not. She is here, in the building next to ours, teaching our students and doing the same work we are. If my daughter cuts our lawn, she does not get paid as if she were cutting the neighbor’s lawn.

The business professor has sacrificed the blandishments of the other market for that of the university, where she can work softer hours, have her December/January vacation, go to London during the summer on a fellowship or university grant, and generally live something approaching the good life — which is what being employed by a college or university allows the lucky who earn tenure. She avoids the other market — eschews the long hours in the office, the demands of travel, the oppressive corporate state — so that she can pick up her kids from school on occasion, sleep in on a Saturday, and turn off her smartphone. She may be part of a machine, but it is a university machine, and as machines go she could do worse. This “market” is better than the other one.

But does she bring more value to the university? Does she generate more student hours? These are questions that administrators and business professors do not ask. Why? Because they wouldn’t like the answers. They would find that she is an expensive acquisition. Unless she is one of the Wharton superstars and appears on CNN Money and is quoted in The Wall Street Journal, there’s a good chance that the university isn’t getting its money’s worth.

The Moral Argument

There is another argument for wishing our business professor adieu. She is ostensibly training the next crop of financiers and M.B.A.s whose machinations have arguably had no salutary effects on this democracy. I understand that I am casting a wide net here, grouping the good with the bad, blaming the recent implosion of the world economy on business schools. One could, perhaps, lay equal blame on the mathematicians and quantitative analysts who created the derivative algorithms and mortgage packages that even the M.B.A.s themselves don’t understand, though there’s a good chance that business school graduates hired these alpha number crunchers.

Our investment bankers and their ilk will have to take the fall because, well, they should have known better. If only because, at bottom, they are responsible — with their easy cash and credit, their drive-through mortgages, and, worst of all, their betting against the very system they knew was hopelessly constructed. And they were trained at our universities, many of them, probably at our best universities, the Harvards and Princetons and Dartmouths, where — it is increasingly apparent — the brightest students go to learn how to destroy the world.

I am not arguing that students shouldn’t take classes in accounting, marketing, and economics. An understanding of these subjects holds value. They are honorable subjects often horribly applied. In the wrong hands they become tools less of enlightenment and liberation than ruthless self-interest. And when you have groups of like-minded economic pirates banding together in the name of self-interest, they form a corporation, that is, a person. That person, it is now apparent, cannot be relied upon to do the right thing; that person cannot be held accountable.

It’s not as if this is news. Over 150 years ago, Charles Dickens saw this problem, and he wrote A Christmas Carol to address it. The hero of Dickens’s novella is Jacob Marley, who returns from the grave to warn his tightfisted partner Ebenezer Scrooge that he might want to change his ways. When Scrooge tells Marley that he was always a “good man of business,” Marley brings down the thunder: “Mankind was my business. The common welfare was my business; charity, mercy, forbearance, and benevolence, were, all, my business. The dealings of my trade were but a drop of water in the comprehensive ocean of my business!”

In closing the business schools, may the former professors of finance bring to the market a more human side (or, apropos of Dickens, a more ghostly side). Whether or not they do, though, closing the business schools is a necessary first step in righting the social and economic injustices perpetuated not by capitalism but by those who have used it to rend the very social fabric that nourishes them. By planting the seeds of corporate and financial tyranny, our business schools, operating as so many of them do in collusion with a too-big-to-fail mentality, have become the enemy of democracy. They must be closed, since, as Jacob Marley reminds us, we all live in the business world.

Save the Humanities

Closing the business schools will allow us to turn our attention more fully to the state of the humanities and their apparent demise. The 2013 report released by the American Academy of Arts and Sciences asserts that “the humanities and social sciences are not merely elective, nor are they elite or elitist. They go beyond the immediate and instrumental to help us understand the past and the future.” As if that’s going to sell.

In the wake of the academy’s report, The New York Times dutifully ran three columns on the humanities — by David Brooks, Verlyn Klinkenborg, and Stanley Fish — which dove into the wreck and surveyed the damage in fairly predictable ways (excepting Fish, whose unpredictability is predictable). Brooks remembers when they used to teach Seneca and Catullus, and Klinkenborg looks back on the good old days when everyone treasured literature and literary study. Those days are gone, he argues, because “the humanities often do a bad job of teaching the humanities,” and because “writing well used to be a fundamental principle of the humanities,” though it apparently is not anymore. Why writing well isn’t a fundamental principle of life is perhaps a better question.

We might therefore ask: Aside from the typical obeisance to something called “critical thinking,” what are the humanities supposed to do?

I propose that one of the beauties of the liberal arts degree is that it is meant to do nothing. I would like to think, therefore, that the typical humanities major reads because she is interested in knowledge for purposes outside of the pervasive instrumentalism now fouling higher education. She does not read philosophy because she wants, necessarily, to become a philosopher; she does not read poetry to become a poet, though she may dream of it; she does not study art history, usually, to become an art historian, though she may one day take this road.

She may be in the minority, but she studies these subjects because of the pleasure it gives her. Reading literature, or studying philosophy, or viewing art, or watching films — and thinking about them — are pleasurable things. What a delight to subsidize something that gives her immediate and future joy instead of spending capital on a course of study that might someday allow her to make more money so that she can do the things she wants to do at some distant time. Henry David Thoreau said it best: “This spending of the best part of one's life earning money in order to enjoy a questionable liberty during the least valuable part of it reminds me of the Englishman who went to India to make a fortune first, in order that he might return to England and live the life of a poet. He should have gone up garret at once.” If you want to be a poet, be done with it.

Does she suffer for this pleasure?

It is an unfortunate fact of our political and cultural economy that she probably does. Her parents wonder helplessly what she is up to and they threaten to cut off her tuition unless she comes to her senses. The governor and legislature of her state tell her that she is wasting her time and that she is unemployable. She goes to her advisers, who, if they are in the humanities, tell her that the companies her parents revere love to hire our kind, that we know how to think critically and write clearly and solve problems.

And it isn’t that they are lying, exactly (except to themselves). They simply aren’t telling her the whole truth: that she will almost surely never have the kind of financial success that her peers in business or engineering or medicine will have; that she will have enormous regrets barely ameliorated by the thought that she carries the fire; that the digital humanities will not save her, either, though they may help make her life slightly more interesting.

It is with this problem in mind that I argue for a vision of the university as a place where the humanities are more than tolerated, where they are celebrated as intrinsic to something other than vocationalism, as a place in which the ideology that inheres in the industrial model in all things can and ought to be dismantled and its various parts put back together into something resembling a university and not a factory floor.

Instead of making the case that the humanities give students the skills to “succeed in a rapidly changing world,” I want to invoke the wisdom of Walt Whitman, one of the great philosophers of seeming inactivity, who wrote: “I lean and loafe at my ease observing a spear of summer grass.”

What does it mean to loafe? Whitman is reclining and relaxing, but he is also active: he “invites” his soul and “observes” the world around him. This conjunction of observation and contemplation with an invitation to the soul is the key here; using our time, energy, and intellectual faculties to attend to our world is the root of successful living. A world of contemplative loafers is one that can potentially make clear-eyed moral and ethical judgments of the sort that we need, judgments that deny the conflation of economic value with other notions of value.

Whitman would rather hang out with the men who brought in the catch than listen to the disputations of science or catch the fish himself: “You should have been with us that day round the chowder-kettle.” While I am not necessarily advocating a life of sloth, I’m not arguing against it, either. I respect the art of study for its own sake and revere the thinker who does nothing worthwhile, if by worthwhile we mean something like growing the economy. Making a living rather than living is the sign of desperation.

William Major is professor of English at Hillyer College of the University of Hartford. He is author of Grounded Vision: New Agrarianism and the Academy (University of Alabama Press, 2011).