Tuesday, July 29, 2014

For teachers, one of the most enjoyable things to do is spend time being students again.

So it was that I spent the past weekend at Transylvania University’s seminar on Twenty-First Century Liberal Education, along with 18 other academics from a variety of liberal arts institutions.

We all read hundreds of pages of material in preparation. In the span of 65 hours at the seminar, we spent two hours listening to formal lectures (and another hour discussing them), ten hours in formal discussion sessions, and countless more hours informally continuing those exchanges.

Yes, this is what teachers do in the summer for fun. And it was fun—as well as intellectually illuminating and invigorating.

It was also sobering, coming as it did at a time when higher education faces plenty of public scrutiny and criticism, and when the liberal arts and liberal arts colleges in particular face charges of irrelevance.

The value of this kind of intensive consideration of a topic is that it inevitably focuses the mind. Many of the issues we discussed have been bouncing around my brain for a while (sometimes showing up in this blog), but I’ve never considered them as intensely as I did at the seminar.

Since I’m forever preaching to my students that the best way to figure out what they think about a reading or discussion is to write about it, I’ll attempt to do that myself. (All of the pieces I quote below are from the wonderful reader that the Transylvania seminar leaders put together.)

For the historian, the easiest and most obvious conclusion to take from our readings is that there is nothing new about the liberal arts—or higher education in general—being under siege. It rather seems like a permanent state of affairs. That’s no excuse for complacency about its current challenges, to be sure, but it does help leaven one’s reaction to all of the apocalyptic warnings of the demise of liberal arts. This is not new: the liberal arts college has been through this before and survived. As Alan O. Pfnister put it in 1984, “the free-standing liberal arts college in America has been a study in persistence amid change, continuity amid adaptation.”

“Continuity and change” is the essence of history, and the story of the liberal arts has seen plenty of both. The perennial debate seems to revolve mostly around the question of value and utility: What precisely is the value of the liberal arts? How do we determine that value, and how do we present it to prospective students and their parents?

For clarity’s sake, the sides can be simplified: 1) the liberal arts have value that cannot be quantified and assessed in any meaningful way, but they prepare students to lead better, more meaningful lives; and 2) the liberal arts must demonstrate their practical value in concrete, accessible ways that give others outside the academy reason to believe they are worth the time and money expended in studying them.

Since these are simplifications, few people are likely to identify with either without some kind of reservation, but I’d argue that at some point everyone concerned with the topic will end up choosing one as having primacy over the other.

I choose the first. I am not unaware of the pressures being brought to bear to make college education ever more “practical” (read “directly applicable to post-graduation employment”) to justify its high price tag. I simply believe first causes matter and that something essential is lost when we, as another participant in the seminar put it, allow external rather than internal causes to determine what and how we teach.

The second point of view, however, seems to dominate the field these days. Writing in 2007, David C. Paris, professor of government at Hamilton College (and one-time participant in the Transylvania seminar) said: “the liberal arts and the academy in general need to make peace with, or at least acknowledge, the importance of the market.”

I’ll meet Paris halfway: I acknowledge that the market matters. Despite his rather disdainful portrayal of the traditional liberal arts as appearing “esoteric and apart from real concerns” or “ornamental,” and of its defenders as unconcerned with the “real world,” I am not oblivious to reality.

But no, I will not “make peace” with the idea that the market should determine what and how educators in the liberal arts teach. Paris argues that “the liberal arts are threatened,” at least in part, by “too narrow a self-concept” among its practitioners. He writes that “promoting a good life recognizes that there are many ways of living such a life.” The latter is true. But it is not the liberal arts that are “too narrow.” It is the market that defines the good life in the most narrow way possible, i.e., by a single standard: the dollar sign.

Our students do not need the liberal arts to tell them that money matters. The entire culture tells them that relentlessly. They cannot escape it. It is our job as educators to open them to some of the other possible answers to that basic question: “What makes a good life?”

The liberal arts have a long history of addressing that question and advancing our understanding of the good. Liberal education has been a vehicle for addressing questions of inequality and oppression, empowering students to challenge the institutions that buttress those conditions, primarily through encouraging independent thinking. It has been a truly liberating force, and it has not achieved that by asking what the market wants from it.

What message does it send about the answer to that fundamental question of the good when the Association of American Colleges and Universities (AAC&U) resorts to focus groups of students and employers to tell educators what liberal education should be? Or when the AAC&U endorses and privileges certain educational trends as superior (“active” or “high-impact”) to others and justifies its prescriptions by noting that “employers strongly endorsed” them and that they will receive “very strong support from the employer community”?

Whether they realize it or not, they are saying in effect: Let the market decide. They are abdicating their responsibility as educators to shape curriculum. They are buying into not just the language but the values of the market: if it is demanded, it must be supplied.

David L. Kirp writes in Shakespeare, Einstein, and the Bottom Line: The Marketing of Higher Education: “This is more than a matter of semantics and symbols.” When we use “business vocabulary we enforce business-like ways of thinking.” (Thanks to Transylvania’s Jeffrey B. Freyman for this quotation from his paper, “The Neoliberal Turn in Liberal Education.”)

Though the proponents of this point of view often come from the progressive side of the political spectrum, they unwittingly are endorsing a decidedly illiberal view of education. As Christopher Flannery and Rae Wineland Newstad point out in “The Classical Liberal Arts Tradition,” the phrase “liberal arts” literally means the “arts of freedom” as opposed to those practiced by slaves. “Slaves are subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves.” So-called “practical” training was for slaves, and the liberal arts would ruin slaves for their role in society as servants to their superiors.

Liberal education later evolved—particularly in the United States—into not just the privilege of the already free, but as a vehicle for freeing the young from servile status. As Frederick Douglass makes clear in his autobiography, the liberating quality of education was the reason American slaves were denied it: “Knowledge unfits a child to be a slave.” Liberal education equips students to take their places as equals in a free society, as makers of their own lives.

But note how the AAC&U approached its call for reform in 2008. In advocating its “Engaged Learning Reforms” (which closely mirror John Dewey’s practical learning agenda of the 1930s—it is nothing new), AAC&U president Carol Geary Schneider justified the plan primarily with a table showing the “Percentage of Employers Who Want Colleges to ‘Place More Emphasis’ on Liberal Education Outcomes.” Leading the pack was “science and technology,” with the support of 82%. Next came “teamwork skills in diverse groups,” with 76%.

The clinching argument for Schneider is this: “these goals for college learning are strongly endorsed by the constituency that today’s students particularly want to please—their future employers.”

That sentence, to my mind, lays bare the essential problem with the AAC&U approach: rather than strongly reaffirming the goal of educating students to think for themselves—the traditional goal of liberal education—the AAC&U implicitly admits that it has substituted the goal of pleasing their future employers. At the end of the day, how far is that from students being “subjected to the will of others, mere tools or instruments of alien purposes, unable to choose for themselves”?

This vision of the liberal arts does not free students; it puts the liberal arts at the service of society’s economic masters. It is natural that economic fear in uncertain times leads college students to want to please future employers. That does not mean that educators should seek to assuage that fear by shirking their responsibility to provide their students with far more than that, or should bend the curriculum to meet the desires of employers.

Schneider’s statement is not an isolated case, either. AAC&U’s LEAP program (Liberal Education and America’s Promise) published a piece in 2005 titled “Liberal Education for the 21st Century: Business Expectations” by Robert T. Jones, president of Education and Workforce Policy. Jones is not shy about how he sees the role of higher education: it “must respond to these trends by keeping the curriculum aligned with the constantly changing content and application of technical specialties in the workplace.”

Note, education “must” serve the needs of the workplace. That which business does and wants, higher education must do—because, at the end of the day, education serves business. Education must submit to business’ “assessment” of how well it produces the “outcomes” business wants, it must get “continual input from both employers and graduates” and change its ways accordingly.

Jones states that employers “are less concerned with transcripts than the demonstration of achievement and competency across a variety of general and specialized skills.” Knowledge, wisdom, perspective—none of these traditional liberal arts goals fit this account of what employers want. “Competency” in “general and specialized skills” is the aim. Today, “competencies” has become a common buzzword in education discussions, even opening the door for granting academic credit for work experience, and threatening to make the classroom experience virtually unnecessary.

The new liberal education, Jones says, “now enhanced with practical learning [how’s that for product branding?] is the essential foundation for success in every growing occupation.”

Jones is smart enough to compliment liberal education, even as he asserts that it is, at least in its current form, wholly inadequate and must be altered to serve the workplace better. But his ultimate purpose could not be clearer: education must “make peace” with the market.

Yes, there are substantial economic pressures on students today. Do we as educators, however, serve them best by surrendering our purposes to what prospective employers tell us they want? I say no. The question we need to ask is this: are the traits that employers say they want, and the means we are urged to adopt to meet them, wholly compatible with liberal education?

Take one example: Schneider tells us that colleges should change curriculum to include more “experiential learning” such as internships and “team-based assignments”—the latter because 76% of employers want more emphasis in college on “teamwork skills.”

Do employers and faculty mean the same things when they advocate “teamwork skills” as an educational goal? If employers next tell us we're not producing the correct "outcome" when we teach teamwork, will we be called upon to change practices once again? Is it not possible that when some employers say they want employees with “teamwork skills,” they mean people who will not rock the boat and bring up the less essential “ethical values” that the team might be violating? I’d suggest that the recent record of the banking and financial industries shows that we may be teaching too much teamwork and not enough ethics.

It may not be coincidental that the two lowest priorities for employers on Schneider’s survey were “ethics and values” at 56% and “cultural values/traditions” at 53%. Would those who use such survey results to justify their preferred educational reforms also accept that the curriculum should not emphasize ethics and values, because employers don’t seem to care so much about them? Shouldn’t the low priority the employers placed on ethics and values suggest to us that perhaps their goals are not the same as liberal education’s, and make us at least question whether we should give priority to their preferences?

A liberal arts education should empower students with a sense of perspective, but that is precisely what is sorely lacking in this debate. The AAC&U approach smacks of fear and desperation, but is the reality really so dire that we need to look to surveys of employers to tell us what to do? Yes, the price of higher education is high (though not as high as the sticker price suggests, since most students do not pay that price), and students and their parents have a right to expect that a high-priced college education will prepare its graduates for life—including the working life.

But today’s sense of panic comes less from those realities than from a culture that reflexively and unthinkingly ridicules the liberal arts as impractical, simply because they do not immediately and automatically funnel graduates into high-paying jobs. Seemingly everyone from Click and Clack on “Car Talk” to President Obama buys into the idea that the art history major won’t get you a good job. We laugh and nod knowingly when people joke that all that liberal arts majors really need to know is how to ask “Do you want fries with that?”

But it is simply not true, as an AAC&U report shows. It may seem true when graduation comes and that dream job (making, say, at least as much money as last year’s tuition cost) does not materialize. It certainly did for me when I was in that boat. But I see much better now than I did then. Thirty years down the road, the full value to me of my liberal arts education continues to emerge.

The liberal education will not pay its dividends—either economic or otherwise—in one or two or five years. When we expect it to do so, we are unthinkingly adopting the short-run values of today’s market mentality, with its concern with the next quarter’s profit, not the long-term viability of the company (see, again, the banking and financial industries). When we then change the way we teach in deference to such illusory expectations, we begin to sacrifice what we have always done best in the service of a mirage.

It is hard for liberal arts colleges to preach patience and perspective; perhaps it has rarely been harder to do so than it is now. But it is true: a liberal arts education has long-term value, value that cannot be reduced to income earned two or four years out, as the President’s “College Scorecard” seems designed to do.

The fact of the matter is that ten or twenty or thirty years down the road, liberal arts majors are doing fine. True, they may not make as much as their cohorts in the STEM fields. Some may need a graduate degree to enhance further their economic well-being. But the traditional liberal arts curriculum does NOT condemn liberal arts graduates to a life of poverty, and we do not serve our students well when we buy into the lie that it does.

When we accept that false narrative as true, when we contort ourselves and embrace any curricular reform that promises to make us more “practical” and “useful,” when we adopt educational practices for their branding or marketing potential rather than their educational value, we betray our fundamental mission: the education of our students for freedom, not for servitude.

Tuesday, July 22, 2014

After more than four years doing this blog, I'm starting a new venture. History News Network recently invited me to blog on their site, and with this post, "Historical Humility," I begin.

I'll still be posting my pieces here, probably a day after they make their debut on HNN. And I will continue to use this space for the occasional less historical and more personal piece.

I'd like to thank all of you readers who have been following this blog--some since it began early in 2010. In retrospect, it seems that every time I began to wonder if it was worth the time and effort, someone would, out of the blue, send me a nice compliment, or ask me when the next piece was coming. So thanks to everyone who did that.

I just wish my Dad was still here to see the new blog. He was probably the biggest fan of "The Past Isn't Past." Nothing gave me more satisfaction than when he would drop a casual "I liked your blog post" into our weekly Sunday afternoon phone call. After he passed, I went on his computer to send a message to his contacts to let them know, and noticed that "The Past Isn't Past" was the first bookmark on his web browser.

Friday, July 4, 2014

Not just because of fireworks (though who doesn't love a good fireworks display?). And not just because of cookouts (and, since you can throw a veggie burger on the grill too, who doesn't love a good cookout?). And not just because it gives me a reason to play two of my favorite songs, Bruce Springsteen's "Fourth of July, Asbury Park (Sandy)" and Dave Alvin's "Fourth of July" (though, seriously, this would be reason enough).

I love the Fourth because of the Declaration of Independence.

It began sometime in my childhood. At some point, on some vacation, at some historical site, my parents bought me a facsimile of the Declaration. It probably tells you all you need to know about me that I thought this was a great souvenir. It was hard, brittle, yellowed paper that crackled when you handled it. For some time I thought all official documents were thus. So when, in the fifth grade, my classmates called upon me to write a peace treaty ending the Great Spitball War between Group 2 and Group 3 (a foreshadowing that I would one day study diplomatic history?), I insisted on taking the piece of paper, coloring it with a yellow crayon, and then crumpling it up in a ball and flattening it out so that, at least to my eye, it looked like my copy of the Declaration. Then it was official.

Later, I eventually stopped wondering why there were so many "f"s where there should clearly be "s"s, and thought more about its content. Just about every American is familiar with the most famous passage about the self-evident truths. But there is a lot more to the Declaration. Much of it, the bulk of it really, is essentially an indictment of George III justifying the break. Reading it with an historian’s rather than a patriot’s eye, many of the points don’t really hold up. But my favorite part of the Declaration isn’t one of the well-known lines, or something obscure from the list of charges. It comes at the end, just a simple, short phrase, and it encapsulates for me what is best about the Fourth of July.

When you think about it, July 4 isn’t really the most natural date for the nation’s birth. There are other turning points we could have chosen, for example, the outbreak of hostilities. Using that criterion, April 19, 1775, the date of the battles of Lexington and Concord, would be a better choice. Perhaps February 6, 1778, the date a great power, France, recognized American independence and entered an alliance with the U.S. that would help win the war, would be fitting. Legally one could argue that April 9, 1784, the date Britain recognized independence with its acceptance of the Treaty of Paris, was the true independence day.

But we didn’t choose the date of a battle, or the recognition of a great power, or the acceptance of the mother country. We chose the date of a declaration. What does July 4, 1776 mark, after all? A decision. An intention. Not a change in fact, but a change of mind. Looked at coldly, purely as a matter of fact, the Declaration is an absurdity. The colonies declared that they were independent, but they clearly were not. The colonies were still ruled by royal governors appointed by the King, and were occupied by tens of thousands of British soldiers. But the Declaration nonetheless boldly states, in the words of a resolution first proposed by Richard Henry Lee nearly a month earlier, that “these united Colonies are, and of Right ought to be Free and Independent States.”

And it’s that phrase that I love: “and of Right ought to be.” The Declaration is not one of fact. It is one of what “of Right ought to be.” This country was founded with its eyes on the Right. Those men who signed the declaration were not always right. About some things, many of them, in many ways, were tragically wrong. But they knew the importance of what ought to be. And they knew that the most important date was not the one when men took up arms, but when they decided to do what was right. When it has been at its worst, this country has settled passively for what is, or what cynics said has always been and thus must always be. When it has been at its best, it has remembered to keep its eyes on what "of Right ought to be."

Have a wonderful Fourth of July, and sometime between the cookout and the fireworks, think a little about what of Right ought to be. And then work to make it a reality. That’s what the Fourth, and being an American, means to me.

About Me

I am professor of history and chair of the Department of History at Wofford College in Spartanburg, SC. My obsessive interest in politics goes back to watching the Watergate hearings as a child (seriously).
My current research is on radio during the Great Debate over U.S. involvement in World War II, 1939-1941.
You can follow me on Twitter: @byrnesms