Books

In one of those cases where satire cannot trump cold hard fact, the power brokers and heavy thinkers who gathered at an Alpine resort in Davos, Switzerland, for the World Economic Forum last month expressed great concern about the danger that growing inequality poses to social stability everywhere. As well they might.

Strictly speaking, "widening income disparities" was only one of 10 issues flagged by the Forum's Outlook on the Global Agenda 2014 report, along with "a lack of values in leadership" and "the rapid spread of misinformation online." But a couple of concerns on the list -- "persistent structural unemployment" and "the diminishing confidence in economic policies" -- were variations on the same theme. Two or three other topics were related to income disparity only a little less directly.

In case you didn't make it to Davos last month (my invitation evidently got lost in the mail this year ... as it has every year, come to think of it), another gathering this summer will cover much of the same ground. The 18th World Congress of the International Sociological Association -- meeting in Yokohama, Japan, in mid-July -- has as its theme "Facing an Unequal World: Challenges for Global Sociology." The scheduling of their events notwithstanding, it was the sociologists who were really farsighted about the issue of growing inequality, not the "Davos men." The ISA announced the theme for its congress as early as December 2010.

And the conversation in Japan is sure to be more focused and substantive. A lot of business networking goes on during the World Economic Forum. By some accounts, the topic of inequality figured more prominently in the news releases than in actual discussions among participants. It's almost as if all of Bono's efforts at Davos were for nought.

Available a solid six months before the sociologists put their heads together in Yokohama, Goran Therborn's The Killing Fields of Inequality (Polity) ought to steer the public's thinking into deeper waters than anything that can be reached with a reductive notion like "widening income disparities." Money provides one measure of inequality, but so do biomedical statistics, which record what Therborn, a professor emeritus of sociology at the University of Cambridge, calls "vital inequality." (Income disparities fall under the heading of "resource inequalities," along with disparities in access to nutrition, education, and other necessities of life.)

A third, less quantifiable matter is "existential inequality," which Therborn defines as "the unequal allocation of personhood, i.e., of autonomy, dignity, degrees of freedom, and of rights to respect and self-development." A big-tent concept of Therborn's own making, existential inequality covers the limitations and humiliations imposed by racism, sexism, and homophobia but also the experience of "people with handicaps and disabilities or just the indigent overlorded by poorhouse wardens or condescending socio-medical powerholders," among others.

While analytically distinct, the three forms of inequality tend to be mutually reinforcing, often in perfectly understandable but no less miserable ways: "Nationwide U.S. surveys of the last decade show that the lower the income of their parents, the worse is the health of the children, whether measured in overall health assessment, limitations on activity, school absence for illness, emergency ward visits, or hospital days."

The differences in health between the offspring of well-off and low-income parents "have been measured from the child's age of two, and the differentials then grow with age." A study of mortality rates among men in Central and East European countries shows a pattern of higher education corresponding to a longer life; men with only a primary education not only died earlier but were more prone to longstanding illnesses. (The patterns among women were comparable "but differentials are smaller, less than half the male average.")

Such inequalities within countries look small compared to those between countries, of course -- and Therborn piles up the examples of so many varieties of inequality from such diverse places that it becomes, after a while, either numbing or unbearable. Generalization is hazardous, but the pattern seems to be that a considerable variety of inequalities, both inter- and intranational, has sharpened over the past 30 years or so. Not even the author's own country of origin, Sweden -- so long the promised land for social democrats -- has been spared. Therborn's study of income developments in the Stockholm Metropolitan area between 1991 and 2010 showed that "the less affluent 80 percent of the population saw their income share decline, while the most prosperous 10 percent had their share augmented from 25 to 32 percent."

Furthermore, the share of the income that top tenth earned from playing the Stockholm Stock Exchange grew 282 percent over the same period. In Sweden as elsewhere, "the top side of intra-national inequality is driven primarily by capital expansion and concentration, and that at the bottom by (politically alterable) policies to keep the poor down and softened up to accept anything."

It seems unlikely that the CEOs, financiers, and politicians at Davos ever had it put to them quite like that. But Therborn seems equally unhappy with his own discipline, which he thinks has somehow managed to dodge thinking about inequality as such.

"Among the fifty-odd Research Committees of the International Sociological Association," he writes, "there is not one focused on inequality." The closest approximation is the one on "Social Stratification," which he says "has mainly been interested in intergenerational social mobility."

That mobility has been, for the most part, upward. But the distance from the bottom of society to its top verges ever more on the dystopian. In a rare flourish, Therborn invokes the alternative: "the positive lure of enlightened societies governed by rational and inclusive deliberation, where nobody is outcast or humiliated, and where everybody has a chance to develop his/her abilities."

To reach it, or even to move in that direction, implies a battle. "Nobody knows how it will end," he concludes. "Which side will you be on?"

I don't think he's asking just the people who will be there in Yokohama this summer.

Two great models of eloquence in the English language are The Book of Common Prayer and the translation of the Bible usually called the King James Version. A memorable passage that appears in both volumes crossed my mind while thinking about a couple of recent works of social criticism. (It also happens that Princeton University Press recently brought out The Book of Common Prayer: A Biography by Alan Jacobs, a professor of humanities at Baylor University, which a couple of readers have highly recommended.)

The text in question appears a couple of times in the New Testament as part of what's usually called "the Lord's Prayer." The Book of Common Prayer, the older of the two volumes, renders one line of the prayer as "Forgive us our trespasses, as we forgive those who trespass against us." The KJV rendering says, "Forgive us our debts, as we forgive our debtors."

To my ear, "trespasses" works better rhythmically, and it expresses the notion of "sin" or "offense" in a slightly more elegant manner. By contrast, "debt" or "debtor" expresses the same thought in a blunt and harsh way, and even conjures the old cartoon image of St. Peter recording good and evil deeds in a big ledger at the gates of heaven. Puzzled by the contrast, I consulted an extremely literal translation by J.N. Darby -- a Victorian Biblical scholar of uncompromising severity -- who suggests that "debt" is indeed what the original text says.

Around the time Darby was working on his translation, Friedrich Nietzsche fleshed out an argument about the interrelationship among guilt, debt, and memory. Bringing up an atheist philosopher pretty well guarantees someone is now offended. But The Genealogy of Morals spells out in bleak and somewhat lurid terms a point left implicit in the prayer: The debtor is at the mercy of the creditor, who has the right (or at least the power) to inflict suffering -- even bloody revenge -- when payment is not made.

Whatever else it may signify, the brutal connotations of "debt" make forgiveness sound much more demanding and consequential than "trespass" would imply. (Awkward recollection: Learning the prayer as a little kid, I pictured God being unhappy that people were ignoring a sign on His lawn.)

Homo economicus never spent all that much time on moral accounting. But at least the old bourgeois virtues included restraint and a residual belief that self-interest was justified insofar as it served a larger good. The issues that concern Andrew Ross in his new book Creditocracy (discussed in last week's column) unfold in a world where debt itself is a kind of demigod, answerable to no higher power of any kind -- and certainly not to the state.

As the example of credit-default swaps on subprime mortgages in the go-go '00s made clear, the alchemists of finance are able to create profitable investment opportunities out of the risk (i.e., the degree of likelihood) of non-repayment -- making possible the creation of enormous fortunes from loans that cannot be repaid, at least not in full. That is but one link in a complex chain of debt-creation. Should the speculative bubble burst, the job of preventing economic meltdown falls to the government (which already has its own deficits, of course) at whatever risk to allocations for education, infrastructure, etc.

Add to it an average household debt that, Ross notes, grew from 43 percent of gross domestic product in 1980 to 97 percent in 2008 -- across three decades of stagnating wages. Throughout that period, 60 percent of income gains went to the country's wealthiest 1 percent -- a trend that changed dramatically when the economic crisis hit. Since then, 95 percent of income gains have gone to that debt-creating (if not job-creating) sliver.

David J. Blacker, a professor of philosophy of education and legal studies at the University of Delaware, characterizes the situation with a simple image in The Falling Rate of Learning and the Neoliberal Endgame (Zero Books):

"Imagine a casino in which you play with the house money and if you win you get to keep all the winnings to yourself, whereas if you lose, the house covers your bets. The literally astronomical public sums required to continue this arrangement for the minutest percentage of the population is the proximal cause of the squeeze on public resources. Schoolchildren, the poor, the sick, the disabled, the elderly etc., must all sacrifice so elites no longer have to undergo the risks that are officially supposed to be inherent in their role as fearless capitalist risk-takers. ..." But genuine competition and risk are reserved "for small businesses and other little people like private and public sector employees."

Ross responds to the debt-driven status quo by challenging a whole series of moral reflexes that have traditionally accompanied debt: the feelings of obligation and culpability, of shame and implied weakness, that the prayer rendered in the King James translation takes as a given. When access to socially necessary goods (particularly higher education) is restricted or undermined by an economy making debt all but inescapable for countless people, someone ought to feel guilty when students default on their loans -- just not the students themselves. The next step is to call for large-scale fiscal disobedience: a social movement of millions of people pledging to default on their student loans. On the far side of that and other radical confrontations with the debt machine, Ross conceives the possibility of morally sound, humanely responsible systems of finance, based on communitarian social forms. Not utopia, perhaps, but a long way from here.

Massive default is a strategy I find it easier to admire, or at least to daydream about, than to recommend. It is not impossible that a million people might make such a pledge. Carrying out the action is another matter -- and if only a fraction see it through, the result is bound to be martyrdom of an uninspiring and ineffectual kind. In any case, I have no student debt to default on in solidarity, and calling for others to do so would be a case of telling them, "Let's you and him go fight."

Like Creditocracy, David Blacker's book was written in the wake of Occupy Wall Street. But where Ross occasionally sounds like Pierre-Joseph Proudhon -- with his vision of a mutualist society of small producers, exchanging goods and services with a new form of money that doesn't promote inequality -- Blacker thinks along much more classically Marxist lines. The predatory forms of financial speculation that led to the crisis five years ago will not be regulated out of existence, nor are they deviations or tumors growing on a fundamentally healthy economy. The casino will keep rewarding the high rollers when they win and shaking the rest of society down when they lose. Such investment in manufacture as continues to be made will need workers with skills and the capacity to adapt to technological developments -- but ever fewer of them.

Most of the population will be an object for social control, rather than schooling proper. At some level most of us sense this already, making the whole notion of "education as investment in the future" an ever more problematic principle. Blacker has written probably the gloomiest book I have read in years, but in some ways it seems like a practical one. He is not a survivalist. He thinks pedagogy still has a role, provided it's geared to understanding the dire probabilities and finding ways to respond to them. It helps that Blacker is a sharp and forceful writer, giving his analysis something of the vividness and urgency of an Old Testament prophet delivering warnings that nobody really wants to hear.

The nine muses are a motley bunch. We’ve boiled them down into a generic symbol for inspiration: a toga-clad young woman, possibly plucking a string instrument. But in mythology they oversaw an odd combination of arts and sciences. They were sisters, which allegorically implies a kinship among their fields of expertise. If so, the connections are hard to see.

Six of them divvied up the classical literary, dramatic, and musical genres -- working multimedia in the cases of Erato (inventor of the lyre and of love poetry) and Euterpe (who played the flute and inspired elegiac songs and poems). The other three muses handled choreography, astronomy, and history. That leaves an awful lot of creative and intellectual endeavor completely unsupervised. Then again it’s possible that Calliope has become a sort of roaming interdisciplinary adjunct muse, since there are so few epic poets around for her to inspire these days.

An updated pantheon is certainly implied by Peter Charles Hoffer’s Clio Among the Muses: Essays on History and the Humanities (New York University Press). Clio, the demi-goddess in charge of history, is traditionally depicted with a scroll or a book. But as portrayed by Hoffer -- a professor of history at the University of Georgia -- she is in regular communication with her peers in philosophy, law, the social sciences, and policy studies. I picture her juggling tablet, laptop, and cellphone, in the contemporary manner.

Ten years ago Hoffer published Past Imperfect, a volume assessing professional misconduct by American historians. The book was all too timely, appearing as it did in the wake of some highly publicized cases of plagiarism and fraud. But Hoffer went beyond exposé and denunciation. He discussed the biases and sometimes shady practices of several well-respected American historians over the previous 200 years. By putting the recent cases of malfeasance into a broader context, Hoffer was not excusing them; on the contrary, he was clearly frustrated with colleagues who minimized the importance of dealing with the case of someone like Michael Bellesiles, a historian who fabricated evidence. But he also recognized that history itself, as a discipline, had a history. Even work that seemed perfectly sound might be shot through with problems only visible with the passing of time.

While by no means a sequel, Clio Among the Muses continues the earlier book’s effort to explain that revisionism is not a challenge to historical knowledge, but rather intrinsic to the whole effort to establish that knowledge in the first place. “If historians are fallible,” Hoffer writes, “there is no dogma in history itself, no hidden agenda, no sacred forms – not any that really matter – that are proof against revision… Worthwhile historical scholarship is based on a gentle gradualism, a piling up of factual knowledge, a sifting and reframing of analytical models, an ongoing collective enterprise that unites generation after generation of scholars to their readers and listeners.”

Hoffer’s strategy is to improve the public’s appreciation of history by introducing it to the elements of historiography. (That being the all-too-technical term for the history of what historians do, in all its methodological knottiness.) One way to do so would be through a comprehensive narrative, such as Harry Elmer Barnes offered in A History of Historical Writing (1937), a work of terrific erudition and no little tedium. Fortunately Hoffer took a different route.

Clio Among the Muses instead sketches the back-and-forth exchanges between history and other institutions and fields of study: religion, philosophy, law, literature, and public policy, among others. Historians explore the topics, and use the tools, created in these other domains. At the same time, historical research can exert pressure on, say, how a religious scripture is interpreted or a law is applied.

Clio’s dealings with her sisters are not always happy. One clear example is a passage Hoffer quotes from Charles Beard, addressing his colleagues at a meeting of the American Historical Association in 1933: “The philosopher, possessing little or no acquaintance with history, sometimes pretends to expound the inner secret of history, but the historian turns upon him and expounds the secret of the philosopher, as far as it may be expounded at all, by placing him in relation to the movement of ideas and interests in which he stands or floats, by giving to his scheme of thought its appropriate relativity.”

Sibling rivalry? The relationships are complicated, anyway, and Hoffer has his hands full trying to portray them. The essays are learned but fairly genial, and somehow not bogged down by the fundamental impossibility of what the author is trying to do. He covers the relationship between history and the social sciences -- all of them -- in just under two dozen pages. It’s like Evel Knievel jumping a canyon: knowing the odds, he just went ahead with it, and you have to respect that.

But then, one of Hoffer’s remarks suggests that keeping one’s nerve is what his profession ultimately requires:

“Historical writing is not an exercise in logical argument so much as an exercise in creative imagination. Historians try to do the impossible: retrieve an ever-receding and thus never reachable past. Given that the task is impossible, one cannot be surprised that historians must occasionally use fallacy – hasty generalization, weak analogy, counterfactual hypotheticals, incomplete comparisons, and even jumping around in past time and space to glimpse the otherwise invisible yesteryear.”

Originally published by Encyclopedia Britannica in 1952, Great Books of the Western World offered a selection of core texts representing the highest achievements of European and North American culture. That was the ambition. But today the set is perhaps best remembered as a peculiar episode in the history of furniture.

Many an American living room displayed its 54 volumes -- “monuments of unageing intellect,” to borrow a phrase from Yeats. (The poet himself, alas, did not make the grade as Great.) When it first appeared, the set cost $249.50, the equivalent of about $2,200 today. It was a shrewd investment in cultural capital, or at least it could be, since the dividends came only from reading the books. Mortimer Adler -- the philosopher and cultural impresario who envisioned the series in the early 1940s and led it through publication and beyond, into a host of spinoff projects -- saw the Great Books authors as engaged in a Great Conversation across the centuries, enriching the meaning of each work and making it “endlessly rereadable.”

Adler's vision must have sounded enticing when explained by the Britannica salesman during a house call. Also enticing: the package deal, with Bible and specially designed bookcase, all for $10 down and $10 per month. But with some texts the accent was on endless more than rereadable (the fruits of ancient biological and medical research, for example, are dry and stony) and it is a good bet that many Great Books remained all but untouched by human hands.

Well, that’s one way to tell the Great Books story: High culture meets commodity fetishism amidst Cold War anxiety over the state of American education. But Tim Lacy gives a far more generous and considerably more complex analysis of the phenomenon in The Dream of a Democratic Culture: Mortimer J. Adler and the Great Books Idea, just published by Palgrave Macmillan. The book provides many unflattering details about how Adler’s pedagogical ambitions were packaged and marketed, including practices shady enough to have drawn Federal Trade Commission censure in the 1970s. (These included bogus contests, luring people into "advertising research analysis surveys" that turned into sales presentations, and misleading "bundling" of additional Great Books-related products without making clear the additional expense.) At the same time, it makes clear that Adler had more in mind than providing a codified and “branded” set of masterpieces that the reader should passively absorb (or trudge through, as the case may be).

The Dream of a Democratic Culture started life as a dissertation at Loyola University in Chicago, where Lacy is currently an academic adviser at the university’s Stritch School of Medicine. In its final pages, he describes the life-changing impact on him, some 20 years ago, of studying Adler’s How to Read a Book (1940), a longtime bestseller. He owns and is reading his way through the Great Books set, and his study reflects close attention to Adler’s own writings and the various supplementary Great Books projects. But in analyzing the life and work of “the Great Bookie,” as one of Adler’s friends dubbed him, Lacy is never merely celebratory. In the final dozen years or so before his death in 2001, Adler became one of the more splenetic culture warriors – saying, for example, that the reason no black authors appeared in the expanded 1990 edition of the Great Books was because they “didn’t write any good books.”

Other such late pronouncements have been all too memorable -- but Lacy, without excusing them, makes a case that they ought not to be treated as Adler’s definitive statements. On the contrary, they seem to betray principles expressed earlier in his career. Lacy stops short of diagnosing the aging philosopher’s bigoted remarks as evidence of declining mental powers, though it is surely a tempting explanation. Then again, working at a medical school would probably leave a non-doctor chary about that sort of thing.

I found The Dream of a Democratic Culture absorbing and was glad to be able to interview the author about it by email; the transcript follows. Between questions, I looked around a used-books website to see what the market in secondhand copies of Great Books of the Western World is like. One listing for the original 1952 edition is especially appealing, and not just because of its price (under $250, in today’s currency). “The whole set is in very good condition,” the bookseller writes, “i.e., not read at all.”

Q: How did your personal encounter with the Great Books turn into a scholarly project?

A: I started my graduate studies in history, at Loyola University Chicago, during the 1997-98 academic year. My initial plan was to work on U.S. cultural history, with a plan to zoom in on either urban environmental history or intellectual history in an urban context. I was going to earn an M.A. and then see about my possibilities for a Ph.D. program.

By the end of 1998 the only thing that had become clear to me was that I was confused. I had accumulated some debt and a little bit of coursework, but I needed a break to rethink my options. I took a leave of absence for the 1999 calendar year. During that period I decided three things: (1) I wanted to stay at Loyola for my Ph.D. work; (2) Environmental history was not going to work for me there; (3) Cultural and intellectual history would work for me, but I would need to choose my M.A. thesis carefully to make it work for doctoral studies.

Alongside this intense re-education in the discipline of history I had maintained, all through the 1997 to 1999 period, my reading of the Britannica's Great Books set. I had also accumulated more books on Adler, including his two autobiographies, during stress-relief forays into Chicago's most excellent used bookstore scene. Given Adler's Chicago connections, one almost always saw two or three of his works in the philosophy sections of these stores.

During a cold December day in 1999, while sitting in a Rogers Park coffee shop near Loyola, this all came together in a sudden caffeine-laced epiphany: Why not propose the Great Books themselves as the big project for my graduate study? I sat on the idea for a few days, both thinking about all the directions I could take for research and pounding myself on the head for not having thought of the project sooner. I knew at this point that Adler hadn't been studied much, and I had a sense that this could be a career's worth of work.

The project was going to bring together professional and personal interests in a way that I had not imagined possible when thinking about graduate school.

Q: Did you meet any resistance to working on Adler and the Great Books? They aren’t exactly held in the highest academic esteem.

A: The first resistance came late in graduate school, and after, when I began sending papers, based on my work, out to journals for potential publication. There I ran into some surprising resistance, in two ways. First, I noticed a strong reluctance toward acknowledging Adler's contributions to American intellectual life. As is evident in my work and in the writings of others (notably Joan Shelley Rubin and Lawrence Levine, but more recently in Alex Beam), Adler had made a number of enemies in the academy, especially in philosophy. But I had expected some resistance there. I know Adler was brusque, and had written negatively about the increasing specialization of the academy (especially in philosophy but also in the social sciences) over the course of the 20th century.

The second line of resistance, which was somewhat more surprising, came because I took a revisionist, positive outlook on the real and potential contributions of the great books idea. Of course this resistance linked back to Adler, who late in his life -- in concert with conservative culture warriors -- declared that the canon was set and not revisable. Some of the biggest promoters of the great books idea had, ironically, made it unpalatable to a great number of intellectuals. I hadn't anticipated the fact that Adler and the Great Books were so tightly intertwined, synonymous even, in the minds of many academics.

Q: Selecting a core set of texts was only part of Adler's pedagogical program. Your account shows that it encompassed a range of forms of instruction, in various venues (on television and in newspapers as well as in classrooms and people’s homes). The teaching was, or is, pitched at people of diverse age groups, social backgrounds, and so on -- with an understanding that there are numerous ways of engaging with the material. Would you say something about that?

A: The great books idea in education -- whether higher, secondary, or even primary -- was seen by its promoters as intellectually romantic, adventurous even. It involved adults and younger students tackling primary texts instead of textbooks. As conceived by Adler and Hutchins, the great books idea focused people on lively discussion rather than boring Ben Stein-style droning lectures, or PowerPoints, or uninspiring, lowest-common-denominator student-led group work.

One can of course pick up bits of E.D. Hirsch-style "cultural literacy" (e.g., important places, names, dates, references, and trivia) through reading great books, or even acquire deeper notes of cultural capital as described in John Guillory's excellent but complex work, Cultural Capital: The Problem of Literary Canon Formation (1993). But the deepest goal of Adler's model of close reading was to lead everyday people into the high stakes world of ideas. This was no mere transaction in a "marketplace of ideas," but a full-fledged dialogue wherein one brought all her or his intellectual tools to the workbench.

Adler, Hutchins, John Erskine, Jacques Barzun, and Clifton Fadiman prided themselves on being good discussion leaders, but most promoters also believed that this kind of leadership could be passed to others. Indeed, the Great Books Foundation trained (and still trains) people to lead seminars in a way that would've pleased Erskine and Adler. Education credentials matter to institutions, but the Foundation was willing to train people off the street to lead great books reading groups.

This points to the fact that the excellent books by famous authors promoted by the great books movement, and the romance inherent in the world of ideas, mattered more than the personality or skill of any one discussion moderator. All could access an engagement with excellence, and that excellence could manifest in texts from a diverse array of authors.

Q: It seems like the tragedy of Adler is that he had this generous, capacious notion that could be called the Great Books as a sort of shorthand – but what he's remembered for is just the most tangible and commodified element of it. A victim of his own commercial success?

A: Your take on the tragedy of Adler is pretty much mine. Given his lifelong association with the great books project, his late-life failings almost guaranteed that the larger great books idea would be lost in the mess of both his temporary racism and promotion of Britannica's cultural commodity. The idea came to be seen as a mere byproduct of his promotional ability. The more admirable, important, and flexible project of close readings, critical thinking, and good citizenship devolved into a sad Culture Wars spectacle of sniping about race, class, and gender. This is why I tried, in my "Coda and Conclusion" to end on a more upbeat note by discussing the excellent work of Earl Shorris and my own positive adventures with great books and Adler's work.

Q: Was it obvious to you from the start that writing about Adler would entail a sort of prehistory of the culture wars, or did that realization come later?

A: At first I thought I would be exploring Adler's early work on the great books during my graduate studies. I saw myself intensely studying the 1920s-1950s period. Indeed, that's all I covered for my master's project, which was completed in 2002.

However, I began to see the Culture Wars more clearly as I began to think in more detail about the dissertation. It was right around this time that I wrote a short, exploratory paper on Adler's 1980s-era Paideia Project. When I mapped Paideia in relation to "A Nation at Risk" and William Bennett, I began to see that my project would have to cover Bloom, the Stanford Affair, and the 1990 release of the second edition of Britannica's set. Around the same time I also wrote a paper on Adler's late 1960s books. When I noticed the correlation between his reactions to "The Sixties" and those of conservative culture warriors, it was plain to me that I would have to explore Adler as the culture warrior.

So even though I never set out to write about the Culture Wars, I got excited when I realized how little had been done on the topic, and that the historiography was thin. My focus would limit my exploration (unlike Andrew Hartman's forthcoming study), but I was pleased to know that I might be hanging around with a vanguard of scholars doing recent history on the Culture Wars.

Q: While Adler’s response to the upheaval of the 1960s was not enthusiastic, he was also quite contemptuous of Allan Bloom’s The Closing of the American Mind. How aware of Bloom's book and its aftermath were you when you bought and started reading the Great Books?

A: Honestly, I had little knowledge of Allan Bloom or his ubiquitous The Closing of the American Mind until the mid-1990s. This requires a little background explanation. I started college in 1989 and finished in 1994. As a small-town Midwestern teenager and late-1980s high schooler, I was something of a rube when I started college. I was only vaguely aware, in 1989, that there was even a culture war going on out there (except in relation to HIV and AIDS).

I'm ashamed to admit, now, how unaware I was of the cultural scene generally. Moreover, I was insulated from some of it, and its intensity, during my early college years, when it was at its height, because I began college as an engineering student. Not only was my area of study far outside the humanities, but the intensity of coursework in engineering also sheltered me from all news beyond sports (my news reading outlet at the time). Even when I began to see that engineering wasn't for me, around 1992, my (then) vocational view of college caused me to move to chemistry rather than a humanities subject.

My own rudimentary philosophy of education kept me from thinking more about the Culture Wars until my last few years as a college student. It was then that I first heard about Bloom and his book. Even so, I only read passages in it, through the work of others, until I bought a copy of the book around 2000. I didn't read The Closing of the American Mind, word-for-word, until around 2003-04 while dissertating.

Q: There was no love lost between Adler and Bloom – you make that clear!

A: In my book you can see that Adler really wanted it known that he believed Leo Strauss and all his disciples, especially Bloom, were elitists. Adler believed that the knowledge (philosophy, history, theology, psychology, etc.) contained in great books was accessible to all. While scholarship and the knowledge of elites could add to what one gained from reading great books, there was a great deal in those works that was accessible to the common man and hence available to make better citizens.

So while Adler was sort of a comic-book character, you might say he was a clown for democratic citizenship -- a deceptively smart clown champion for democratizing knowledge and for raising the bar on intelligent discourse. This analogy is faulty, however, because of the intensity and seriousness with which he approached his intellectual endeavors. He loved debate with those who were sincerely engaged in his favorite topics (political philosophy, education, common sense philosophy, etc.).

I see only advantages in the fact that I was not personally or consistently engaged in the culture wars of the late 1980s and early 1990s. It has given me an objective distance, emotionally and intellectually, that I never believed possible for someone working on a topic from her/his own lifetime. Even though I started graduate school as something of a cultural and religious conservative (this is another story), I never felt invested in making my developing story into something that affirmed my beliefs about religion, culture, and America in general.

A belief that tradition and history had something to offer people today led me to the great books, but that did not confine me to a specific belief about what great books could, or should, offer people today. I was into great books for the intellectual challenge and for my personal development as a thinker, not for what great books could tell me about today's political, social, cultural, and intellectual scene.

Q: You defend Adler and the Great Books without being defensive, and I take it that you hope your book might help undo some of the damage to the reputation of each -- damage done by Adler himself, arguably, as much as by those who denounced him. But is that really possible, at this late a date? Won’t it take a generation or two? Or is there something about Adler's work that can be revived sooner, or even now?

A: Thank you very much for the compliment in your distinction about defending and being defensive. I did indeed seek to revise the way in which Adler is covered in the historiography. Because most other accounts about him have been, in the main, mocking and condescending, any revisionary project like mine would necessarily have to be more positive -- to inhabit his projects and work, which could result in something that might appear defensive. I think my mentor, Lewis Erenberg, and others will confirm that I did not always strike the right tone in my early work. It was a phase I had to work through to arrive at a mature, professional take on the whole of Adler's life and the Great Books Movement.

As for salvaging Adler's work as a whole, I don't know if that's possible. Some of it is dated and highly contextual. But there is much worth reviewing and studying in his corpus. My historical biography, focused on the great books in the United States, makes some headway in that area.

Some of Adler's other thinking about great books on the international scene will make it into a manuscript, on which I'm currently working, about the transnational history of the great books idea. If all goes well (fingers crossed), that piece will be paired with another by a philosopher and published as "The Great Books Controversy" in a series edited by Jonathan Zimmerman and Randall Curren.

I think a larger book on Adler's work in philosophy is needed, especially his work in his own Institute for Philosophical Research. I don't know if my current professional situation will give me the time and resources to accomplish much more on Adler. And even if my work situation evolves, I do have interests in other historical areas (anti-intellectualism, Chicago's intellectual history, a Jacques Maritain-in-America project). Finally, I also need to keep up my hobby of reading more great books!

The Chinese word for “crisis,” as generations of commencement speakers have reminded us, is written using the same character as “opportunity.” Whatever inspirational quality this chestnut may possess does not grow with repetition – and it is a curmudgeonly pleasure to learn that it’s wrong, or at best only fractionally true.

In fact both “crisis” and “opportunity” are written with two characters. The one they share can mean “quick-witted” or “device,” depending on context, and can be combined with another glyph to write “airplane.” (An airplane is uplifting, albeit not motivationally.) And Victor H. Mair, the professor of Chinese at the University of Pennsylvania who debunked this hardy linguistic urban legend, points out that apart from the Sinological blunder, it’s terrible advice: “Any would-be guru who advocates opportunism in the face of crisis should be run out of town on a rail, for his/her advice will only compound the danger of the crisis.”

But you don’t uproot a cultural weed all that easily -- especially not when crisis-mindedness has become the norm. That’s a paradox, but it’s also indisputable. A quick search of Google News finds 89.5 million articles with the word “crisis” in them as of this writing. Rhetorical inflation has a lot to do with it, of course. But it’s also the long-term effect of a state of mind that Susan Sontag characterized so well in an essay from 1988: “A permanent modern scenario: apocalypse looms … and it doesn’t occur. And it still looms. […] Modern life accustoms us to live with the intermittent awareness of monstrous, unthinkable – but, we are told, quite probable – disasters.”

The instances she had in mind were the threat of nuclear war and the AIDS epidemic. In 25 years, neither has disappeared, though other catastrophes (actual and potential) have moved to the fore. The crises change, but not the structure of feeling.

Anti-Crisis by Janet Roitman, published by Duke University Press, digs deeper than Sontag’s comments on apocalypse fatigue. Roitman, an associate professor of anthropology at the New School, approaches the ongoing discussion of the subprime mortgage "crisis" (as it’s hard not to think of it) with questions about the assumptions and implicit limitations of a word so ubiquitous that it is normally taken for granted.

She does so by way of the late Reinhart Koselleck’s approach to intellectual history, known by a term even some of his English-language commentators have preferred to leave untranslated: Begriffsgeschichte. No way am I going to try to type that again, so let’s just refer to it as “conceptual history.” But arguably use of the full Teutonic monty is justified in order to distinguish Koselleck’s work from what, in the Anglo-American tradition, is called the history of ideas.

As Koselleck writes in an entry for a major conceptual-history handbook on social and political ideas, the term “crisis” played an important role in the work of the Young Hegelians, who took their master’s thinking about the philosophy of history as a starting point for the critique of existing institutions. Given that a key term in Hegel’s system is Begriff (the Concept) and that one of the Young Hegelians was Karl Marx, who maintained that recurrent crisis was an inescapable part of the history of capitalism itself – well, given all that, it’s possible to see how the word Begriffsgeschichte might carry layers of implication soon lost in translation.

The argument of Anti-Crisis is nothing if not oblique, and self-reflexive to boot, and paraphrasing it seems a fool’s errand. It is a good idea to grapple with Koselleck’s essay on crisis before reading Roitman’s book (so I learned the hard way), and no hard feelings on my part if you do so before finishing this column.

So now to run that errand. For Roitman, "crisis" is not simply a clichéd label for -- among other things -- recent economic developments, but a fraught and dubious concept. The word itself has roots in an ancient Greek medical term referring to the phase of an illness which will either kill the patient or end in recovery. It came into frequent use to describe social, political, and cultural phenomena beginning late in the 18th century -- one element in a very complex series of shifts of meaning between religious concepts of social and cosmic order and a (seemingly?) secular pattern of life.

The French Revolution, with the spectacle of comprehensive upheaval, doubtless made the word especially vivid. But Koselleck also cites Thomas Paine’s The Crisis, from 1776. “To Paine, the War of Independence was no mere political or military event,” he writes; “rather it was the completion of a universal world historical process, the final Day of Judgment that would entail the end of all tyranny and the ultimate victory over hell... .”

In sum, then, “crisis” came to possess a range of theological, political, and other connotations. Calling something a crisis implies its urgency or consequentiality. But it also posits that elements of the crisis are intelligible. They are the effects of departures from a norm, or aspects in the unfolding of some grand narrative. The crisis has causes, which we can discover. It has effects, which we begin to interpret even while enduring them.

“Crisis is a blind spot that enables the production of knowledge,” writes Roitman. “… More precisely, it is a distinction that secures ‘a world’ for observation.” The process rests upon “a distinction that generates and refers to an ‘inviolate level’ of order (not crisis)” that “is seen to be contingent (historical crises) and yet is likewise posited as beyond the play of contingency, being a logical necessity that is affirmed in paradox (the formal possibility of crisis).”

Now, assuming I understand her argument correctly, Roitman's point is that labeling the great vertigo of financial free-fall a few years ago a crisis comes at the risk of assuming we understand what it was, how it happened, and why.

That, in turn, posits that our ideas and information are adequate to the tasks: that government regulation distorts the healthy functioning of the marketplace (if you’re a neoclassicist) or that insufficient government regulation tips the market advantage to the unscrupulous (if you’re Keynes-minded) or that crisis is built into capitalism because of the tendency of the rate of profit to fall (as Marx believed, or didn’t believe, depending on which Marxist you ask).

The problem, in any case, is that the causal explanations now available rest on understandings of the economy that don’t take into account how crises (or, rather, judgments about the risk of crisis) are not only a factor in how decisions are made in financial markets but also operate in instruments involved in the functioning of those markets.

Derivatives and credit default swaps are the examples everyone has heard of by now, at least. More have been invented, and still more will be. Risk management is a thriving field. So can we judge something to be in a crisis when expectations of crisis (and of profit from crisis) are operational – and bound to become more so? That isn’t a rhetorical question. I have no idea one way or the other, and if Anti-Crisis answers it, I did not mark the page.

“We persevere,” the author says, “in the hope that we can perceive the moments when history is alienated in terms of its philosophy – that is, that we can perceive a dissonance between historical events and representations…. We are left in a chasm: perplexed and immobilized by the supposed radical dissonance between the value of houses and the value of derivatives of houses.”

Perplexed? Yes. Immobilized? Not necessarily. (Epistemologically induced paralysis is only one of the possible responses to a foreclosed mortgage.) I respect Anti-Crisis for making me think hard, even if it occasionally felt like thinking in circles. Meanwhile, it turns out that Simon & Schuster will be publishing something now listed simply as Untitled Financial Crisis Book, appearing under the company’s Books for Young Readers imprint in early 2015. Whatever baggage its conceptual history has laden it with, the notion of crisis seems to be making itself very much at home.

Last week Pope Francis, who is on something of a roll, assured atheists that they could get into heaven. As one of the unchurched and the disbelieving, I appreciate this expression of good will without finding the news especially consequential. There’s enough to worry about as it is, this side of death.

But the pontiff’s timing is impressive. Ronald Dworkin’s Religion Without God, the philosopher’s first posthumous work, appeared in bookstores a few days before Francis made his statement -- even though Harvard University Press listed it as an October book. (When he succumbed to leukemia in February, Dworkin was a professor of law and philosophy at New York University and an emeritus professor of jurisprudence at University College, London.) Surely it’s a matter of providence at work, or at least of synchronicity, depending on which way you’ve staked that existential wager.

I call it Dworkin’s first posthumous book, not on the basis of inside information, but from the certainty somebody is bound to raid the Nachlass of any figure so prominent in Anglo-American discussions of the philosophy of law across four decades.

Even a fairly stringent assessment of him as someone more esteemed outside his discipline than in it -- ever the complaint when someone is just too visible as a public intellectual -- ends up conceding that he did play a catalytic role, at times. Much of the commentary since his death seems to echo Dworkin’s own recollection of serving as Learned Hand’s clerk: “I disagreed with everything he said, but he was a very good person to have to argue with.” (By the way, a book bringing the philosopher’s and the judge’s ideas together for comparison seems like a project full of interesting possibilities.)

Religion Without God is based on the three Einstein Lectures that Dworkin gave at the University of Bern in Switzerland in December 2011. The lecture series began in 2009. The speakers rotate, from year to year, between a physicist, a mathematician, and a philosopher. Einstein’s occasional remarks about God (the things he actually wrote and said, not the kudzu-like apocrypha) are the seed crystals for the lectures, rather than their topic.

According to the publisher’s note, Dworkin “planned greatly to extend his treatment of the subject over the next few years” but “had time only to complete some revisions of the original text,” although the volume closes with a fourth piece, “Death and Immortality,” shorter than the lectures, which bears no indication of when it was written. It begins on a mordant note, as if in reply to the Pope: “When Woody Allen was told that he would live on in his work, he replied that he would rather live on in his apartment.”

For a while I suspected that Religion Without God might be a very late installment in the New Atheism saga, and on that basis gave it a wide berth. All the polemical gunpowder has run out on both sides. The very prospect of another battle -- Dworkin v. Dawkins! -- sounded as appealing as a sawdust burrito or an afternoon in line at the Department of Motor Vehicles. Life is too short.

Happily, the lectures are nothing of the kind. Arguments for or against the existence of God (or gods, if you prefer) form no part of Dworkin’s project. He takes it as a given that the dispute will continue, as it must, at varying degrees of heat and lucidity. But he also takes it as important and meaningful that some forms of atheism are as deeply shaped by the numinous as any religious faith.

“Numinous” is the term Rudolf Otto coined in The Idea of the Holy (1917) to name an overwhelming experience of the grandeur, power, order, significance, and strangeness (“otherness”) of the universe, or of being itself. It can be blissful, and it can be terrifying. Religious mystics have no monopoly on the numinous. Physicists and mathematicians have written about it, for example, and one of the passages from Einstein quoted by Dworkin expresses it in a forceful manner:

“To know that what is impenetrable to us really exists, manifesting itself as the highest wisdom and the most radiant beauty which our dull faculties can comprehend only in the most primitive forms – this knowledge, this feeling, is at the center of true religiousness. In this sense, and in this sense only, I belong to the ranks of devoutly religious men.”

On another occasion, Einstein said, “He to whom this emotion is a stranger, who can no longer pause to wonder and stand rapt in awe, is as good as dead; his eyes are closed.” Dworkin stresses that while monotheists may understand numinosity as a revelation of the power and awe-full reality of the Creator, it does not, as such, compel belief in a personal deity (what Dworkin refers to, from time to time, as “the Sistine god,” in honor of Michelangelo’s rendition).

Einstein, for one, dismissed the idea of such a Supreme Being existing prior to, and apart from, the universe. He said so repeatedly, although believers kept construing his remarks about “belong[ing] to the ranks of devoutly religious men” to the contrary. The physicist thought of himself as a kind of pantheist, along Spinoza’s lines. The difference between pantheism and atheism is arguably one of shading -- and Dworkin subsumes Einstein’s perspective under the rubric “religious atheism,” which would also apply to beliefs such as Ethical Culture and some kinds of pacifism.

“Religious atheism” is not meant to be an ironic label; the author shows no interest in it as paradoxical. Dworkin’s point is that a sense of “life’s intrinsic meaning and nature’s intrinsic beauty” runs deeper than one’s judgment of the source or intelligibility of that meaning and beauty. Values “are real and fundamental, not just manifestations of something else; they are as real as trees or pain.” The theist understands meaning, beauty, goodness, and other values to be the intentional creation or the commandment of a higher being, who thus merits our worship, or at least our very close attention. To live a good and meaningful life means living in accord with the divine purpose.

But for the religious atheist (which is to say, for the author himself) that is getting things more or less backward. Dworkin seems to have reached the same conclusion as Descartes on a matter that bothered the earlier thinker in his final years, as mentioned in Steven Nadler’s The Philosopher, the Priest, and the Painter (discussed in this column).

In short: Is something good (or true, beautiful, just, etc.) because God wills it? Or is it the other way around? What if the bearded man on the ceiling of the Sistine Chapel decided that theft, murder, and cannibalism were totally fine, and even to be encouraged? Would that make them good? If not, then in some sense we have accepted that right has priority even over divine might.

Thus concluded the theist Descartes, as did the religious atheist Dworkin. There is much that I am scanting in Dworkin’s book here, in the interest of time, but that should provoke enough thought, and elicit enough invective, for now. Let me end this column, as it began, with a look to the afterlife. In a symposium at the Boston University School of Law a few years ago, Dworkin announced that he’d had a glimpse of paradise:

“Lots of people, including among them the most distinguished philosophers and lawyers in the world, have come together to discuss a book of mine. As if that weren’t good enough, they discuss it before I’ve actually finished writing it so I can benefit from what they say. That isn’t the best part. The best part is that I don’t even have to die.”

The implication, by contrast, is that hell is all about the deadlines.