I have been going to academic conferences since I was about 12 years old. Not that I am any sort of prodigy—both of my parents are, or were at one point, academics, so I was casually brought along for the ride. I spent the bulk of my time at these conferences in hotel lobbies, transfixed by my Game Boy, waiting for my mother to be done and for it to be dinnertime. As with many things that I was made to do as a child, however, I eventually came to see academic conferences as an integral part of my adult life.

So it was that, last year, I found myself hanging out at the hotel bar at the annual conference of the Modern Language Association, despite the fact that I am not directly involved with academia in any meaningful way. As I sipped my old fashioned, I listened to a conversation between several aging literature professors about the “digital humanities,” which, as far as I could tell, was a needlessly jargonized term for computers in libraries and writing on the Internet. The digital humanities were very “in” at MLA that year. They had the potential, said a white-haired man in a tweed jacket, to modernize and reinvigorate humanistic scholarship, something that all involved seemed to agree was necessary. The bespectacled scholars nodded their heads with solemn understanding, speaking in hushed tones about how they wouldn’t be making any new tenure-track hires that year.

See, I don’t know if you’ve heard, but there is a crisis occurring in the humanities. I cannot remember the last time I browsed the op-ed section of The New York Times without encountering someone worrying about “the continuing value of a humanities education in an increasingly technology-driven world” or something similar. For the past several years, stories about declining funding, poor job prospects, and sagging enrollments have dominated the public conversation. These stories are so prevalent, in fact, that it has become rather trite to publicly wring one’s hands over the decline of the humanities. The New Republic even features the macabre article tag “Humanities Deathwatch.” In truth, the existence of the crisis is so solidly established that complaining about the hand-wringing over the crisis has itself become a cliché.

Yet the faint reverberations of distant pianists playing the Marche funèbre of the humanities can be heard everywhere. Many public officials—like overbearing uncles at a funeral—have leaned over to offer counsel, urging everyone to consider degrees in STEM fields. President Obama has made public proclamations about the importance of financial support for STEM subjects to ensure a thriving workforce. The standard avuncular narrative about why we should choose STEM subjects runs like this: In the future, as science and technology continue to grow in cultural importance, there are going to be more and more jobs in STEM fields—and, by implication, fewer and fewer jobs in the humanities. Figures from the National Center for Education Statistics appear to show as much. It is the staid duty of educators to ensure that our graduates have the skills they need to participate in tomorrow’s so-called “knowledge economy,” especially if America is to remain globally competitive—or so the argument goes.

The more I heard of this overbearing uncle’s counsel, the more I wanted another drink. As I wandered back to the hotel bar alongside a group of graduate students leaving a lecture on Ernest Hemingway, I started thinking: Isn’t it exactly this sort of hyper-competitive anti-logic that created the crisis of the humanities in the first place? Insistent warnings about the need for practicality—for sacrifices in the name of the job market—have filled students with a fearsome anxiety about their financial futures. Are you going to try and pay your electric bill with music, Susan?

In other words, the humanities crisis is largely a positive feedback loop created by stressing out over economic outcomes. Research by government bureaus showed that people who studied STEM disciplines had better employment prospects. As a result, state and federal education budgets consistently made these subjects a priority. Enrollment in the humanities slumped, and this made it more difficult for budding humanists and artists to succeed, not least because fewer and fewer jobs were available in the academy.

This shift left a huge number of previously beloved intellectuals—the old guard of art and literature and history—feeling pressured, sometimes by their own colleagues, to justify their continued existence in terms of the present-day job market. The stinging irony of the whole situation is difficult to dismiss: The very people demanding to know why English and art-history departments weren’t doing very well were often the people who’d helped drive students away from those departments to begin with.

Back at the hotel bar, I got wrapped up talking to a graduate student named Matt Langione, who studies literature at Berkeley. Next to all the poorly matched blacks and grays—which are the universally accepted sartorial currency of humanities professors trying to look cool—he stood out in a snappy tie and blazer. Matt has the kind of self-assuredness and charm that makes his casual use of words like “autotelic” and “proto-conceptual” sound perfectly natural. He is somehow erudite without ever seeming condescending.

He told me he was studying modernist literature (e.g., James Joyce, Virginia Woolf, Ezra Pound) by, of all things, studying neuroscience. The novelty of Matt’s studies, it seemed to me, encapsulated the craziest thing of all about the whole “crisis of the humanities”: The conversation about funding for the humanities somehow manages to proceed in complete isolation from the actual practices of today’s humanistic scholars.

Matt’s doctoral thesis is a great example of this: He claims, in essence, that literary modernism’s insights about the relationship between abstract thoughts and tangible objects are now being confirmed by neuroscientific research. “This thesis of Ezra Pound’s that poetry should yoke ideas to particular objects—so that the thing and the thought are brought together in a single manifold,” he said, “actually anticipates a very recent neuroscientific insight, which is that, in certain aesthetic states, processing and perception happen in the same cortical centers of the brain.” Matt’s big idea, in other words, is that literature sometimes comes to important conclusions about the nature of consciousness and reality before science can catch up. “The point is—and this is a major claim of literary theorists—that literature allows us to feel our way around insights that we don’t yet have a clean, conceptual articulation of.” By his logic, then, the way to drive science forward might be to fund the study of literature.

As part of my quest to prove my hotel-bar hunches, I also spoke with my good friend John Harpham, a graduate student in the government department at Harvard University. John is the author of three academic articles on the legacy of slavery, and his dissertation will focus on its intellectual history. He’s the kind of guy who enjoys thinking deeply about the likes of Abraham Lincoln.

This is noteworthy mostly because Harvard’s government department is slanted heavily toward quantitative research, making a humanities-focused thinker like John something of an outsider. “Politics,” John said as though reciting from a prepared lecture, “is more complex than the science side of the government department would ever even guess. It consists of our arguments to each other about what is right and what is best and what has been and what should be. To only study behavior—to measure the exact amount of the incumbency advantage, for example—is not even close to what politics really is, which is a form of moral discourse. The humanities offer the only means of accessing that moral discourse.”

John’s point, in other words, is that the humanities offer a level of discourse that’s inaccessible through quantitative research. I asked whether that discourse wasn’t somehow less practical. “The problems that we face as a country,” John rejoined, “are often far more complex than they initially seem. It’s not just ‘passing a balanced budget’ or ‘making government more open.’ They involve understanding, or empathy. To learn empathy with other people, to access the historical residue that’s in all of our memories, to meet and exercise what’s highest in us. That’s the sort of civilization we should strive for. That’s a broad way of putting it, but it’s not an abstract way of putting it. And in creating that sort of civilization, humanistic scholarship is important. It has a role. Maybe not the most important role, but it’s a real role.” Are we so certain, in other words, that everything that matters in planning our future can be quantified?

To attempt a grandiloquent summary (I ran out of bitters, so it’s imperative that I wrap this up): There is little sense in denying that there is a crisis afoot in the humanities. But it’s myopic to focus on the crisis without acknowledging what the humanities really have to offer. In the absence of concrete understanding, we are left to spin about in anxious epicycles, fretting that our children’s art history and philosophy degrees will ultimately be worth no more than $4.85—the approximate cost of one page of fine bond paper. This kind of worry-worn discourse serves to reify and strengthen the downward trends in humanities enrollment. It not only makes the crisis worse; in some sense, it is the crisis. But it is painfully short-sighted to decide the value of art or literature or history solely in terms of today’s economic needs.

As for me, I have already booked my flight for another humanities conference next weekend.
