Hannah Arendt considered calling her magnum opus Amor Mundi: Love of the World. Instead, she settled upon The Human Condition. What is most difficult, Arendt writes, is to love the world as it is, with all the evil and suffering in it. And yet she came to do just that. Loving the world means neither uncritical acceptance nor contemptuous rejection. Above all it means the unwavering facing up to and comprehension of that which is.

Every Sunday, The Hannah Arendt Center Amor Mundi Weekly Newsletter will offer our favorite essays and blog posts from around the web. These essays will help you comprehend the world. And learn to love it.

Fenton Johnson in Harper’s meditates on a fundamental question of our loud and distracted age: “What is the usefulness of sitting alone at one’s desk and writing, especially writing those vast seas of pages that will see only the recycling bin? What is the usefulness of meditation, or of prayer? What is the usefulness of the solitary?” Being alone, avoiding society and choosing to live on one’s own, is an art, something we need to practice and learn. And Johnson argues it is worth the effort. “I do not wish to say that being solitary is superior or inferior to being coupled, nor that the full experience of solitude requires living alone, though doing so may create a greater silence in which to hear an inner voice.” That inner voice of solitude may, for one thing, speak differently than our outward voice: “Could solitaries model the choice for reverence over irony? Instead of conquering nations or mountains or outer space, might we set out to conquer our need to conquer? If that seems a tall order, I offer you Paul Cézanne, painting himself to the point of diabetic collapse, reinventing painting. Think about the hallucinatory quality of his late work; think about how modern art owes itself to solitude and low blood sugar. I offer Eudora Welty, writing magical realism when Gabriel García Márquez was a teenager. Henry James, portraying the caustic corruptions of fortress marriage, living alone in Lamb House by the sea. Zora Neale Hurston, who nurtured a flame of mysticism in a world hostile to it, and who showed that through her wits alone a black woman could live by her own rules, and who died in poverty and was buried in an unmarked grave in a potter’s field. Thomas Merton, who spent twenty years in a monastery preparing for his true vocation, which was solitude. Walt Whitman, who taught us how to be American. Emily Dickinson, his sister in solitude, who taught us how to be alive to the world, most especially to the suffering of its solitaries.
I offer you Jesus, that renegade proto-feminist communitarian bachelor Jew, who reminded us of the lesson first set forth a thousand years earlier in the Hebrews’ holy book: to love our neighbors as ourselves. I offer you Siddhartha Gautama, who sat in solitude to achieve the understanding that everyone and everything are one.” Like Hannah Arendt, who distinguishes solitude from loneliness and writes that solitude is the precondition for thinking, Johnson suggests that amidst the “chatter and diversions of our lives,” solitude is what can “keep the demons at bay.” The question, unasked and unanswered, is how to find and nurture solitude in a world increasingly devoid of private places.

Gabriel Weinberg, whose DuckDuckGo search engine does not track users’ web searches, thinks that the country is now ready for a thoughtful debate about data privacy. “Any day now President Obama is going to propose a new privacy bill of rights that will give you much more control over your personal information. A healthy debate will then ensue, and you can and should be a part of it. You can actually move the needle on this one. Let me try to convince you. First things first, this is not a partisan issue. This is not Obama’s debate. This is our debate. It’s our personal information. Obama is just sparking the flame. In 2012 he proposed something similar and it didn’t catch. Three short years later, enough has changed in the world to expect this time it will be different…. The question in the upcoming debate will quickly become: what limits? The status quo of collect it all and reveal as little as possible has to go, but there is a massive range between maximum possible collection and minimum necessary collection. Here are a few things we could do. Companies (and governments) could explicitly tell you what is happening to your personal information. They could allow you to opt-out. They could give you granular control of your data. They could even tell you exactly what you’re getting when you give out specific pieces of information. Disclosure requirements could mimic those in other areas like credit cards and mortgages where the most relevant risks are highlighted. In other words, there are a lot of options.” Weinberg writes that people are beginning to care and that it is time to pass new regulations limiting the use of private data. That will only happen, however, if we the people actually see the gathering, use, and selling of immense amounts of our personal data as a danger. Weinberg believes that this is happening: “We’ve all noticed those annoying ads following us around the Internet. That’s just the tip of the iceberg.
Most people still don’t know that private companies build and sell profiles about them or that many retailers charge different prices based on these data profiles.” The question is, once we know this is happening, will we change our behavior? Or will we do so only if and when we understand what is truly lost when we give up our privacy? This is the question being asked at the Arendt Center Fall 2015 Conference “Privacy: Why Does It Matter?” Save the date: October 15-16.

Babette Babich publishes a long meditation on Margarethe von Trotta’s film “Hannah Arendt,” whose theme is the celluloid expression of internal states. “Like Adorno, Arendt would be vigorously denounced for arrogance, an arrogance von Trotta’s film also documents (Arendt’s colleagues indict her in just this language and von Trotta’s film thus illustrates a common side of academic non-collegiality). It is also Arendt’s arrogance that colors von Trotta’s depiction (this is more of the film’s signal syncretism) of the falling out between Hannah Arendt and the Hans Jonas who would go on to make what one might describe as monotonic ethics his personal calling card. In von Trotta’s film, Jonas is represented as the injured party, a favoring that is unsurprising as the film drew on Jonas’ Memoirs (and therewith his point of view). The contrast between arrogance and the steadfast adherence to a conventionally received ethical viewpoint is key. Where arrogance is regarded as a vice, modesty is a virtue, most especially for a woman, a troublesome demand for an academic and an intellectual like Arendt. The vice of arrogance is also supposed to be emotive (though on whose side remains an open question) and perforce irrational.”

Novelist Tom McCarthy thinks that while the best and most creative among us once turned to art, they’re now working for Google: “It is not just that people with degrees in English generally go to work for corporations (which of course they do); the point is that the company, in its most cutting-edge incarnation, has become the arena in which narratives and fictions, metaphors and metonymies and symbol networks at their most dynamic and incisive are being generated, worked through and transformed. While ‘official’ fiction has retreated into comforting nostalgia about kings and queens, or supposed tales of the contemporary rendered in an equally nostalgic mode of unexamined realism, it is funky architecture firms, digital media companies and brand consultancies that have assumed the mantle of the cultural avant garde. It is they who, now, seem to be performing writers’ essential task of working through the fragmentations of old orders of experience and representation, and coming up with radical new forms to chart and manage new, emergent ones. If there is an individual alive in 2015 with the genius and vision of James Joyce, they’re probably working for Google, and if there isn’t, it doesn’t matter since the operations of that genius and vision are being developed and performed collectively by operators on the payroll of that company, or of one like it.”

Robert L. Kehoe III considers sociologist David Goldblatt’s new book The Game of Our Lives on the newfound (at least, newfound to Americans) prominence of English soccer: “Borrowing from Don DeLillo’s Underworld, Goldblatt’s investigation of British football reminds us that ‘longing on a large scale is what makes history.’ Today those grand longings have become ‘increasingly colonized by commercially manufactured imagery.’ Gone are the days where witnessing a live sporting event was principally a physical and communal experience. Now, ‘distant, mediated, artificial events’ have become ‘the central nodes of an atomized culture held together by a shared addiction to stupefaction and the spectacle.’ Subsequently, intimacy, immediacy, spontaneity, and authenticity have been replaced by hype, cliché, and exaggeration, leaving the concrete human realities of sport in the shadows of the circus. According to Goldblatt, any institution or activity subject to mediation (and especially mass-mediation) is vulnerable to its 21st-century simulacra. Taken to its logical conclusion you arrive at the overt farce of professional wrestling, and while football faces similar dangers under the influence of organized match-fixing, its salvation is ‘that the raw material out of which the media-football complex constructs the spectacle remains intensely local.’ Still, vividly capturing the drama of English football through enhanced production methods can only create the illusion of a tangible social relationship between say fans at Anfield and a bar in Los Angeles. Illusory or otherwise, English football has a growing international consumer base that doesn’t just enjoy the spectacle: they feel as though they’re a part of it. Goldblatt calls this an imaginary community; full of religious fervor but devoid of any tangible communal purpose.”

In an essay about the relationship between Charlie Brown and Charlie Hebdo, Sarah Boxer pens a paean to Peanuts: “Back in 1969, when Snoopy helped launch Charlie Mensuel, Peanuts was still seen as pretty subversive. It had a minimalist look and an existentialist twist that no other strip had. Timothy Leary, four years before writing his work on psilocybin mushrooms, praised Peanuts as ‘masterful.’ The English psychoanalyst Donald Winnicott wanted to use a picture of Linus with his blanket to illustrate what a transitional object was. And, according to Michaelis, it was ‘the first mainstream comic strip ever to regularize the use of the word “depressed.”’ ‘Nobody was saying this stuff,’ said the cartoonist Jules Feiffer. ‘You didn’t find it in The New Yorker. You found it in cellar clubs, and, on occasion, in the pages of the Village Voice. But not many other places.’ Schulz himself knew that he was doing something new, showing that even ‘little kids can be very nasty’ to each other–and miserable, too. With a subtlety that Charlie Hebdo would never dream of, Peanuts also made people look at their own meanness and zeal, including the religious kind. In 1965, according to Michaelis, Schulz got a letter complaining that ‘the Great Pumpkin was sacrilegious.’ (Schulz agreed.) And in a memorable strip penned shortly after Snoopy’s doghouse went up in flames, while Snoopy was still mourning the cinders–his lost pool table, his books, his records, his Wyeth–you see Lucy yelling at him, in triumph: ‘You know why your doghouse burned down? You sinned, that’s why! You’re being punished for something you did wrong! That’s the way these things always work!'”

Peter Levine asks what Hannah Arendt might have meant when she praised Martin Heidegger for bringing thinking to life in his classrooms. Levine writes that philosophers can do three things: they can interpret the philosophical tradition, make rational arguments, and practice reflection and introspection. For Levine, Arendt was one of the last thinkers to do all three: “Arendt perceived Heidegger as putting these parts back together. Reading classical works in his seminar (or in a reading group, called a Graecae) was a creative and spiritual exercise as well as an academic pursuit. Karl Jaspers held different substantive positions, but he had a similar view of philosophy, the discipline to which he had moved after a brilliant career in psychiatry. Elisabeth Young-Bruehl writes that Jaspers’

new orientation was summarized in many different ways, but this sentence is exemplary: ‘Philosophizing is real as it pervades an individual life at a given moment.’ For Hannah Arendt, this concrete approach was a revelation; and Jaspers living his philosophy was an example to her: ‘I perceived his Reason in praxis, so to speak,’ she remembered (Hannah Arendt: For Love of the World, pp. 63-4).

Arendt fairly quickly decided that ‘introspection’ was a self-indulgent dead-end and that Heidegger’s philosophy was selfishly egoistic. Then the Nazi takeover of 1933 pressed her into something new, as she assisted enemies of the regime to escape and then escaped herself. She found deep satisfaction in what she called ‘action.’ From then on, she sought to combine ‘thinking’ (disciplined inquiry) with political action in ways that were meant to pervade her whole life. That combination is hard to find today, if it can be found at all.”

Writer Javier Grillo-Marxuach, reaching way back into his past, wonders what it means to derive, to plagiarize, in the age of mass culture: “The amusing truth of the matter is this: often–especially in a mature career in a medium with six decades of mass visibility–you will hear a pitch that is derivative of something that was, itself, derivative of something else that the pitcher is not aware of. More than once I have heard a younger writer say, ‘Do you remember that old episode of Star Trek: The Next Generation where Riker passes out in the teaser and wakes up 16 years later as captain of the Enterprise, but he can’t remember anything … and he cleverly realizes that his amnesia is really a Romulan ruse to get him to give up sensitive information?’ only to be shocked when told, ‘Yeah, it was a takeoff from an even older James Garner movie–based on a Roald Dahl short story–where he’s an Allied spy who passes out before the D-Day invasion, wakes up in a U.S. Army Hospital six years later, and can’t remember anything, then cleverly realizes that his amnesia is a German ruse to extract from him the location of the invasion.’ Derivation is the air we breathe.”

Film Screening, A Snake Gives Birth to a Snake and Director’s Discussion by Michael Lessac

Synopsis: A diverse group of South African actors tours the war-torn regions of Northern Ireland, Rwanda, and the former Yugoslavia to share their country’s experiment with reconciliation. As they ignite a dialogue among people with raw memories of atrocity, the actors find they must once again confront their homeland’s violent past, and question their own capacity for healing and forgiveness.

Tuesday, March 24, 2015

Weis Cinema, Campus Center, 6:30 pm

Courage To Be: Lecture and Dinner Series, with Uday Mehta

Putting Courage at the Centre: Gandhi on Civility, Society and Self-Knowledge

Invite Only. RSVP Required.

HAC Virtual Reading Group – Session #6

HAC members at all levels are eligible to participate in a monthly reading group led online via a telecommunication website by Roger Berkowitz, Director of the Hannah Arendt Center.

For questions and to enroll in our virtual reading group, please email David Bisson, our Media Coordinator, at dbisson@bard.edu.

Friday, April 3, 2015

Bluejeans.com, 11:00 am – 12:00 pm

Property and Freedom: Are Access to Legal Title and Assets the Path to Overcoming Poverty in South Africa?

A one-day conference sponsored by the Hannah Arendt Center for Politics and Humanities at Bard College, the Human Rights Project, and the Center for Civic Engagement, with support from the Ford Foundation, The Brenthurst Foundation, and The University of The Western Cape

This week on the Blog, Philip Walsh discusses Hannah Arendt’s critique of the consumer society that was emerging in the 1950s in the Quote of the Week. Psychiatrist and academic Thomas Szasz provides this week’s Thoughts on Thinking. And we appreciate a student’s personal Arendt library in our Library feature.

Thomas Levin of Princeton came to Bard on Tuesday to give a lecture to the Drones Seminar, a weekly class I am participating in, led by my colleague Thomas Keenan and conceived by two of our students, Arthur Holland and Dan Gettinger. Levin has studied surveillance techniques for years, and he came to think with us about how the present obsession with drones will transform our landscape and our imaginations. At a time when the obsession with drones in the media is focused on their offensive capacities, it is important to recall that drones were originally developed as a surveillance technology. If drones are to become omnipresent in our lives, what will that mean?

Levin began by reminding us of the embrace of other surveillance devices in mass culture, like recording devices at the turn of the 20th century. He offered old postcards and cartoons in which unsuspecting servants or children were caught goofing off or insulting their superiors with newfangled recording devices like the cylinder phonograph and, later, hidden cameras and spy satellites. The realization emerges that we are being watched, and this sense pervades the popular consciousness. In these mass-cultural representations of the fear, awareness, and even expectation that we will be watched and listened to, Levin finds the emergence of what he calls a “rhetoric of surveillance.”

In short, we talk and think constantly about the fact that we are, or may be, being watched. This cannot but change the way we behave and act. Levin thus poses the question: what is the emerging drone imaginary?

To answer that question it is helpful to revisit an uncannily prescient imagination of the rise of drones in a text written over half a century ago, Ernst Jünger’s The Glass Bees. Originally published in 1957 and recently reissued in translation with an introduction by science fiction novelist Bruce Sterling, Jünger’s text centers on a job interview between an unnamed former light cavalry officer and Giacomo Zapparoni, the secretive, filthy rich, and powerful proprietor of The Zapparoni Works that “manufactured robots for every imaginable purpose.” Zapparoni’s secret, however, is that instead of big, hulking robots, he specialized in Lilliputian robots that gave “the impression of intelligent ants.”

The robots were not powerful in themselves, but they worked together. Like drone bees and drone ants—that exist only for procreation and then die—the small robots, or drones, serve specific purposes in industry or business. Zapparoni’s tiny robots “could count, weigh, sort gems or paper money….” Their power came from their coordination.

The robots “worked in dangerous locations, handling explosives, dangerous viruses, and even radioactive materials. Swarms of selectors could not only detect the faintest smell of smoke but could also extinguish a fire at an early stage; others repaired defective wiring, and still others fed upon filth and became indispensable in all jobs where cleanliness was essential.” Dispensable and efficient, Zapparoni’s little robots could do the most dangerous and least desirable tasks.

In The Glass Bees, we are introduced to Zapparoni’s latest invention: flying glass bees that can pollinate flowers much more efficiently and quickly than natural bees. The bees “were about the size of a walnut still encased in its green shell.” They were completely transparent and they were an improvement upon nature, at least insofar as the pollination of flowers was concerned. If a true or natural bee “sucked first on the calyx, at least a dessert remained.” But Zapparoni’s glass bees “proceeded more economically; that is, they drained the flower more thoroughly.” What is more, the bees were a marvel of agility and skill: “Given the flying speed, the fact that no collisions occurred during these flights back and forth was a masterly feat.” According to the cavalry officer, “It was evident that the natural procedure had been simplified, cut short, and standardized.”

Before our hero is introduced to Zapparoni’s bees, he is given a warning: “Beware of the bees!” And yet he forgets this warning. Watching the glass bees, the cavalry officer is fascinated. He felt himself “come under the spell of the deeper domain of techniques,” which like a spectacle “both enthralled and mesmerized.” His mind, he writes, went to sleep and he “forgot time” and “also entirely forgot the possibility of danger.”

Jünger’s book tells, in part, the story of our fascination with, and subjection to, technologies of surveillance. On Facebook or Words with Friends, or even using our smartphones or GPS systems, we allow our fascination with technology to dull our sense of its danger. As Jünger writes: “Technical perfection strives toward the calculable, human perfection toward the incalculable. Perfect mechanisms—around which, therefore, stands an uncanny but fascinating halo of brilliance—evoke both fear and a titanic pride which will be humbled not by insight but only by catastrophe.”

The protagonist of The Glass Bees, a former member of the Light Cavalry and later a tank inspector, had once been fascinated by the “succession of ever new models becoming obsolete at an ever increasing speed, this cunning question-and-answer game between overbred brains.” What he came to see is that “the struggle for power had reached a new stage; it was fought with scientific formulas. The weapons vanished in the abyss like fleeting images, like pictures one throws into the fire. New ones were produced in protean succession.” Victory ceased to be about physical battle; it became, instead, a contest of technical mastery and knowledge.

The danger drones pose is not necessarily military. As General Stanley McChrystal rightly said when I asked him about this last week at the New York Historical Society, drones are simply another military tool that can be used for good or ill. Many fret today about collateral damage by drones and forget that if we had to send in armies to do these tasks the collateral damage would be much greater. Others worry about assassination, but drones are simply the tool, not the person pulling the trigger. It may be true that having drones when others don’t offers an enormous military advantage and makes the decision to kill easier, but when both sides have drones, we will all think twice before beginning a cycle of illegal assassinations.

Rather, the danger of drones is how they change us as humans. As we humans interact more regularly with drones and machines and computers, we will inevitably come to expect ourselves and our friends and our colleagues and our lovers to act with the efficiency and selflessness of drones. Sherry Turkle worries that mechanical companions offer such fascination and unquestioning love that humans are beginning to prefer spending time with their machines to spending it with other humans—who make demands, get tired, act cranky, and disappoint us. Ron Arkin has argued that robot soldiers will be more humane at war than human soldiers, who often act rashly out of exhaustion, anger, or revenge. Doctors are learning to rely on Watson and artificially intelligent medical machines, which can bring databases of knowledge to bear on diagnoses with a speed and objectivity that humans can only dream of. In every area of human life where humans once were thought to be necessary, drones and machines are proving more reliable, more capable, and more desirable.

The danger drones represent is not what they do better than humans, but that they do it better than humans. They are a further step in the human dream of self-improvement—the desire to overcome our shame at our all-too-human limitations.

The incredible popularity of drones today is partly a result of their freeing us to fight wars with ever-reduced human and economic costs. But drones are popular also because they appeal to the human desire for perfection. The question is, however, how perfect we humans can be before we begin to lose our humanity. That is, of course, the force of Jünger’s warning: Beware of the bees!

As drones appear everywhere around us, you would do well to put down the newspaper and turn off YouTube and, instead, revisit Ernst Jünger’s classic tale of drones. The Glass Bees is your weekend read. You can read Bruce Sterling’s introduction to The Glass Bees here.

Roger Berkowitz and Walter Russell Mead will be in conversation with General Stanley McChrystal this Sunday, March 10, at 5 pm at the New York Historical Society in Manhattan. Tickets are available here.

Leadership is rare; in politics today it is quite nearly extinct. Around the world politicians are paralyzed not simply by partisanship, but also by an unwillingness to make judgments. Not knowing what is right, they jockey for political power. They seek advantage, seemingly oblivious to any responsibility to something larger than themselves. This is now a worldwide phenomenon. Italy just elected a clown. From Athens to Washington, technocrats vie for power with ideologues, with no one willing to set the common good above their own interests. In a time of war and economic crisis when we might expect leaders to emerge, this has not happened. At least in politics, leaders have failed to materialize.

According to opinion polls, the only institution in America with an approval rating of over 50% is the military. One reason for this may be that the armed forces have continued to generate leadership, at least to some degree. Colin Powell was for some time looked to as a leader, until his performance at the United Nations damaged his credibility. David Petraeus was lionized for a time, until he was brought down by an affair and scandal. And then there is General Stanley McChrystal.

McChrystal has just written a book, My Share of the Task, which is about leadership. He begins with the bold claim that, even from his youth, “Leadership was always the objective.” And in his short epilogue, he writes: “In the end, leadership is a choice. Rank, authority, and even responsibility can be inherited or assigned, whether or not an individual desires or deserves them. Even the mantle of leadership occasionally falls to people who haven’t sought it. But actually leading is different. A leader decides to accept responsibility for others in a way that assumes stewardship of their hopes, their dreams, and sometimes their very lives. It can be a crushing burden, but I found it an indescribable honor.” McChrystal knows enough to say that leadership cannot be captured in a definition. And yet his book is undoubtedly an illustration of leadership through his own example.

I cannot help comparing McChrystal’s leadership with the political leadership of our country. As I read My Share of the Task, I am struck by McChrystal’s clear moral vision. Honor is at the very core of McChrystal’s understanding of leadership. He writes of the “unofficial code of honor” that governs West Point. While cadets could break rules and regulations and receive mere punishments, if they broke the code of honor, they were expelled. “The code existed to ensure that the words of cadets and officers alike could always, in all situations, be taken as truth. Lies, even small ones, threatened that system of truth.” There is in McChrystal’s creed a steely sense that a leader can make a difference. There is no doubt that McChrystal believes his leadership helped turn the tide of the war in Iraq. And from all accounts, he is right.

General Stanley McChrystal is widely credited with helping to bring about a renaissance in the American armed forces. A soldier for over 40 years, McChrystal served in the elite Rangers special forces, came to command the Rangers, and then rose to command the entirety of US Special Forces involved in the war on terror. In that capacity he revolutionized the special forces, taking an agglomeration of tribal groups and integrating them into a single networked fighting machine that is credited with turning the tide of the battle against Al Qaeda in Iraq. So successful was McChrystal that President Obama put him in charge of the war in Afghanistan.

It is unclear whether McChrystal’s strategy was working in Afghanistan. His command ended when he resigned, chased out of office because of an article in Rolling Stone by Michael Hastings. Hastings’s article, “The Runaway General,” opens with McChrystal asking: “How’d I get screwed into going to this dinner [with a French diplomat]?” When told it comes with his position, he responds with his middle finger and asks: “Does this come with the position?” One of his staff later adds: “Some French minister…. It’s fucking gay.” When McChrystal gets a message from Richard Holbrooke on his BlackBerry, he exclaims: “I don’t even want to open it,” and puts the phone back in his pants. An aide responds: “Make sure you don’t get any of that on your leg.” In another incident, one of McChrystal’s top aides says of the Vice President: “Biden? Did you say: Bite Me?” McChrystal knew that the fallout from the article was unsalvageable. He flew to Washington and resigned.

What makes McChrystal’s fall from command interesting is the aura of leadership that has grown up around him and that is on full view in his book, My Share of the Task. McChrystal lives the kind of Spartan full-throttle military existence that is for even our most hardened soldiers a romantic myth of times past. Leadership in his mind requires walking the walk. One “extraordinary demonstration of leadership” McChrystal recounts involves a Ranger military ceremony on a cold and rainy day. The Rangers stood in formation in the rain in front of empty bleachers. One commander, Major General Gary Luck, walked out and sat alone in the bleachers. “He didn’t wave or call out. He didn’t order us into rigid attention. He simply sat still, under the same rain that fell on us…. I never saw a commander closer to soldiers than he was at that moment.” This is McChrystal’s holy grail: to be a leader loved by his troops, not because they like him, but because they respect him.

McChrystal’s model for leadership in the modern military is, surprisingly, Admiral Horatio Nelson, the hero of the British fleet who famously lost his life aboard the Victory at the battle of Trafalgar in 1805. McChrystal argues that “Nelson’s force was able to win without him in command because of what had happened long before the first shot was fired. In the years leading up to Trafalgar, Nelson cultivated traditional strengths inherent in the British navy by making technical mastery and a capacity for independence prerequisites for command.” McChrystal’s gloss is that Nelson promoted “entrepreneurs of battle,” those commanders who shared his vision and his value of technical mastery, but who could act on their own without his authority.

In McChrystal’s own leadership we see him striving for the very entrepreneurial style he so admired in Nelson. He set up a command structure so transparent that his subordinates always saw him thinking through decisions: “As I stressed transparency and inclusion, I shared everything with the team sitting around the horseshoe and beyond. E-mails that came in were sent back out with more people added to the “cc” line. We listened to phone calls on speakerphone.” He transformed an insular and secretive special forces culture into an open and integrated force, “one of the most important nodes in an integrated network.” The result, McChrystal writes, is that his aides “could frequently anticipate my position on an issue and make the decision themselves.” His effort was to foster “decentralized initiative and free thinking while maintaining control of the organization and keeping the energy at the lowest levels directed toward a common strategy.”

Over and over, McChrystal emphasizes the way he acted on his belief that commanders must be given control and authority over their missions. He refuses to second-guess his subordinates. He seeks relentlessly to push “authority down the chain of command until it made us uneasy.” He seeks to set a “climate in which we prized entrepreneurship and free thinking, leaned hard on complacency, and did not punish ideas that failed.”

It is easy to call such a reverence for leadership a myth. And yet I am not sure it is. In the Rolling Stone article that brought McChrystal down, Hastings writes: McChrystal “set a manic pace for his staff, running seven miles each morning, and eating one meal a day. (In the month I spend around the general, I witness him eating only once.) It’s a kind of superhuman narrative that has built up around him, a staple in almost every media profile, as if the ability to go without sleep and food translates into the possibility of a man single-handedly winning the war.” Hastings is clearly skeptical. But My Share of the Task is an exceptional brief for McChrystal.

McChrystal’s book is a scintillating read. The story is part biography, part history of the Iraq war, part chronicle of the rebirth of the US military, and also a revealing account of the new US military strategy of a networked force that melds drones and other surveillance techniques with technologically superior, quickly deployed elite troops.

At its core, My Share of the Task tells the story of the rise of a networked military. The fulcrum of McChrystal’s narrative is the dual effort—led by McChrystal’s Task Force 714—first to retake the city of Fallujah after the killing and public hanging of members of a Blackwater security convoy, and then the manhunt for Abu Musab al-Zarqawi.

After the U.S. abandoned Fallujah to Zarqawi’s Al Qaeda in Iraq, TF 714 was charged with leading the effort to retake the city. The greatest challenge and need was information and reconnaissance. “Of most value,” McChrystal writes, “was our increasingly sophisticated employment of unmanned aerial vehicles (UAVs), with the Predator being the most common version.” (137) What TF 714 needed was technology, analysis and surveillance.

Drones entered US military strategy, originally, not as offensive weapons but as tools for surveillance. If conventional warfare required surveillance of large and static targets, the war to retake Fallujah “required constant surveillance of people or moving vehicles, often looking to identify subtle movements or specific mannerisms.” What McChrystal sought was a “picture of life within the city,” which came to be known as “pattern-of-life” analyses “that followed the targets’ habits as they undertook their daily routines.” (139) With drones as eyes in the air, TF 714 “watched the circles where the insurgents sat when they would gather for ceremonial meals of lamb in the compound courtyards just prior to suicide bombing missions. And we bombed those. We saw this patchwork of movement from our eyes in the clouds and rounded out the picture with increasing human and signals intelligence.” (144)

Much of My Share of the Task is a description of McChrystal’s growing awareness and commitment to a new type of war, one that was less a struggle for land or for position but was essentially a “battle for intelligence.” (156) “By the end, in the months when Iraq’s fate would be decided, TF 714’s formidable offering was its network—its ability to gel diverse talents into an organic unit that gathered information swiftly and acted accordingly.” (93) The mantra that McChrystal embraces is “It takes a network to defeat a network.” What that means is that the U.S. needed an armed forces that operated like Al Qaeda. In such a network, leadership is essential, but a special kind of leadership.

Another theme that reappears throughout My Share of the Task is McChrystal’s thinly veiled disgust at American consumerism. He makes a point of having no TVs in living quarters in Iraq and emphasizes his insistence on Spartan quarters for his operators to prevent distractions. He is upset when “fast-food restaurants and electronics sales displays” pop up around the US bases. He considers these to be a “serious distraction from the business at hand,” and thought that “attempts to replicate the comforts of home could deceive us into thinking we weren’t in a deadly fight.” Over and over we see McChrystal insisting that his forces focus on the task at hand with unswerving dedication.

Such monomania may simply be ill-suited to politics. And yet, to recall great political leaders from Gandhi to Franklin Delano Roosevelt and from Abraham Lincoln to Martin Luther King Jr. is to become aware that leadership requires, at the least, an unshakable conviction in what one hopes to accomplish.

McChrystal’s intensity is neither inhuman nor inhumane. There are a number of times in the book when McChrystal reflects on the inhumanity of this particular war. At one point he goes on a dangerous daytime raid with Rangers in Ramadi. As the Rangers arrive at their target they force civilian men to lie down on the ground. McChrystal focuses on a boy, about four years old, standing near a man who was no doubt his father. As the Rangers forced the men down, the boy, confused, lay down on the ground and “pressed his cheek flat against the pavement so that his face was turned towards his father and folded his small hands behind his back.” McChrystal tells us: “As I watched, I felt sick. I could feel in my own limbs and chest the shame and fury that must have been coursing through the father, still lying motionless. Every ounce of him must have wanted to pop up, pull his son from the ground, stand him upright, and dust off the boy’s clothes and cheek. To be laid on the ground in full view of his son was humiliating. For a proud man, to seemingly fail to protect that son from similar treatment was worse. As I watched, I thought, not for the first time: It would be easy for us to lose.”

In another passage, McChrystal reports a story about one of his operators, Chris Fussell, who once tried to make conversation on a drive. “You see one of the dogs died on the target last night?” “Fuss” asked, referring to one of the dogs that soldiers sent into houses ahead of them to check for booby traps. “Really sad,” he remarked. McChrystal snapped back: “Seven enemy were killed on that target last night. Seven humans. Are you telling me you’re more concerned about the dog than the people that died?” The car fell silent again. “Hey listen,” McChrystal said. “Don’t lose your humanity in this thing.”

My Share of the Task is a book needed in our times. It holds out a basic thesis: that leaders matter and that leaders, if they lead, can make a difference. At a time of cynicism and disillusionment in politics, we need to think again about leadership and demand of our politicians that which they at present refuse to offer. For that exemplary lesson, McChrystal’s book is to be welcomed.

For this weekend, sit down and enjoy My Share of the Task. And then, on Sunday, March 10, come join General McChrystal, Roger Berkowitz, and Walter Russell Mead at 5 pm at the New York Historical Society in Manhattan. Tickets are available here.

When people talk about the cost of entitlements or pensions, there is often a whiff of condescension, as if government employees don’t deserve their benefits. Often forgotten is the fact that private pensions are underfunded as well, and they are insured by the federal government. And now we are told that the military may have the biggest pension problem of all. Here is what the Financial Times reports:

Of all the politically difficult budget issues that Mr Hagel will face, few are more charged than the question of military entitlements which have risen sharply over the past decade. A report last year by the Center for Strategic and Budgetary Assessments concluded that at current rates, “military personnel costs will consume the entire defence budget by 2039”. Robert Gates, Mr Obama’s first defence secretary, once warned that these expenses were “eating us alive”.

Just as pensions and entitlements will soon crowd out all other government spending, so too will military pensions crowd out all military spending.

No one today can responsibly argue against pensions and health care. And no one can call the soldiers lazy burdens on the public weal. But neither can we fail to recognize that our addiction to entitlements is destroying our politics and our public spirit. We are sacrificing public action—be it the pursuit of scientific knowledge, the erecting of monuments, the education of our young, the building of infrastructure, or even a well-outfitted military—for the private comfort of individuals. It is no wonder that our political system is broken at a time when all incentives in the country lead interest groups to focus on parochial interests above the common good. It is inconceivable that this situation is not in some way related to the emergence of entitlements as the central function of government.

The question is one of principle. We have gone from a common sense that people are responsible for themselves and the government provides a safety net to a common sense that everyone should receive an education, everyone should receive healthcare, and everyone should receive pension benefits for as long as they live. It is possible to embrace the latter common sense, but with it comes a significantly higher tax burden and a much more communal ethic than has typically reigned in America. This is not a problem that hits only public employees. It is endemic throughout society. And our military.

There is probably no presidential speech more quoted in academic circles than Dwight D. Eisenhower’s 1961 farewell speech, delivered in the final days of his presidency. It was in that speech that Eisenhower warned of the danger of a military-industrial complex.

The need for a permanent army and a permanent arms industry, he warned, creates a gargantuan defense establishment that wields an irresistible economic, political, and spiritual influence. In the face of this military-industrial complex, we as a nation must remain vigilant.

In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist.

Eisenhower’s speech was prescient. Academics in particular love to point to it to criticize bloated defense spending and to urge critical resistance to military demands for more weapons and more soldiers. They are undoubtedly right to do so.

This is true even as today the military may be the one significant institution in American life where top leaders are arguing that America’s world preeminence is not sustainable. In Edward Luce’s excellent new book Time to Start Thinking, he describes how military leaders are convinced that the U.S. “should sharply reduce its ‘global footprint’ by winding up all wars, notably in Afghanistan, and by closing peacetime military bases in Germany, South Korea, the UK, and elsewhere.” The military leaders Luce spoke to also said that the US must learn to live with a nuclear Iran and “stop spending so much time and resources on the war against Al-Qaeda.” Military leaders, Luce reports, are upset that “In this country ‘shared sacrifice’ means putting a yellow ribbon around the oak tree and then going shopping.” Many military people seem to share Admiral Michael Mullen’s view that the US national debt is the “country’s number one threat—greater than that posed by terrorism, by weapons of mass destruction, and by global warming.” One must think hard about the fact that military leaders see the need for “shared sacrifice” that will shrink the military-industrial complex while Americans and their elected leaders still speak about tax cuts and stimulus.

Too frequently forgotten in Eisenhower’s speech, or even simply overlooked, is the fact that Eisenhower follows his discussion of the military-industrial complex with a similar warning about the dangers of a “revolution in the conduct of research.” Parallel to the military-industrial complex is the danger of a university-government complex. (Hat Tip, Tom Billings (see comments)). Eisenhower writes:

Akin to, and largely responsible for, the sweeping changes in our industrial-military posture, has been the technological revolution during recent decades. In this revolution, research has become central; it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.

Just as modern warfare demands a huge and constant arms industry, so too does the technological revolution demand a huge and constant army of researchers and scientists. This army can only be organized and funded by government largesse. There is a danger, Eisenhower warns, that the university-government complex will take on a life of its own, manufacturing unreal needs (e.g. a Bachelor of Arts degree in order to manage an assembly line) and liberally funding research with little regard for quality, meaning, or need. While the university-government complex is not nearly as expensive or dangerous as the military-industrial complex, there is little doubt that it exists.

Eisenhower warns of a double threat of this university-government complex. First, the nation’s scholars could be dominated by Federal employment, and gear their research to fit with governmental mandates. And second, the opposite danger, that “public policy could itself become the captive of a scientific-technological elite.”

The existence and power of just such a scientific-technological elite is undeniable today. On the one side are the free-market ideologues, those acolytes of Friedman, Hayek, and Coase, who insist that policy be geared towards rational, self-regulating, economic actors. That real people do not conform to theories of rational behavior is a problem with the people, not the theories.

On the other side are the welfare-state adherents, who insist on governmental support not only for the poor, but also for the working classes, the bankers, and corporations. The sad fact that 50 years of anti-poverty programs have not alleviated poverty, or that record spending on education has seen educational attainment decrease rather than increase, is taken to be no argument against technocratic-governmental solutions. It just means more money and more technical know-how are needed.

It is simply amazing that people in academia can actually defend the current system of which we are a part. Of course there are good schools, fine teachers, and serious students. But we all know the system is a failure. Graduate students are without prospects; faculty spend so much time publishing articles and books that no one reads; administrators make ever more (sometimes twelve times as much as full professors) and come more and more to serve as the lifeblood of universities; and it is the rare student who, amid large classes, absent faculty, and social and financial pressures, somehow makes college an intellectual experience.

The idea and practice of college needs to be re-imagined and re-thought. Entrenched interests will oppose this. But at this point the system is so broken that it simply cannot survive. On a financial level, large numbers of universities are being kept afloat on the largesse of federal student loans. If those loans were to dry up, many colleges would close or at the least shrink greatly. This should not happen. And yet, putting our young people $1 trillion in debt is not an answer. For too long we have been paying for our lifestyles with borrowed money. We are now used to our inflated lifestyles and unwilling to give them up. Something will have to give.

The current cost of a college education is unsustainable except for the very top schools, which attract the very richest students, who in turn fund endowments that allow those schools to subsidize economic, national, and racial diversity. For schools that cannot attract the wealthiest or do not have endowments that protect them from market forces, change will have to come. This will mean, in many instances, that faculty salaries will decrease and costs will have to come down. In other colleges, costs will rise and university education will become ever less accessible. Either way, the conviction that everyone needs a liberal arts degree will probably be revised.

I have no crystal ball showing where this will all lead. But there are better and worse ways that the change will come, and I for one hope that if we turn to honestly thinking about it in the present, the future will be more palatable. This is the debate we need to have.