Monthly Archives: March 2015

“R.I.P. Chinua Achebe, Again,” Joshua Benton
“Last night, my Twitter timeline was filled with sad reflections on the life of Nigerian author Chinua Achebe, who had passed away. Very sad—but the sadness is cut by the fact that Achebe actually died two years ago.”

“In Defense of Difficulty,” Steve Wasserman
“Karl Kraus, the acerbic fin-de-siècle Viennese critic, once remarked that no nation’s literature could properly be judged by examining its geniuses, since genius always eludes explanation. A better metric is the second-rate, which is to say, the popular literature and art that makes up the bulk of what people consume.”

“The Angelus at Work,” Nathan Schneider
“In my work life now—a paragon of telecommuting flexibility from wherever and whenever—the Angelus has become a precious fixture. I mostly forget to say it, of course. But when I don’t, it bounds the beginning and end of the workday, sanctifying each, and abruptly, insistently interrupting in the middle, as if something other than the work before me matters.”

The “standard story” of American religious freedom goes something like this: The Founders, in their wisdom, introduced novel conceptions of religious liberty that ensured a secular government and equal treatment of all faiths. Ensuing generations of Americans failed to honor those principles. In the middle of the twentieth century, a courageous Supreme Court recovered the Founders’ approach. Since then, conservative religious believers have tried to undo the Court’s restorative efforts.

Steven Smith, a professor of law at the University of San Diego, isn’t buying that story, and in his latest book, The Rise and Decline of American Religious Freedom, he explains why. For most of our history, Smith argues, our country largely abided by an “American settlement” for religious pluralism that included separation of church from state (but not of religion from government) and freedom of conscience. But the mid-twentieth-century Supreme Court altered the “American settlement” and thereby placed “religious freedom in jeopardy.”

“Purple Reign,” Chris Lehmann
“What Mayer is pleased to call [Yahoo’s] stable of ‘digital magazines’ is, in reality, the barest of fig leaves for an orgy of sponsored content—i.e., copy commissioned, inspected, and (increasingly) edited by advertisers, and misleadingly packaged as reliable, independent journalism in order to win eyeballs and reader trust.”

“Advertisers Should Pay You,” Thomas R. Wells
“If advertisers had to negotiate directly with you, or at least your software agent, then they would have to start paying a price that would not leave you feeling violated. And at that price they would want to buy much less of your attention than they do at present.”

“Tolstoy Replays History,” Andrei Zorin
“Both Darwin and Marx presented their books to the reader not only as scientific discoveries, but as an important stage in their personal biographies. In the same manner Tolstoy was attempting a total explanation of the current state of Russia that had to be at one and the same time a panoramic historical reconstruction and an intellectual autobiography.”

One year ago today, Russia annexed Crimea as part of what has become an ongoing war in eastern Ukraine. Writing in our spring issue, John Owen and William Inboden observe:

Although it has met with sanctions and other gestures of Western disapproval, Putin’s barely covert conquest (which the US government steadfastly refuses to call an invasion) plays well among his own people, and it will likely provide the Russian president with sufficient leverage to keep Ukraine from entering the EU or NATO. More ominously, it suggests how Putin may continue to behave in Russia’s near abroad, consolidating Moscow’s influence by creating further “frozen conflicts” in Russian ethnic enclaves such as those in Moldova and Georgia or by more brazenly undermining neighboring governments and seizing their territory.

For many in the West, the troubling events in Ukraine have raised the specter of a new cold war. A more apt and even more unsettling parallel comes from 1938. In that year, ethnic Germans in the Sudetenland region of Czechoslovakia, stirred up by Nazi agitators, called for unification with Germany. With the acquiescence of Britain and France, Germany annexed the Sudetenland (having already absorbed Austria earlier in the year). The story ended, of course, as badly as any ever has: In March 1939, Adolf Hitler took the rest of Czechoslovakia, and in September invaded Poland. The move against Poland triggered the Second World War, the most destructive armed conflict in human history, a catastrophe far worse than the one Britain and France had sought to avert by appeasing Germany.

The similarities between 1938 and 2014 are not lost on Europeans today, particularly those in countries that once were part of the Soviet Union or were its satellites in Central and Eastern Europe. Reaction from Western Europe and North America was more cautious, but the alarm is unmistakable.

Owen and Inboden go on to explore the arguments of certain Western scholars who treat Putin’s aggressions as a reasonable and realistic exercise of national self-interest:

The point of comparing the academic realists of the 1930s with those of 2014 is, rather, to show that the thinking of both groups suffers from the same error: reducing international politics to nothing but a power struggle. Use of the term realism is significant here. For [Edward Hallett] Carr, realists are the ones who see things as they are; ideas about justice or welfare are really just contrivances of self-interest, signifying nothing. Utopians are those who mistakenly think that justice and welfare are more than words covering self-interest. Power is real, and all else is illusion.

John M. Owen IV is Ambassador Henry J. Taylor and Mrs. Marion R. Taylor Professor of Politics at the University of Virginia and Faculty Fellow at the Institute for Advanced Studies in Culture. William Inboden is Associate Professor at the LBJ School of Public Affairs and Executive Director of the Clements Center for History, Strategy, and Statecraft and Distinguished Scholar at the Robert S. Strauss Center for International Security and Law, all at the University of Texas-Austin.

. . . . . . . .

Like The Hedgehog Review on Facebook, follow us on Twitter, and subscribe to our posts via RSS.

This year, America observes the fiftieth anniversary of many transformative events. Some are to be celebrated; others are to be mourned. Earlier this month, President Barack Obama commemorated the march in Selma that helped secure the passage of the Voting Rights Act. This year also marks the fiftieth anniversary of the assassination of Malcolm X, the landing of the first American combat troops in Vietnam, and the signing of the Social Security Act Amendments. Yet in the midst of these anniversaries, it is important to remember an event that transformed America and affected the lives of many around the world: the signing of the most significant immigration law in US history, the Immigration and Nationality Act of 1965.

Promoted by the then-freshman senator from Massachusetts, Ted Kennedy, the bill ended up receiving more congressional support from Republicans (87 percent) than Democrats (74 percent)—a fact that reflects how different the partisan landscape was then from what it is now. The bill passed largely because its proponents claimed, in the words of Senator Kennedy, that “the ethnic mix of this country” would not be “upset” by its passage. Just before its signing, President Lyndon B. Johnson felt the need to reassure any doubters: “This bill that we will sign today is not a revolutionary bill,” he said. “It does not affect the lives of millions. It will not reshape the structure of our daily lives or add importantly to either our wealth or our power. Yet it is still one of the most important acts of this Congress and of this administration.”

And so the bill was signed on October 3, 1965, on a sunny day in the shadow of the Statue of Liberty. President Johnson was right that the law was one of the most important acts of that Congress and administration. But he and the bill’s other proponents were entirely wrong about its consequences. The ethnic mix of this country has been transformed, and the nation’s wealth and power have soared as a result.

Up until 1952, almost every immigration law enacted in the United States was written to favor white Protestants of European descent to the exclusion of virtually all other groups. The first immigration law in America, passed in 1790, allowed only “free white persons of good character” to become citizens. It took another eighty years before “aliens of African nativity and persons of African descent” would be granted citizenship, and another ninety-five years before a federal law allowed them to exercise their right to vote. Before 1965, laws such as the Chinese Exclusion Act (1882), the Naturalization Act (1906), the “Asiatic Barred Zone Act” (1917), and the National Origins Act (1924) were enacted to bar non-European immigrants. It was not until the Immigration and Nationality Act of 1952 that race was eliminated as a reason for exclusion, and not until 1965 that the nationality quota system, which benefited northern Europeans, was abolished. The Immigration and Nationality Act of 1965 allowed immigration based on family reunification and skills. In other words, US citizens could sponsor their relatives to immigrate to the US, and individuals with special skills in such fields as science, medicine, and technology could also apply to immigrate.

The most significant consequence of this law was the skill, talent, and sheer brainpower that it brought to the country. President Johnson couldn’t have been more wrong: The strength and wealth of the nation were greatly increased by the influx of talented, hard-working immigrants who were allowed to enter. And while their arrival on these shores caused a serious brain drain in other countries, it benefited the United States immensely.

Today there is hardly anyone in America (or, arguably, in the world) whose life has not been touched by post-1965 immigrants. HIV-positive patients around the world can manage the virus through antiretroviral therapy thanks to the work of Dr. David Ho, an immigrant from Taiwan who was named Time’s Man of the Year in 1996 for his pioneering work on HIV/AIDS. Satya Nadella, the chief executive officer of Microsoft, was born in India. Sergey Brin, co-founder of Google, immigrated from the Soviet Union with his parents, who were scientists. Jerry Yang (co-founder of Yahoo), Steve Chen (co-founder of YouTube), and the entire founding board of PayPal (including Elon Musk, Luke Nosek, Ken Howery, Peter Thiel, Max Levchin, and Yu Pan) went on to start companies such as Tesla Motors, LinkedIn, Palantir Technologies, SpaceX, YouTube, Yelp, and Yammer. Fifty years ago, it would have been illegal for some of them to immigrate to the United States; today they lead some of the most profitable and influential companies in America.

Since the passage of the Immigration and Nationality Act in 1965, only one significant piece of immigration legislation has been passed: the Immigration Reform and Control Act of 1986. Signed into law by Ronald Reagan, it granted legal status to anyone who had entered the country before 1982 as well as to migrant farm workers. But in the last three decades, there has been no major legislation on immigration, even though the country and the world have changed dramatically. As the nation prepares to enter a new round of debates on immigration reform, it behooves Americans to reflect on how different their country, and the wider world, would be if that bill had not become law fifty years ago.

Tony Tian-Ren Lin, Ph.D., is a Research Scholar at the Institute for Advanced Studies in Culture.

. . . . . . . .


In his new book, Elites: A General Model, sociologist Murray Milner, Jr. puts forward (according to Jeffrey Alexander of Yale University) “the first really new theory of elites in many decades.” Milner, a senior fellow at the Institute for Advanced Studies in Culture, systematically analyzes the roles of economic and political elites. Unlike most work on elites, however, he draws special attention to status elites, including modern celebrities. The power of status elites is not primarily political or economic. Below is an adaptation of some of Milner’s thinking on status elites, and why they matter.

Within the literature on elites, status tends to be relatively ignored or neglected in favor of economic and political power. (By status I mean something like prestige, rank, honor, or dishonor—the accumulated expressions of approval and disapproval directed toward an actor or object.) It’s true that “status elites,” as I’ll call them, are in some respects residual, and tend to thrive in arenas where political and economic power are not particularly valued. They derive their power from extraordinary levels of beauty, bravery, knowledge, virtue, or eloquence.

Status is relatively inalienable—it is not easily transferred or appropriated—and inexpansible, so upward mobility is carefully restricted. Norms are made complicated in order to make it difficult to gain status: Brahmins create elaborate rules about purity and pollution; upper classes are often distinguished by accent, demeanor, and style; and scholars like myself must acquire erudition to be taken seriously.

Another source of status is association: Associating with those of high status raises your own status, while associating with those of low status lowers it. This is especially the case with respect to intimate expressive relationships; eating and romantic relationships are key forms of intimacy. Hence, upper castes cannot eat with or marry those of lower castes, though they can interact with them in work situations. Teenagers care about “who eats with whom in the lunchroom” and who “goes with” or “hooks up with” whom, but are less concerned about who sits next to them in class—especially if seats are assigned by the teacher.

Status elites can assume more than one form. They can be religious leaders, or actors, or journalists, or simply “famous for being famous.” Helen Hayes, Laurence Olivier, Frank Sinatra, James Thurber, Ayn Rand, Harriet Beecher Stowe—all are elites who draw their prestige from their cultural or ideological abilities. There are also elites who are celebrities or pop stars, such as—today—Beyoncé, Kanye West, Taylor Swift, and reality TV stars such as the Kardashians.

As societies become larger and more complex, it is very difficult for elites to be socially visible by physical presence—which doesn’t matter if you are seeking to become rich, but does matter if you want to become Beyoncé, for whom social visibility is a prerequisite to rank and power. Even the largest stadiums hold only a small percentage of the population of contemporary societies. Status must be created through the media—especially the mass media, which has created the modern-day celebrity.

. . . . . . . .


“Democratic Romanticism and Its Critics,” Mark Schmitt
“The idea that American democracy should be more transparent and more inclusive, that it should put the broad public interest ahead of partisanship or local or private interests, is so benign that it’s hard to find a coherent argument against those aspirations. Who speaks for partisanship, patronage, corruption, or secrecy?”

“In Defense of Doing Wrong,” Ben Wizner
“One of my ACLU colleagues, who’s a very fierce privacy advocate … emailed me the other day and said she was sick of talking about surveillance and democracy and liberty. She thought it was time for us to talk more about drugs and porn and adultery and gossip.”

“Confessing and Confiding,” Emily Fox Gordon
“The trauma narrative mode had long been in the ascendant, of course, both in the literary world and in the culture, long enough to have weathered decades of satirical assaults and earnest opinion pieces calling into question the narcissism at its core.”

“Death to Death Row,” Lucy Hughes-Hallett
“Lehrfreund and Jabbar are the executive directors of the Death Penalty Project (DPP), a charity that provides free legal representation to those condemned to death. Personally, both would like to see capital punishment abolished everywhere, but they don’t march in the streets waving banners. They don’t harangue politicians. They don’t barge in where they’re not wanted. They use the law to change the law.”

“Bot or Not?,” James Gleick
“Because the Twitterverse is made of text, rather than rocks and trees and bones and blood, it’s suddenly quite easy to make bots. Now there are millions, by Twitter’s own estimates—most of them short-lived and invisible nuisances. All they need to do is read text and write text. For example, there is a Twitter creature going by the name of @ComposedOf, whose life is composed of nothing more than searching for tweets containing the phrase ‘comprised of.'”

. . . . . . . .


In his new book, Organizing Enlightenment: Information Overload and the Invention of the Modern Research University, literary historian Chad Wellmon, a faculty fellow at the Institute for Advanced Studies in Culture, argues against those who claim that the research university is an outmoded, bureaucratic institution ripe for disruption. Recounting the emergence of the research university in another era of media excess, this one driven by print, he focuses on what has always distinguished the research university—an ethics of knowledge. And this, he claims, is needed now more than ever. Here is an excerpt from the afterword of his book:

Misgivings about specialized science and disciplinarity have returned in recent jeremiads about the research university from within its most elite ranks. Harvard professor Louis Menand writes that the “structure of disciplinarity that has arisen with the modern research university is expensive; it is philosophically weak; and it encourages intellectual predictability and social irrelevance. It deserves to be replaced.” Similarly, CUNY professor Cathy Davidson has criticized the research university as an “archaic, hierarchical, silo’d apparatus of the nineteenth century.” Our institutions of higher learning have “managed to change far more slowly than the modes of inventive, collaborative, participatory learning offered by the Internet” and other online and digital technologies. Unlike some of the more general critiques of the university’s disciplinary structure, however, Davidson’s critique is more focused on what is actually at stake. Our universities are “stuck,” she writes, “in an epistemological model of the past.” Our digital age entails not just new and better technologies but an entirely different notion of what constitutes true knowledge: how it is produced, authorized, and disseminated. The disciplinary organization of knowledge is antiquated and dispensable. The very structures and forms of knowledge are changing, and, for Davidson at least, the disciplinary research university is being left behind.

In her more recent work on the future of education, Davidson embraces the potential of digital technologies to undo the authority structure of the research university and spur “collaborative” forms of knowledge production. And yet, in what she describes as a “field guide and survival manual for the digital age,” her Now You See It: How the Brain Science of Attention Will Change the Way We Live, Work, and Think, she relies on that same authority structure she seems eager to escape. She bases her “guide” for the digitally perplexed on what she calls “the science of attention.” She grounds her argument in the authority of modern, disciplinary-based science as she cites study after study, all of which are legitimated by the authority of the disciplinary order of the modern research university.

Davidson’s bad faith is a testament to just how enduring a system the research university ethic is. But it has endured not because it was a rigid, hierarchical system, a Weberian iron cage, a Foucauldian panopticon, but rather because it has sustained communities of people engaged in a common pursuit. Research universities have never overcome the fragmentation of knowledge or realized anything like a universal knowledge. But what they have done is organize intellectual labor, traditions, and desires more effectively over the past two hundred years than any other technology. To dismiss the research university as an antiquated bureaucratic “apparatus” defined by constraint and enforceable standards is to overlook the ways in which its continuity and stability depended on the transformation of actual people….

At this particular moment of technological and institutional change, we need motivating ideals to orient our institutions and ourselves. The idea of the research university is more than its bureaucratic structures. However haltingly, the research university embodies ideals and virtues that scholars both inside and outside the university hold dear. This is where primarily structural accounts of the research university as simply a bureaucratic system, seemingly lacking human agents who endow it with meaning and life, can offer no compelling vision for a future research university. These cool, distant accounts of the research university, so redolent of Weber’s description of any other modern, rational system, see nothing at stake, just the inexorable logic of another modern bureaucracy. They overlook the persons and norms that have always been the core of the research university. Anthony Grafton describes this attitude best: the “loss of patience, or faith, or interest in specialized knowledge” is ultimately a capitulation to the absoluteness of the bureaucratic system of the contemporary research university. Such an attitude belies a thoroughly structural account that omits the research university’s most basic feature: its underlying ethic. These more radically functional accounts, however descriptively illuminating, can never answer a basic question: why would anyone choose to devote herself to specialized knowledge and an institution such as the research university? The research university reproduces itself by forming people into its culture. Its survival relies on the decisions of actual people, not simply on the abstract totalizing mechanisms of an institution. Advocates of the contemporary research university need to recognize and embrace its most central feature: the fact that it embodies a set of norms, practices, and virtues central to modern knowledge. 
Whatever its myriad failings and bureaucratic functions, the research university sustains what scholars hold in common and commit themselves to—an ethics of knowledge.


Who We Are

The Hedgehog Review is an intellectual journal concerned with contemporary cultural change, published three times per year by the Institute for Advanced Studies in Culture at the University of Virginia.