News about our books, authors, and more.

Studying the Middle East at the height of US empire reveals the politics of academia.

by LARA DEEB and JESSICA WINEGAR

Supporters of academic boycott during the annual business meeting of the American Anthropological Association in Denver, Colorado, November 21, 2015. Photo by Alex Shams.

The consummate image of the scholarly life is one defined by the free and impassioned pursuit of ideas. We conduct research and we teach; we produce, question, and impart knowledge. Yet all of us working in colleges and universities know that the life on which we once, perhaps naively, embarked is also filled with politics, much of it quite fraught. Perhaps nowhere is this more salient today than in the field of Middle East studies. In particular, anthropological study focused on this region provides a compelling lens through which to view some of the key stakes in the political struggles of academe and their relation to broader structures of power—particularly as the region has taken center stage in US imperial ambitions.

US global engagements in the Middle East and North Africa have for decades influenced how and why people research and teach about the region.

Those ambitions have, on the one hand, precipitated significantly more interest in funding work on the region and in hiring scholars to research and teach about it. On the other hand, since at least the 1970s, academics who research or teach topics against the grain of dominant US national narratives about, and interests in, the region have faced the prospect of not having their research funded, not being hired, being accused—by parents, students, administrators, and people unassociated with academe or their campus—of bias and even treason in their teaching and public lectures, being targeted by blacklists and hate mail, and even losing their jobs.

Nicaragua and the United States are approaching the 30-year anniversaries of two periods of national reckoning that took place in the waning years of the Contra War. The conflict erupted in 1981, just two years after the Sandinista National Liberation Front overthrew the Somoza regime, a brutal family dictatorship that had ruled Nicaragua for more than forty years. Once in office, Ronald Reagan, a devout anti-communist crusader, authorized the training and funding of counter-revolutionary forces, or contras, as part of a campaign to destabilize the Sandinista state. Armed resistance spread to the Atlantic coast region, where dissatisfaction with the revolution grew among indigenous and Afrodescendant communities as a new ruling order was imposed from Managua. By the end of the 1980s, the United States would extend more than $400 million in aid to the contras, while the war and destabilization campaign would result in more than 30,000 deaths and billions of dollars in losses for Nicaragua.

What has U.S. militarization meant for the people who live in militarized places around the world?

The Contra War continued until 1990 when the Nicaraguan people removed the Sandinistas from power by popular vote. But indigenous and Afrodescendant resistance began to subside in the mid-1980s as the Sandinista state sought to reconcile the revolutionary project with these communities by recognizing their rights to land and regional autonomy. In November 1986, the state enshrined these rights in law with the adoption of a new constitution followed by the passage of an autonomy statute for the Atlantic coast region in 1987. The reforms established the framework for some of the most expansive multicultural citizenship rights in Latin America. It still took more than two decades for the Nicaraguan state to title indigenous and Afrodescendant territories. And even with formal recognition, conditions remain precarious in these territories where deforestation, land dispossession, capitalist intensification, and drug war militarization threaten community life.

Rwanda teaches students that orderly innovation is the path to national progress.

by CATHERINE A. HONEYMAN

A customer at a microfinance center in Rwanda. Photo from Trócaire. CC BY 2.0 via Flickr.

Pineapples with juice dripping down their sides, neatly tied bags of passion fruit and tree tomatoes, shiny green imported apples, golden-skinned finger bananas—at one time, the intersection close to my Kigali home was crowded with women carrying their merchandise, in wide baskets atop their heads and in woven bags slung over each arm. Near them, you could always find a young man or two selling sweets and biscuits from a cardboard box. Needed to clean the dust off your shoes before venturing into town? Someone was always carrying around packages of tissues for 100 francs each.

Rwanda is the site of one of the most extensive efforts to promote youth entrepreneurship in the world.

Once a characteristic image of street life just about anywhere on the African continent, this sort of scene has almost disappeared in Rwanda. Street businesses have been tidied up and brought into the formal market, and they are required to have a fixed and formal place of business. Prepared foods must be properly labeled and inspected for consumer safety; motorcycle taxi drivers must belong to a cooperative, wear numbered uniforms, and provide helmets; all businesses must register, obtain a license, and become part of the tax system.

These are all sensible regulations, arguably modeled on the way things work in many developed economies. And in Rwanda, they are enforced with increasing effectiveness each year. This is Rwanda’s contemporary aesthetic of entrepreneurship, of national progress: clean streets, orderly businesses, everything registered and known—an orderly and regulated form of self-reliance from the broadest policies down to the tiniest details.

Decades after the truth and reconciliation process, South Africans are still seeking justice.

by RITA KESSELRING

Elsie Gishi, a claimant in the apartheid litigation who provided testimony to the Truth and Reconciliation Commission. Photo by Rita Kesselring.

Twenty years ago, the Truth and Reconciliation Commission of South Africa held its first public hearing, and tomorrow marks eighteen years since that commission’s findings were handed over to President Nelson Mandela. The Truth Commission, which has since been replicated in dozens of other countries, was an integral component of South Africa’s post-apartheid transition to what many hoped would be a more open, freer society. Now, two decades on, a whole generation of young people has grown up in South Africa since that first hearing, and the commission has become part of the new nation’s founding myth; yet South African officials rarely stop to ponder its successes and limitations. If apartheid is mentioned in speeches today, it merely serves to provide reasons for problems the government has not been able to solve. Preoccupied with the present, with party-political scheming, elections, and scandals like the one surrounding renovations to President Zuma’s private home (financed with taxpayer money), the African National Congress—a former liberation movement and now the incumbent ruling party at the national level—is slowly losing its grip on its electorate.

The apartheid past lingers on in today’s South Africa.

While the Truth and Reconciliation Commission (TRC) was intended to be a powerful tool for restorative justice, the apartheid past lingers on in today’s South Africa. The notion of “the past in the present” suddenly pops up everywhere—a fairly new, but pervasive, feature of public discourse. The recent mobilization of South African students—from privileged universities like the University of Cape Town to the underfunded University of Limpopo in Polokwane—exemplifies this new cry for justice. Increases in tuition fees sparked student protests across the country last year (inspiring the hashtag #FeesMustFall); but the protests, still ongoing, are equally directed against the non-transformation of the entire education system, the bias in the curricula, and the financial exclusion of the majority of the so-called “formerly disadvantaged” groups. Students have not protested so loudly and decidedly since the 1976 Soweto uprising. Indeed, this and other apartheid-era protests against minority rule are today drawn on as models for current protests.

Shortly after I had started my fieldwork among Chinese elite university students, a young woman jumped to her death from a university building. Statistics around student suicides in China can be difficult to obtain, but suffice it to say that her death was not an anomaly—the phenomenon of suicide among Chinese students at elite universities is something of a public secret: generally well-known but rarely discussed in official channels. I was drawn to this paradox of why young people like her, who have made it to the top of China’s extraordinarily competitive educational system, would choose to take their own lives: What actually happens to the lucky few at the top of the pyramid?

Suicide among Chinese students at elite universities is something of a public secret.

In recent years, globally oriented educational debates have often played out in relation to the ambiguous rise of China, which has emerged as a figure of both allure and anxiety for a number of reasons. China’s system of rigorous testing is well known; the fact that Shanghai students topped the PISA tests in 2010 (tests designed by the OECD to assess scholastic performance across countries) gave rise to a sense that China is beating us on our own terms. And yet, as the West seems to be looking East to find the key to success in a brave new knowledge society, the East is looking West.

Who, precisely, is the “common” in the “common good” that the public health sector purports to serve?

by KATHERINE A. MASON

The Zika-carrying Aedes Aegypti mosquito. Photo by James Gathany of the Centers for Disease Control and Prevention. Public domain.

It has been barely a year since the Ebola outbreak in West Africa released its grip on the frightened imaginations of the Global North, and already global health officials are in the midst of another viral panic. After spreading across the Pacific to Brazil and Puerto Rico, the mosquito-borne Zika virus is expected to continue its march this summer to the mainland United States, likely making landfall on the Gulf Coast. The message from global and American health authorities is the same as it was in 2003 with the appearance of SARS, in 2006 with H5N1 avian flu, and in 2014 with Ebola: Be very afraid.

“This could be a catastrophe to rival Hurricane Katrina or other recent miseries that disproportionately affect the poor,” writes Peter J. Hotez, dean of the National School of Tropical Medicine at Baylor College of Medicine, in a recent New York Times op-ed. “If I were a pregnant woman … in an impoverished neighborhood in a city like Houston, New Orleans, Miami, Biloxi, Miss., or Mobile, Ala., I would be nervous right now.” Of course, Hotez is none of the above. So are those women nervous? Will controlling Zika end up helping them? And would Hotez listen if the affected women thought it wouldn’t? As we enter another round of emergency epidemic control, these questions are critical to ask.

Public health professionals felt it was their duty to manage this threat carefully in order to serve the “common good.” But who made up the “common”?

What I have found from my own research is that even when a global disease response appears to be effective, unintended consequences can emerge that threaten to undermine, rather than support, the long-term health and well-being of vulnerable communities most affected by the disease in question. In the urban Chinese settings I studied during and after the SARS outbreak, China’s enormous population of rural-to-urban migrant workers was seen as a public health menace capable of spreading the same dangerous diseases that threatened it. In the wake of the 2003 outbreak, Chinese public health professionals felt it was their duty to manage this menace carefully in order to serve the “common good.” But who made up the “common”?

The question of what is just is not an ahistorical one—it is answered daily in the spaces of lived experience.

by SANDRA BRUNNEGGER and KAREN ANN FAULK

"Mothers of the Plaza de Mayo." May 31 2007. CC BY-NC-ND 2.0via Flickr.

Latin America is as culturally diverse as it is geographically vast. Yet the nations of Latin America share important historical and institutional characteristics. Perhaps most significantly, countries across the region continue to grapple with the legacies of colonialism—from the classical era of Iberian colonization to the neocolonial domination enacted through economic penetration in the early twentieth century.

As the twenty-first century approached, Latin Americans found themselves constrained by the demands of international lending agencies and awash in a flood of cultural and material products made ever more readily available by multinationals striving to captivate and capitalize on the “emerging markets” opened by neoliberal reform. The region has also had to contend with the legacies of state violence and dictatorial regimes that sought to strip society of its vibrant forms of popular organization, preemptively crushing opposition and laying the foundation for the economic restructuring that was to come.

In all of these cases, the protagonists are seeking one thing: justice.

These shared processes of emergence paved the way for a diversity of forms of resistance. In the Chilean Atacama Desert, residents have undertaken a prolonged struggle for their right to groundwater. Family members of bombing victims in Buenos Aires brought a case against the state of Argentina before an international human rights body and are still working through a slow process of attempted resolution. In Colombia, some victims of political violence are turning increasingly to the courts for resolution in the wake of devastating personal tragedy, while others reject the state’s ability to fairly adjudicate their grievances and construct instead a nonstate tribunal to consider the damages they have suffered to both persons and property.

One Japanese film offers a window into the lived experience of the country’s recessionary period.

by ANDREA GEVURTZ ARAI

A still from the 2008 Kiyoshi Kurosawa film, Tokyo Sonata.

Between 1989 and 1991, the Japanese stock market fell by 60 percent; gross domestic product followed suit, declining precipitously. From the famed double-digit growth of the 1960s through the mid-1970s, and a steady 4 percent annual rate during the 1980s, economic growth dropped to 1.5 percent by the year 2000 and fell to negative rates by the time of the September 11, 2001, attacks in the United States. As Japanese banks and businesses began to fail—for the first time since the Great Depression of the 1930s—terms of financial instability and failure, like bankruptcy and restructuring, once identified in Japan with other countries like the United States and China, began to appear with regularity in the Japanese news. The financial plummet was met with domestic and international disbelief. It’s just a correction, wrote many international economists at the time; the Japanese will figure it out.

The financial downturn of the early 1990s led to a decade-and-a-half–long recession, a constricted job market for young adults, and the restructuring of the lifetime employment system.

Tokyo has always had a magical effect on me. I grew up in Hamamatsu, a mid-sized industrial city known for producing Yamaha motorcycles and pianos. In contrast to my ordinary life there as an office assistant in the mid-nineties, I found Tokyo to be extraordinary—with its splendid commercial districts, dense population, and urban sprawl.

After studying in the United States for seven years, I returned to Japan in 2004 as an anthropologist to conduct research on Tokyo’s red-light district and the host clubs where men cater to female consumers for exorbitant sums of money. While living there, I saw firsthand how rapidly Tokyo had changed since my youth in the nineties. For instance, Roppongi Hills, a 54-story mega-complex of apartments, offices, bars and restaurants, designer boutiques, galleries, and a movie theater, had just opened. Real estate developers, politicians, and journalists heralded the project as a vivid symbol of Japan’s future. Tourists took advantage of the building’s rooftop observation deck not only for its panoramic views of the city but also for the excitement and even optimism that these scenes of urban life often provoke. While visitors enjoyed the view, global investors were privy to a different vision: such urban developments are also sites of intense speculation.

On Muslim youth growing up on the front lines of nationalist politics in Denmark.

by REVA JAFFE-WALTER

Refugee children from Syria at a clinic in northern Jordan. Photo by the UK Department for International Development. CC BY 2.0 via Wikipedia.

In 1943, when other European governments watched while Jews in their countries were rounded up and deported to concentration camps, the Danes organized a nationwide effort in which Danish fishermen carried close to 8,000 Jews across the Øresund strait to safety in nearby Sweden. Similarly, in 1983, when refugees were fleeing the Iran–Iraq War and violence in Palestine, Denmark welcomed them and led Europe in having the most generous humanitarian refugee policies, offering the right to asylum, full legal rights, and the same social benefits as Danish citizens.

Today, in stark contrast, Denmark has some of the most restrictive immigration and refugee policies in Europe. These policies reflect a dramatic shift from a posture of humanitarian outreach and compassion toward refugees to one focused on the increased restriction and policing of migrants and immigrants. Danish police patrol the border and the bridge between Sweden and Denmark to prevent Syrian refugees from entering. New laws have emerged to deter migrants from seeking refuge in Denmark—such as a national law allowing the government to seize the personal assets of those applying for refugee status. Other laws, meanwhile, target Muslims already living in Denmark—including local mandates targeting school-age children, which require that pork be served in elementary school lunch programs.

Denmark has some of the most restrictive immigration and refugee policies in Europe.

Last week’s renewed debate between President Barack Obama and Republicans in the Senate reminds us how murky and poorly defined the goals and strategies of the so-called war on terror remain as it enters its fifteenth year. Nowhere is this ambiguity more apparent than in Afghanistan, the place where most of the Guantanamo detainees were first apprehended.

Beginning in 2006, I spent a year and a half working with a small group of potters in a picturesque town in the mountains north of Kabul. Even while the insurgency spread in the south and east of the country, the town, which had been leveled by the Taliban, remained staunchly in favor of the international presence. Over the course of the next nine years, however, corrupt elections, an ineffective government, and a sense that a small group of former warlords had largely taken over all the key resources contributed to a growing feeling that the international intervention had failed to fulfill its initial promises. Returning last spring, I was stopped in the grape fields below town at a roadblock set up by the Afghan Army. The soldiers lounging on their armored personnel carriers, gifts from the US Department of Defense, said that there was an ongoing operation in the villages above, to clear them of the Taliban.

The recent news coming out of Afghanistan has not been good. The UN reported that 2015 saw the highest number of civilian casualties since it began tracking them.

What the arrest of five Chinese booksellers reveals about the sexual politics of China.

by ELANAH URETSKY

The 18th National Congress of the Communist Party of China. After this 2012 meeting, an unprecedented anti-corruption campaign, spearheaded by President Xi Jinping, was launched to root out abuse of power and other excesses in the Chinese government. Public domain via Wikipedia.

In January of 1998, news leaked that President Bill Clinton had engaged in “improper” relations with White House intern Monica Lewinsky. People around the country debated whether a man of such moral character was fit to run the country. The debate carried over into congressional hearings, and Clinton eventually became the second president to be impeached, charged with perjury and obstruction of justice. He was later acquitted in the Senate, served out the rest of his term in the White House, and went on to become a popular former president known for doing good around the world.

Now imagine the same scenario in China: a president having an affair with a woman outside his marriage, news leaking to the media, and popular debates ensuing about his ability to rule the country. Having a hard time with that image? That’s probably because the Chinese government would never allow such a scenario to develop. The president may well have an affair, but the government will go to any length, as we are now witnessing, to prevent the news from being leaked.

The individual in China is still expected, first and foremost, to be loyal to the state, and representatives of the government and the Party are expected to serve as moral role models.

Last fall, five men connected with the Hong Kong publisher Mighty Current and its affiliated bookstore, Causeway Bay Books, went missing—all are now confirmed to be in police detention in China. The publishing house, which is known for releasing books critical of the Chinese government, is thought to have been working on a tell-all book that would reveal sensitive information about Xi Jinping’s love life before he became president. The Chinese government has gone to great lengths to cover up any allegations that could be revealed in the book—reaching across sovereign borders into Hong Kong and Thailand to abduct these men, two of whom now hold citizenship in European countries, and clandestinely bringing them back to China. Sure, Clinton tried to cover up his relationship with Lewinsky—he knew, after all, that admitting to it would raise eyebrows and damage his political career. But as far as we know, he never committed a crime, impinged upon the rights of individuals, or disregarded international norms to cover up or erase his indiscretion. The stakes were also a lot lower for Clinton—despite social and personal fallout, he ultimately retained his role as president. Xi Jinping or any Chinese leader, on the other hand, would certainly face the end of his career, and maybe more, upon the exposure of an extramarital affair.

Thirty-seven years ago, on February 11, 1979, my eighth birthday, Iran, my country, went through a radical shift. My family left Iran a year after the Revolution, and I have been trying either to understand what happened or to explain it ever since. My latest attempt is Last Scene Underground, an ethnographic novel of life in contemporary Iran.

What’s real? They want to know where the boundary lies (literally “lies” in a non-truth-telling sense) between fiction and non-fiction.

If the Q&A at book readings is anything to go by, when you’ve written a book that’s both a novel and an ethnography, the question on most people’s minds is: What’s real? They want to know where the boundary lies (literally “lies” in a non-truth-telling sense) between fiction and non-fiction. Ethnography and literature have in common a very fluid boundary between the real and the fictive. Even in science fiction, a writer creates a work of fiction based on his or her own understanding of human relations, impulses, and desires from lived experiences and factual knowledge that feeds the imagination.

If an anthropologist were to set up a camera and begin to record a “scene” of life somewhere, the very decision of where to place the camera frames the scene with a subjective, and therefore not fully honest, view. This does not mean that what was recorded is not real or the “truth,” but in excluding major parts of the scene, it skews and changes “reality.” If I focused my camera on one section of students in a lecture where only men were sitting, one might get the mistaken impression that only men take my anthropology class. The camera recorded “reality,” but it was not an honest representation of the class. When we social scientists translate and write about an “objective world” through our subjective positions, we may be factual but not necessarily honest about all the ways in which we are creating a new meaning or a fiction, possibly even a fantasy.

On questions of access, authenticity, and authority in ethnographic research.

by PARDIS MAHDAVI

A mural at the former US Embassy in Tehran. Photo by Ninara. CC BY 2.0 via Flickr.

Over the years, anthropology’s view of which type of ethnographer occupies the most advantageous position has shifted. During the 1970s and 1980s, it was thought that only a non-native anthropologist, or a “true outsider,” would be able to objectively study the natives of a culture. This perspective hailed from the discipline’s colonial roots and predominated during a period in which non-native anthropologists were its primary researchers. This was certainly true for the anthropological study of Iran. Prior to 1978, non-Iranian scholars of Iran such as Mary Hegland, Michael Fischer, William O. Beeman, and Erika Friedl authored many of the ethnographies that received wide circulation and attention, but shifting approaches in anthropology coalesced with the 1979 Revolution to drastically change the ethnographic terrain in Iran, raising methodological questions about access, authenticity, and authority.

Following the post-colonial turn, some wondered whether native anthropologists actually had the more advantageous position due to their “intimate” knowledge of their interlocutors, and in the years after the revolution native Iranian scholars, including Shahla Haeri and Ziba Mir-Hosseini, gained prominence within the ethnographic literature. During the Khatami era, those of us entering Iran to conduct ethnographic fieldwork were “halfies,” or “somewhat native” anthropologists; whereas the natives were Iranian-born, halfies were foreign-born but still claimed strong ties to Iran. What placed people like myself in this latter group was a combination of having some Iranian heritage (my parents were born and raised in Iran and migrated to the U.S. only shortly before I was born), a somewhat native command of Persian (for me it was my first language, despite the fact that I was raised in the U.S.), and, for many of us, Iranian passports, which were crucial in gaining entry to Iran.

Following the post-colonial turn, some wondered if native anthropologists actually had the more advantageous position due to their “intimate” knowledge of their interlocutors.

What mysterious deaths and memory struggles in Chile can teach the U.S.

by ADAM ROSENBLATT

Chilean poet Pablo Neruda recording his poetry for the Library of Congress in 1966 (public domain). New evidence may indicate that the poet did not die of natural causes.

The week after I took my children trick-or-treating on the streets of our Philadelphia suburb turned out to be a time of ghosts in another place where I once lived: Chile. On November 5, Chile’s Interior Ministry released a public statement calling it “highly probable” that Pablo Neruda, the Nobel Prize-winning Chilean poet, died of poisoning soon after the coup that ushered in Augusto Pinochet’s 17-year dictatorship, and not from the natural progression of the prostate cancer for which he was being treated at the time. If true, this would confirm years of suspicion that Neruda, like the singer Victor Jara and thousands of other Chileans, was a victim of Pinochet’s violent efforts to suppress political dissent.

Chile’s Interior Ministry released a public statement calling it “highly probable” that Pablo Neruda, the Nobel Prize-winning Chilean poet, died of poisoning.

The retelling of Neruda’s death began when the poet’s former driver stepped forward in 2011 to allege that after treatment in Santiago’s Santa María clinic for his cancer, and only hours before his death, Neruda confided that he had been given a strange injection in his stomach. The path to an official investigation—which would result in the exhumation of Neruda’s corpse from his seaside grave and posthumous travel to four different forensic laboratories in as many countries—has been contentious.

Despite the Chilean government’s bold declaration, the tests thus far don’t offer conclusive evidence. The Staphylococcus aureus bacterium found in his body has no relation to cancer, but this does not conclusively prove that Neruda died from poisoning. Further tests are still needed to clarify the origins of such a potentially lethal microorganism (healthy individuals can be perfectly asymptomatic carriers of the bug), and even then we may never know for certain. For those inclined to think Neruda died of cancer—possibly accelerated by grief at seeing his country fall into the hands of a brutal dictator—and for those who believe he must have been poisoned, there is no reason yet to significantly alter their version of history.


About the Blog

The SUP blog showcases new books and Press news in addition to serving as a forum for our authors—past and present—to expound on issues related to their scholarship. Views expressed by guest contributors to the blog do not necessarily represent those of Stanford University or Stanford University Press, and all guest contributions are denoted by a byline and an author bio.

Republishing Guidelines

If you would like to republish an article from the Stanford University Press blog, please contact us at blog@sup.org. We can secure permission to republish from our authors and provide text and other files for easy republication.

If you wish to republish an article, we ask that you credit the Stanford University Press blog as the original publisher of the material and provide a link to the source post somewhere alongside the republished content.