Friday, November 29, 2013

Alan Mikhail is Professor of History at Yale University. He is a historian of the early modern Muslim world, the Ottoman Empire, and Egypt whose research and teaching focus mostly on the nature of early modern imperial rule, peasant histories, environmental resource management, and science and medicine.

Mikhail is the author of Nature and Empire in Ottoman Egypt: An Environmental History and editor of Water on Sand: Environmental Histories of the Middle East and North Africa.

Page 99 finds us in Egypt’s dogdom. It relates how changes in the social function of dogs led to changes in human attitudes toward the animal. Dogs in Cairo were historically what pigs were in New York City—sanitation workers. They consumed the city’s waste, thereby forging a productive social and economic niche for themselves among human communities. At the turn of the nineteenth century, more and more people relocated to Cairo, ideas about disease radically changed, garbage was moved outside the city’s walls, and the metropolis underwent a massive campaign of construction and urban transformation. These and other phenomena all dealt a critical blow to dogs. Humans now saw them as interspecies competitors for space, mangy potential disease vectors, useless noisemakers, and threats to ideas of civilization emerging in the period. The end result? Widespread dog eradication campaigns in Cairo (and other Egyptian cities) in the first third of the nineteenth century. For the first time since the founding of Cairo in the seventh century CE, dogs were purposely separated from humans.

This separation of species is one of the overarching processes followed in The Animal in Ottoman Egypt. In its three parts, the book traces how three classes of animals were actively separated from human animals through different historical mechanisms and forces. The three classes of animals are domesticated laboring animals, dogs, and charismatic megafauna. The interspecies separations the book examines all occurred between 1750 and 1850. This was a period in which Egypt, like many other parts of the world, experienced massive social, political, environmental, and economic changes. Egypt became increasingly autonomous as a province of the Ottoman Empire, beginning a move toward independence. Its cities grew in size. Its economy became increasingly enmeshed in global commercial networks and capitalist modes of exchange. The Animal in Ottoman Egypt argues that what happened to Egyptian animals’ relationships to humans was part and parcel of these other transformations.

To be specific—the story of domestic work animals explains fundamental changes in Egypt’s labor regime that saw humans replace animals as the preferred and dominant means of labor in the countryside. Page 99 and its neighbors elucidate how the history of Egypt’s dogs explains wider changes in understandings of urban sanitation, health, and the human and animal body. And the changing social, economic, and political roles of charismatic megafauna track Egypt’s shifting position in the nineteenth century’s global capitalist economy.

Wednesday, November 27, 2013

Christopher Davidson is Reader in Government and International Affairs at Durham University, a former visiting associate professor at Kyoto University, and a former assistant professor at Zayed University in the UAE. He is the author of several books on the politics and international affairs of the Gulf states, including Abu Dhabi: Oil and Beyond, Dubai: The Vulnerability of Success, and The Persian Gulf and Pacific Asia: From Indifference to Interdependence.

…competitive research grants—it is likely that it will hope to get more from the same pot in the future. In these circumstances junior members of staff or postgraduate students tend to feel uncomfortable discussing either the source of the funding or pursuing sensitive topics relating to the donor country. It is almost inconceivable, for example, to imagine an academic with no alternative source of income researching and writing a serious critique of a regime that has either paid for his or her salary, scholarship, or the building that houses his or her office. In many leading universities this is now no longer a possible scenario, but instead a likely one.

In addition to promoting self-censorship, the donations also tend to encourage the steering of academic debate away from the Gulf monarchies themselves—and especially studies on their domestic politics or societies—by instead promoting research on ‘safer topics’ in the broader region or on Arabic language or Islamic Studies. Indeed, the latter two fields are particularly palatable as they provide further support for the monarchies’ attempts to build up cultural and religious legitimacy resources. In Saudi Arabia’s case the funding of leading Islamic Studies centres also seems to be part of an effort to make the Saudi state’s highly controversial interpretation of Islam more ‘mainstream’ and acceptable, at least in scholarly and government circles. What all of this will soon lead to (and in some cases already has led to) is an academic discipline that carefully skirts around the key ‘red line’ subjects such as political reform, corruption, human rights, and the prospects of revolution—as these are usually perceived by university fundraisers and executives as likely to anger or antagonise their Gulf patrons. As such, this particular stream of funding is in some ways an even more powerful and sensitive soft power strategy for the Gulf monarchies, as it is not primarily aimed at influencing public or even government-level opinion in the West. Rather its more subtle objective is to sway academic opinion in the West, or at the very least foster a ‘chilling atmosphere’ of apologetic behaviour or avoidance when it comes to intellectual discussion of the Gulf monarchies.

The historic links between Britain and the region have meant that the Gulf monarchies have been particularly attracted to funding British universities, and these currently represent the best examples of the strategy. Indeed, it is now difficult to find any leading British institution focusing on the Middle East that has not received all of the varieties of gifts. Exeter University, home to Britain’s only centre for Gulf Studies…

Page 99 of After the Sheikhs is the beginning of a section on how ruling family members and their affiliated foundations have sought to channel huge donations into those departments of leading western universities that have historically taught on or done research on Middle East politics or Islamic studies. The Gulf monarchies’ aim, as part of a broader oil-financed soft power campaign in the opinion-making centres of their Western military protectors, is to encourage self-censorship among potentially critical academics and in general to steer debates away from controversial topics such as domestic Gulf politics, human rights, or civil society, while also normalizing the controversial Saudi interpretation of Islam within elevated intellectual circles.

Tuesday, November 26, 2013

Thomas Suddendorf is a professor of psychology at the University of Queensland whose research has attracted honors and awards from such organizations as the American Psychological Association, the Academy of the Social Sciences in Australia, and the Association for Psychological Science. His work has been covered by the New York Times, Discover, and Science, among other outlets. Born and raised in Germany, he lives in Brisbane, Australia.

In The Gap I explore why we are the peculiar creatures that we are. What differentiates us from other animals and enabled us to dominate the planet? About half of the book examines the most common proposals about what sets us fundamentally apart from the rest: language, foresight, mind-reading, intelligence, culture, and morality. I find that various animals, in particular our closest animal relatives the great apes, have sophisticated capacities in even these domains. Nonetheless, in each of these contexts the human ability is special for recurring reasons. In particular, two profound characteristics keep re-emerging as critical: our deep-seated drive to exchange our thoughts, and our ability to imagine alternative scenarios (be they about past, future or entirely fictional events) and embed them into larger narratives.

Page 99 illustrates these two characteristics in the context of our capacity to travel mentally in time. As you well know, we can think about how events unfolded or what the future might hold. By comparing alternative routes to the future and deliberately selecting one plan over another we gain a sense of free will and an edge over creatures with less foresight. However, this also burdens us with the responsibility for getting it right. We are not clairvoyants. Constructing clever scenarios of the future is a complex skill that draws on many components (I compare it to what is involved in putting on a play in a theatre) and on page 99 I highlight how we frequently get it wrong — as well as how our urge to connect our minds was critical in turning this fallible system into a powerful adaptive strategy that harnesses our collective wit.

…spectacular miscalculations can be found among the annual winners of the Darwin Awards. We will never know what exactly a wheelchair-driving 2010 winner was anticipating would happen when, after missing a closing elevator, he decided to impatiently ram the door until it broke down—only to fall into the now-empty shaft. Most of our foresight errors are minor by comparison, leading to inconvenience or embarrassment. You may fail to usefully imagine the future because of shortcomings in any one of the components in the theater metaphor. Stage: you may fail to disengage from the present to imagine the future—perhaps like that wheelchair driver. Actors: you may miscalculate how others will feel or act—as so often happens in pranks. Set: you may misjudge physical relations—say, when you think the boat could surely take a much heavier load. Playwright: you may fail to generate the relevant scenarios—and you later have to admit that you didn’t think of this or that. Director: you may not have practiced for the future sufficiently—leading you to look distinctly underprepared. Producer: you may end up selecting the wrong plan—d’oh! There are countless ways in which our attempted foresight can let us down.

However, we have radically improved our chances of getting it right through a wonderfully effective trick: we share our plans and predictions with others. We can transmit our mental plays and reflections to audiences around us and, in turn, consider their thoughts. In preparing a speech, it can be helpful to rehearse it not only in our mind but also in front of a friend. We can learn from others’ memory and foresight, and listen to comments on ours. Indeed, we have a deep-seated drive to broadcast our minds and to read what is on the minds of others—to foreshadow the next chapter. And we have an extraordinarily effective way of exchanging our mind travels through language—to remind you of the previous chapter. Language is ideally suited for this mental exchange, and much of human conversation is indeed about past events (who did what to whom, and what happened next) and future events (what will happen to whom, and what we are going to do about it). By exchanging our experiences, plans, and advice, we have vastly increased our capacity for accurate prediction. In Stumbling on Happiness the psychologist Dan Gilbert discusses errors and biases in our foresight and argues that the most reliable way to predict a situation is to ask people who have experienced something similar. Indeed, for much of our past the stories of our fellow tribespeople would have been all we had to go by.

Page 99 brings us to a meandering passage in A Week on the Concord and Merrimack Rivers in which Henry David Thoreau reviews his famous brush with the law and night in jail in 1846. He goes on to quote (and translate) Antigone’s defiance of King Creon’s death sentence as illegitimate under divine law. Thoreau learned his Greek at Harvard, but in his allegiance to timeless law he was dissociating himself from a view of ethics he had encountered as a student. He kept his copy of William Paley’s Principles of Moral and Political Philosophy in his library but rejected its preference for expediency and public “conveniency” in ethical decision making. He upheld instead the call of conscience and the eternal claims of justice: “This people must cease to hold slaves, and to make war with Mexico, though it costs them their existence as a people.”

It will be no surprise to find discussion of Thoreau in a book devoted to exploring an American tradition of civil disobedience from the colonial period to the present. The surprise may be that it has taken more than ninety pages to get there. That is because I emphasize the role of predecessors – colonial protesters against religious injustice, missionary champions of American Indian nations, fugitives from slavery, benevolent women reformers – in creating a tradition. Thoreau’s greatest significance came in the 20th century, after he had been discovered by Mahatma Gandhi in India and South Africa and cited in defenses of civil disobedience by a succession of American dissenters and activists from the 1930s on. His influence would eventually be deplored by leaders who worried about the instability of democracy, while it was glorified by dissenters from national policy, and adopted in defense of activist law breakers in courtrooms and popular books.

Perhaps it’s appropriate that Page 99 of Alternate Histories of the World is a full-page picture. My book alternates between over 90 full-page images of monsters, robots, and zombies rampaging through human history, and accompanying pages of descriptive text. It’s also a book that started from my online Alternate Histories store (AlternateHistories.com) which is entirely image-based.

Page 99 shows a recruiting poster for the Canadian Mounted Rifles “Monster Corps” (Canada’s Crack Calamity and Colossus Corps) - “[Canada’s] First Line of Defence Against Monsters, Zombies, Aliens, and Other Unearthly Creatures.” In the picture an officer is seated on his horse, using field glasses to spy on a UFO destroying a small town.

In the book, this Canadian Monster Corps plays a significant part in explaining why we don’t have monsters & zombies running around in modern times. I posit that the Canadians in the 1920s started this Monster Corps program and staffed it with battle-hardened veterans of World War I. These officers, later assisted by Mechanical Man Philip J. Gearsworthy, set up the beginnings of protocols designed to deal with outbreaks of the zombie virus, gigantic monsters smashing tall buildings, and robots rising up to destroy their creators.

In a later page of the book (pp. 104-105), I discuss the United States’ similar but more extensive efforts after World War II, with the creation of the S.M.A.S.H. (Stop Monsters And Save Humanity) Squads, who used these Canadian techniques and combined them with American technology, industry, and that can-do spirit. Eventually they managed to essentially wipe out the scourge of monsters, contain the zombie virus, and hold back alien invading forces.

The rest of my book deals with these monsters & creatures running rampant through our past, and these two pages are a feeble (if hopefully amusing) attempt to explain why you, the reader, were not aware that a robot helped to write the Declaration of Independence. Better to think of the book as a product of another universe, one where supernatural creatures roamed freely. An Alternate History, if you will.

Friday, November 22, 2013

Susan Carle teaches legal ethics, anti-discrimination law, labor and employment law, and torts at American University Washington College of Law. She writes primarily about the history of social change lawyering, anti-discrimination law, and topics at the intersections between civil rights, employment, and labor law. In the past she has been a community organizer, civil rights lawyer, and union-side labor lawyer.

It takes courage to apply the page 99 test to one’s own work with honesty, though honesty may be the most important virtue of a writer. So I will try: Defining the Struggle tells the story of the civil rights leaders of the late nineteenth and earliest years of the twentieth century who started to develop the ideas that would lead to the rich and multifaceted campaign for racial justice in the United States, which eventually grew into the later twentieth-century civil rights movement. As a legal historian, I wanted to convince other legal historians of the importance of this largely overlooked “prehistory” of legal civil rights organizing, but I also desperately wanted to write a book that would be accessible and interesting to a more general readership. Did I accomplish this? I believe the page 99 test shows I did, in part. That page begins a discussion of the founding of the longest-lasting national civil rights organization of the turn-of-the-century period: the National Afro-American Council, which lasted for a decade before essentially merging with other efforts to become the NAACP. It describes the Afro-American Council’s founding in the midst of a celebration of Frederick Douglass in Rochester, New York. Dignitaries in attendance included the famous suffragist and good friend of Douglass, Susan B. Anthony, thus reflecting the connection some of the more visionary leaders of this period (though not all) saw between the campaigns for racial equality and for women’s rights. Also active in the proceedings was the now-famous anti-lynching crusader Ida B. Wells. Wells was known for her blunt speaking style, and she shared her ideas about who would be best to lead this new organization; her thoughts were heeded in the appointment of the savvy organization-builder Alexander Walters, the youngest-ever bishop of the African Methodist Episcopal Zion Church, to head this new organizing effort.
The “standard story” holds that the African American civil rights leaders of this period were almost all descendants of a free black elite, but this hardly describes the backgrounds of either Wells or Walters, who were both born enslaved and moved into national leadership roles despite very modest economic and educational backgrounds.

Does this snippet whet one’s appetite for more? The answer probably depends on one’s personal taste for historical accounts that seek to revise standard stories and settled judgments about who counts as a historical figure of enduring importance. Thus page 99 is a terrific litmus test: the book will appeal to readers interested in the kind of story it tells, but the writer’s talents alone may not win over a reader who lacks an underlying enthusiasm for stories about history’s forgotten heroes.

Thursday, November 21, 2013

Ronen Shamir is Professor of Sociology and Anthropology at Tel-Aviv University and author of The Colonies of Law: Colonialism, Zionism and Law in Early Mandate Palestine (2000) and Managing Legal Uncertainty: Elite Lawyers in the New Deal (1996).

What does it mean to be 'connected'? Or 'wired'? Page 99 is part of the third chapter of a book which aspires to theorize electricity consumers as sociological types (and internet or cellular consumers as well, for that matter). Page 99 invites us to think of the seemingly mundane electric meter. This age-old device – beyond its official role as a reader of electric consumption measured in kilowatt-hours – is an object which creates a boundary between the public domain ('the main distribution system') and the private domain (private premises and households). It is a foundational assembly, one among many such micro-sites where the distinction between the public and the private is performed and affirmed. Page 99 also notes that this division allows for stratification: the main distribution system is nominally egalitarian; once in place, it offers equal opportunity for all to 'connect' and consume electricity. The reading of meters, where the private sphere lies, is where asymmetry begins. It facilitates the ability to measure and compare different amounts of consumption and to classify types of consumers (big, small, domestic, commercial, etc.).

Now consider the sociological status of the consumer in and on such assemblies. Consumers are at once subjects with contractual rights and obligations, and objects which function as crucial contact-points for the circulating electric current. On the one hand, the consumer is the product of an “objective” material connection to a grid; on the other hand, this materiality depends on one’s “subjectivity”: a willing attachment to the grid. The constitution of the electric consumer, then, may be better captured by considering the irreducibility of social identities on and in grids: at once elements attached to a network and actors whose connection allows such networks to circulate their 'materials'. But of course, in order to understand more about the social properties of electric grids and the social divisions they create, you need to read more than one page!

Page 99 of What’s Wrong with the Poor? finds us in the middle of a discussion of early intervention programs, and it begins by introducing Project Head Start. This is particularly apt, as Head Start is one of the best-known legacies of the War on Poverty and, as I argue in the chapter, was based on theories of deprivation. The debate over Head Start demonstrates how theories of mental health and development were utilized in discussions of poverty, its causes, and its prevention. In particular, these theories highlighted the role of deprivation, focusing on what it was that poor men, women, and children lacked.

Throughout the book, I look at different theories of deprivation, which focused on a wide range of experiences mental health experts believed low income children were lacking. These perceptions were deeply stereotyped. Parents were described as non-verbal, mothers failed to adequately stimulate their children, homes were drab and colorless, and there were no books or educational toys for the children to play with. Low income children were seen as lacking the necessary stimuli for mental and psychological development, and hence early intervention programs were designed to provide that which these children lacked in their homes.

Project Head Start set out to combat these deficiencies. While the goal of providing a quality educational experience for the poor may have been laudable, it was based on flawed and racialized perceptions of low income families and children, many of whom were African American. Rather than focusing on the strengths and resilience of low income families and examining the structural causes of racial and socioeconomic inequality in American society, theories of mental health provided policy makers a means by which to turn poverty into an intra-psychic deficiency. This view of poverty led to funding priorities that privileged mental health and educational interventions, thus circumventing a discussion of the structural factors that create and perpetuate such disparities within American society. What’s Wrong with the Poor? examines the interrelations between mental health and public policy and asks how we can effectively combat poverty without pathologizing the poor.

Tuesday, November 19, 2013

Emily Mayhew is a Research Associate at Imperial College London and an examiner at the Imperial College School of Medicine. She is a consultant and lecturer to museums including the Wellcome Collection, the Imperial War Museum and the Royal College of Surgeons.

Anyone reading page 99 of Wounded: A New History of the Western Front in the First World War would find the material puzzlingly unrepresentative. There is a nurse, to be sure, but the page is mostly about other residents of the Field Hospital where she worked - a bandy-legged fox who stole from the hospital garbage, a posh French hunting dog rescued from No Man’s Land by the chief surgeon - and the social activities that took place when the Field Hospital wasn't receiving battle casualties. The animals patrolled the food stores and kept the rats down, and were used as therapy in the wards by the nurses, who encouraged patients to get back on their feet by taking them for walks in the nearby woods. Half the page is taken up with a description of a game that was popular among the medical staff: a paper chase, also known as Hare and Hounds, where a trail of paper shreds was laid in the nearby countryside and then one member of staff was chased by the others. The first to arrive back at the Field Hospital was the winner, either a Hare or a Hound. Not much war there - but there are hints. The nurse, Winifred Kenyon, has mixed feelings about encouraging her patients to walk the animals. If they are strong enough to walk, then soon they will be strong enough to go back to their battalions and fight again. It is the patients who lie in bed tearing up old newspapers to make the trail for the paper chase game, testifying to the strong social bonds they have with their carers. Nurses and orderlies move them out of their wards to watch the game in the sunshine, cheer the runners on and keep a surreptitious book on likely winners. The Field Hospital, closer to the front line than it is to any of the towns or military camps to the rear, is a world of its own, self-sustaining and kept both medically and socially effective by its nurses.
Closer to the war than any women before them, the nurses developed very particular expertise and skills to respond to the nightmare of casualty on the Western Front. From complex resuscitation techniques, to the care of the dying, the breaking of bad news to soldiers with limb loss or other serious wounds, to the vital social life that kept everyone together and sane, all of it was dependent on the nurses and their dedication to their patients and their colleagues in an extraordinary time of war and sacrifice.

Monday, November 18, 2013

Stephen M. Kosslyn was a cognitive neuroscientist and professor of psychology at Harvard University for over 30 years and now serves as the founding dean of the Minerva Schools at the Keck Graduate Institute. G. Wayne Miller is an author, filmmaker and Providence Journal Staff Writer.

On page 99 we examine a question readers may have as they learn about our new Theory of Cognitive Modes: What determines a person’s dominant cognitive mode? Is it genes or experience – nature or nurture – or a combination?

Some background: Our theory of how people prefer to interact with the world and with others is based on the anatomical division of the brain into its top and bottom parts. Why not left and right, the popular view that individuals are either analytical/logical or artistic/intuitive? Because that pop-culture story has no solid basis in science. In Top Brain, Bottom Brain, we debunk this myth as background introduction to our theory.

Our Theory of Cognitive Modes, based on decades of solid research that until now has remained largely inside scientific circles, states that the anatomical division of the brain into top and bottom parts provides a better foundation for understanding thought and behavior.

The top part is involved in setting up plans, controlling movements, registering changes in where objects are located in space, and detecting when expected events do not occur and updating plans accordingly. The bottom part is involved in classifying and interpreting what we perceive, and allows us to confer meaning on the world. We all use both parts of the brain, but we vary in the degree to which we tend to rely on each of the two brain systems for functions that are optional, that is, not dictated by the immediate situation.

Unlike the left/right story, our theory does not focus on one part or the other. Instead, it focuses on how the two parts interact – the brain is a single, interacting system.

Some people tend to rely heavily (in optional ways) on both brain systems, some rely heavily on the bottom brain system but less so on the top, some rely heavily on the top but less so on the bottom, and some don’t rely heavily on either system.

Which leads to four possible Modes: Mover, Perceiver, Stimulator and Adaptor.

You can determine your own preferred mode with a simple test in the book, and also online at www.TopBrainBottomBrain.com. We believe that learning about your preferred mode can help you not only better understand yourself but also help you understand your relationships. So, does nature or nurture determine a person’s dominant mode? As you might imagine, the answer lies in a combination of both.

The scene is a street in South Memphis, a mostly black section of the city, on the afternoon of May 1, 1866. Moments earlier, a shootout had erupted between four white policemen and a boisterous crowd of black men that the officers had tried to disperse. Two of the policemen have fallen, badly wounded; the others are fleeing.

The blacks are exultant, congratulating themselves for standing up to the despised, abusive Memphis police. But as the smoke clears and the excitement subsides, they begin to worry about the consequences. Some decide to lie low for a while.

They are the lucky ones. An hour and a half (and three pages) later—word of the shootout having spread throughout the city—mobs of armed white men descend on South Memphis and begin shooting and beating every black person in sight.

Thus began the Memphis race riot. It stretched over three days, during which white mobs repeatedly invaded black neighborhoods wreaking death and destruction. The toll was horrific: 46 black men, women, and children murdered, many others wounded, robbed, or raped, and every black church and school and many black dwellings in the city put to the torch.

My account of the riot—a minute-by-minute narrative written in the present tense, the better to convey the drama, the kaleidoscopic rush of events, and the sheer horror of those three days—comprises the second of the book’s three parts. The first, written in the customary past tense, is a detailed portrait of Memphis’s inhabitants on the eve of the riot, not only the blacks and ex-Confederates but also the many Irish and Yankee immigrants. It shows how the newly emancipated people reveled in their freedom while Rebel and Irish resentment toward them and the Yankees festered. The third part, likewise in the past tense, explores the aftermath. The riot was a key event of the post-Civil War era. It outraged Northerners and encouraged Congress to come down hard on the white South, thus helping usher in the controversial era of Radical Reconstruction.

The riot was not only one of 19th-century America’s most significant episodes but also one of the best documented. Three federal agencies investigated it, gathering vivid testimony from hundreds of eyewitnesses. Few other events of that era can be so meticulously recounted. Mine is the first book-length study of it.

The first piece of good news about page 99 of America Is Elsewhere is that it is not blank—as, for example, page 106 is. Close call, there. If this had been the Page 106 Test I might have been feeling depressed, since the test would seem to suggest that the book is meaningless or incomprehensible. As it is, I can put that worry aside, at least until the reviews come out.

In fact, page 99 is not a bad one to turn to. The big argument of the book is that the films and novels that make up the noir tradition—classic noirs as well as later developments like conspiracy stories or cyberpunk—are always responding in some way to the rise of consumer culture in America, which really gets going in the postwar forties. By page 99, I’m talking about how classic noirs featuring hard-boiled detectives always make connections between local crimes, like murders and robberies, and the larger social systems within which those crimes take place. Here’s what I say on that page:

The hard-boiled detective novel generally begins with a crime; as the detective pursues the investigation he finds that that seemingly isolated crime expands outward, involving larger and larger spheres of social power, until ultimately the distinction between crime and the functioning of society disappears. The detective is left in an ambivalent position, and the story draws our attention to his limited sphere of enforcement. He can solve the local crime in a local way, but he cannot solve the problems of society and political economy that enable the crime at a further remove. He can only act as social critic, pointing to the powerful politicians and businessmen, the Mr. Bigs and Harlan Potters, who bear social responsibility.

So what this leads to, in these books and films, is an attempt at a critique of American capitalism itself, but one that still has to present that critique as an indictment of bad individuals. I go on to argue that in the sixties this critique gets taken one step further as noir evolves into the conspiracy narrative. In those texts—Pynchon’s novels, for instance, or the paranoid films of the seventies—the detective can no longer solve the crime at all. The conspiracy of multinational capital is too large and complex to be understood by the individual, so the noir investigation fails. But that failure still works as an example of noir political critique, since it suggests the presence of the large political and economic forces that it can’t actually represent, when the protagonist runs into a wall as blank and forbidding, in its way, as page 106.

I am particularly pleased to be dealing with page 99 of my book, since it contains one of the most pithy — if also one of the most derisive — descriptions of its subject: ‘“What’s a saint?” ask the devils in Cardinal Newman’s poem The Dream of Gerontius, and answer with glee, “A bundle of bones, which fools adore.”’ Well, the book certainly does not assume that only fools adore saints, but it does start from the assumption that a saint is indeed “a bundle of bones”. The title of the book is a question asked by the great Christian thinker, St. Augustine, as he pondered the miracles worked by the saints. For hundreds of years, and in some places to this day, Christians revered the physical remains of the holy dead. As p. 99 puts it, ‘The corpse was a source not of pollution but of supernatural power. When Elizabeth of Thuringia’s body was laid out after her death in 1231, crowds assembled and cut away portions not only of her clothing, but also of her hair, her nails, even her nipples, which “they preserved for themselves as relics”.’ The book attempts to make sense of these beliefs and this behaviour.

Wednesday, November 13, 2013

Molly Worthen is Assistant Professor of History at the University of North Carolina at Chapel Hill. She is the author of The Man on Whom Nothing Was Lost: The Grand Strategy of Charles Hill and is a regular contributor to the New York Times, Slate, Christianity Today, and other publications.

The pressures on faculty and administrators are mounting. Money is tighter than ever. Students won’t enroll without a guarantee of a good job after they graduate. Peer institutions are jostling for applicants and funding, even if it means embracing “innovation” and “creative disruption” with only a vague notion of what these changes might bring.

I’m talking about fundamentalist Bible schools, circa 1947.

The Page 99 test lands on the opening of Chapter 5, “The Marks of Campus Conversion,” and plunges us into this world. Most outsiders think of fundamentalism as an isolated subculture, a Christian fortress secure against the sins of the world. Don’t Bible colleges exist to protect young Christians from the predations of modernity? Don’t they reject innovation in favor of old-time religion? After spending the past couple of years exploring the archives of these institutions, I learned that these stereotypes could not be more wrong: fundamentalists and evangelicals are no strangers to the pressure to change in response to the marketplace, and their relationship to secular learning is complicated, to say the least.

On page 99 we meet Sam Sutherland, president of Biola College, a bastion of fundamentalism outside Los Angeles. He criticizes more liberal-minded Christians who confuse “their egghead enterprises with Bible-based revival.” Yet Sutherland himself was hardly an academic slouch—he arrived at Biola with degrees from Occidental College and Princeton Theological Seminary. In 1948 he complained that the world had gone, “I would dare to say, educationally berserk.” But only two years later we find him crowing to colleagues about the fleet of new PhDs on his faculty. Sutherland’s ambivalence toward higher learning confirms the central argument of my book: conservative evangelicals are torn between sincere respect for human reason and academic achievement, on one hand—and on the other, their deference to a cripplingly narrow understanding of scripture that clashes with the spirit and conclusions of secular inquiry. They are caught between conflicting sources of intellectual authority. This crisis of authority is nothing new: it is as old as evangelicalism’s origins in the years after the Protestant Reformation.

But Sutherland’s savvy transformation of his college—once a fundamentalist Bible institute; today, a thriving Christian university—proves that if this crisis of authority is the scourge of the evangelical mind, it is also a kind of genius: a never-ending balancing act, a source of anxiety and energy that propels evangelicals to constantly revise their relationship to their own traditions and to mainstream culture.

"Open your book to page ninety-nine and…the quality of the whole will be revealed to you."

Please.

Funny thing, when I opened my book to page ninety-nine I found that it featured a central theoretical claim in the work and that it offered a distillation of the book's unorthodox methodology. In this case, both the claim and the method relate to a still image from the 1970 movie Watermelon Man (the image itself is on page 100). I describe my first encounter with the image this way:

I do not remember if my brother called me into the family room to watch another seminal moment in the era's cinematic cultural race battles, but I recall being there all the same. Previously, at my brother's invitation, I had seen John Shaft give the white man the finger and, in doing so, create entirely new ways to read the black body. My only memory of this second moment—of the entire film, in fact—is an image of a black man sitting in a bathtub filled with gallons upon gallons of milk.

The person in question is dark-skinned actor Godfrey Cambridge. In Watermelon Man—"a social commentary clothed in a comedy built on a fantastical premise"—Cambridge plays Jeff Gerber, a bigoted insurance salesman who wakes up one morning to discover that he has turned black, very black. (Cambridge is in white face for the first fifteen minutes of the film.) This milk bath is one of Gerber's last desperate efforts to become white again.

This page is from a chapter that explores how the black body—fingers, arms, skin—serves as a repository of traumatic memory. Also on this page I am writing in the first person, a voice that is interspersed throughout the book to highlight how official narratives of the past (history) collide with personal narratives (memory). In Jim Crow Wisdom I examine the consequences of these collisions.

On page ninety-nine we bear witness to Gerber's trauma when he discovers his sense of self did not align with others' assessment of how his new, black body would be allowed to navigate the waters of mainstream America. By this point in the film, moviegoers could see that Gerber was sliding toward madness as he realized that he lacked control over the process of narration.

My book explores black memory from many angles—popular literature, social science, dance, the built environment, memoir, and film—and asks critical questions about who gets to write and claim those memories and what's at stake in the process. For Jeff Gerber, the story his body told others (and the presumptions of black memories that were subsumed in his skin) revealed the traumas of a second-class citizenship. For others, people who were black their entire lives, the consequences of these conflicting narratives were so patently obvious and negative that they could not afford the luxury even of the mistaken hope that bathtubs filled with milk could solve their problems.

Monday, November 11, 2013

Derek J. Penslar is the Samuel Zacks Professor of Jewish History at the University of Toronto and the Stanley Lewis Professor of Israel Studies at the University of Oxford. His many books include Shylock's Children: Economics and Jewish Identity in Modern Europe, Israel in History: The Jewish State in Comparative Perspective, and The Origins of Israel, 1882-1948: A Documentary History.

On Page 99 of Jews and the Military I write of the effect of the Dreyfus Affair on Jewish career officers in France:

Frustrated and aggrieved Jewish officers became increasingly wont to challenge their bigoted comrades to duels, at times fatal. Many Jews resigned their commissions out of fear that their career advancement would be stifled.... Not surprisingly, at the end of the century the numbers of Jews entering the [military-technical] Ecole Polytechnique and [military academy] Saint-Cyr plummeted.

Nonetheless, even at the height of the affair Jews continued to graduate from the Ecole Superieure de Guerre and receive promotions from lieutenant to captain, battalion chief, lieutenant colonel, colonel and general. Jewish officers who died in uniform continued to receive elaborate and respectful military funerals. At the turn of the century, as the forces of republicanism reasserted themselves, Catholic and monarchist officers, not Jews, were the targets of investigation by the War Ministry…. Jews continued to flow into the Ecole Polytechnique, albeit at reduced numbers, leading to the training of a whole new generation of Jewish officers who would take command during and after World War I.

These passages speak to the heart of my book’s argument: that Jews in the modern world have often been willing, even eager, to serve in the military. In countries such as France and Italy where the officer corps was available to Jews, they sought it out. In eastern Europe Jews were frequently persecuted, and so their attitude to the state was more hostile than in the west, but even here most Jews dutifully performed their military service, and only a small minority were draft dodgers. For Jewish men in any country, military service presented an opportunity to display masculine valour. Jews worried lest their co-religionists face each other in battle, but they celebrated the virility, bravery, and above all the patriotic spirit of their men in uniform.

The excerpt is from a chapter on Jews as career military officers in Europe and North America. Other chapters in the book deal with soldiering and warfare in pre-modern Jewish civilization; the relationship between conscription and emancipation; the tension between patriotism and trans-national solidarity when Jews fought in wars in the 1800s; Jewish soldiers in World Wars I and II; and the role of diaspora Jews as volunteers in the fight for Israeli statehood in 1948.

Sunday, November 10, 2013

J.J. Carney is Assistant Professor of Theology at Creighton University, Omaha, Nebraska. His research and teaching interests engage the theological and historical dimensions of the Catholic experience in modern Africa.

To be honest, I had never heard of Ford Madox Ford's page 99 test, but it was surprisingly accurate for Rwanda Before the Genocide. Namely, page 99 brings you right into the heart of the long-running controversy surrounding Catholic leaders, ethnic discourse, and Rwandan politics. You encounter Mgr. Andre Perraudin, a Swiss Catholic missionary bishop and future bete noire for the Tutsi exile community. More importantly, you encounter Perraudin's most famous writing, his pastoral letter "Super Omnia Caritas" released in February 1959. This pastoral letter explicitly and exclusively linked late colonial social injustice with Hutu and Tutsi categories, establishing Perraudin's reputation as a pro-Hutu partisan. Perraudin later called this statement the "charter of my episcopate" even as he denied that it contributed to the political climate that helped precipitate revolutionary violence in November 1959. In showing the analytical difference between Perraudin and Rwanda's other Catholic bishop at the time, Mgr. Aloys Bigirumwami, page 99 reminds us that social description lay at the heart of Rwanda's late colonial disputes. Namely, the church may stand for "justice for the poor," but the key issue is how the church understands and defines "justice" and "the poor." In Rwanda many political and religious leaders chose to define "the poor" as "Hutu," helping to establish a dangerous ideological justification for revolutionary anti-Tutsi violence. Such discourse helps to explain why many (if not all) Catholic leaders stood by Rwanda's post-colonial Hutu governments even in the midst of major anti-Tutsi massacres in 1964, 1973, and the early 1990s.

Saturday, November 9, 2013

Caroline Vout is Senior Lecturer in Classics at the University of Cambridge and a Fellow of Christ’s College. In 2008 she was awarded the prestigious Philip Leverhulme Prize for Art History. She is the author of Power and Eroticism in Imperial Rome and Antinous: the Face of the Antique, which won the inaugural Art Book Award.

Page 99 is, appropriately, not words but image -- for 'reading' images is what Sex on Show: Seeing the Erotic in Greece and Rome is all about. The picture captures what remains of an ancient brothel at Pompeii, an erotic painting above the door giving a glimpse of the kinds of things that happened on the stone bed inside. It is an unusual image as far as the book is concerned in pointing so directly to bedroom gymnastics. Why? Because most of its 200 pictures reveal less about the realities of Greek and Roman sex than about fantasies, anxieties, propriety… Many of these pictures (for example those of Athenian pots decorated with scenes of athletics, or statues in the shape of beautiful gods) offer subtler invitations to think about the temptations and constraints that affected ancient bodies. They show objects used in elite drinking parties or temples, some of them rediscovered by Hellenists in the eighteenth and nineteenth centuries. By understanding their visual mechanics, Sex on Show throws new light on ancient sex and gender, cultural identity, religion, as well as modern collecting practices.

A single page, with its atypical image, inevitably struggles to encapsulate the many aspects of our strange relationship with the ancients as seen through the lens of sex. But its failure to do so captures something of the flavour of the book’s subheading and the power not of sex but the erotic. Unlike the consummation that comes from sex, the erotic is always wanting. It is only apt that page 99 should offer but the merest, even potentially distorting taste of the whole, because desire is always veiled.

Friday, November 8, 2013

Lara Deeb is associate professor of anthropology at Scripps College and the author of An Enchanted Modern. Mona Harb is associate professor of urban studies and politics at the American University of Beirut and the author of Le Hezbollah à Beyrouth.

We were pleasantly surprised to find that page 99 of our book captures many of its key themes. Leisurely Islam is about many things: how and why cafes and restaurants boomed in south Beirut in the twenty-first century; the entrepreneurs who opened these businesses; the usually unsuccessful efforts of the Islamic political party Hizbullah to control this new leisure sector; the views of religious leaders on common cafe practices; the aesthetics of cafe interiors; and, as you might guess from the book title, the ways that pious youth navigate both ideas about morality and the complex sectarian geography of Beirut as they piece together ways to have fun with a clear conscience.

Page 99 conveys a sense of the tensions among the three key agents involved in producing and controlling leisure places in south Beirut and their clientele: Hizbullah, Shi‘ite jurisprudents, and cafe owners. The page highlights the late popular religious leader Sayyid Fadlallah’s view that “seeking a rapprochement between jurisprudence and the contemporary world can also be understood to suggest that jurisprudence should adapt to the social world, and thereby facilitate the needs and desires of contemporary youths.” This is an accurate summation of his opinions, which are so important in this community. Page 99 also notes that cafe owners try to “act as self-proclaimed paternalistic authorities, monitoring and regulating” youths’ behavior. But perhaps the most important sentence on the page is: “But youths are not passive recipients and can be quite vocal in contesting authorities that interfere with their lifestyle choices.” This sentence encapsulates one of our key arguments in the book: that pious young Muslims are working out their own ways of living a good and moral life, ways that include both abiding by (their own interpretations of) religious and social values and tenets and having fun with their friends. This really isn’t an earth-shattering idea, but unfortunately in the U.S. today (and beyond), pious Muslims are often described in one-dimensional terms, so that readers and listeners are left with the impression that religion is the only important thing in their lives. One of the reasons we wanted to write this book was to counter that assumption.

What is missing from page 99 is our argument about how new practices of leisure are affecting the geography of Beirut as a whole. The page gives the reader a glimpse of that idea when it says “Cafes and restaurants in Dahiya are providing youths with many social and spatial opportunities close to home,” but that barely begins to scratch the surface of how urban space is being negotiated and imagined by pious young people as they seek out different places to have fun.

Thursday, November 7, 2013

David Hendy is a fellow of the Royal Historical Society and Professor of Media and Communications at the University of Sussex. He has been a visiting research fellow at Wolfson College and the Centre for Research in Arts, Social Sciences, and Humanities (CRASSH) at the University of Cambridge; Marjorie G. Wynne Visiting Research Fellow in British Literature at the Beinecke Library, Yale University (2010); and Helm Fellow at the Lilly Library, Indiana University, Bloomington (2010). In 2011 he was awarded the James W. Carey Award for Outstanding Journalism by the Media Ecology Association of North America for his five-part BBC Radio 3 series, Rewiring the Mind. He worked as a journalist and producer at the BBC. His book Life on Air: A History of Radio Four (2007) won the Longman-History Today Book of the Year Award in 2008.

We’re in the world of bells – surely among the most iconic sound-making devices in the whole of human history. The earliest evidence for their use comes from China. But on page 99, we’re in the Catacombs of Priscilla, one of the less well-trodden networks of chilly underground passages and chambers hidden under the streets of Rome. Some 1900 years ago, it was lined with bodies – a quiet burial place for the ancient city’s earliest Christians. The Catacombs are silent now. But even back then, the only sounds would have come from simple funeral rites – accompanied, most likely, by the faint tinkling of small bells or ‘tintinnabula’.

A rather mute choice of location, then, for a history of sound. But even the modest, long dissipated chimes of the tintinnabula are a vital link in my unfolding story, which is about power: the remarkable power of sounds to shape our emotions and our social worlds over the past 100,000 years. Bells themselves had an extraordinary, sacred power. Their sound was seen as a ‘manifestation of universal essence’, helping to drive away those multitudinous evil spirits lurking all around. In fact, wherever they’ve been rung, bells have supposedly created a Godly aura of safety.

There’s another reason for being in the Catacombs. Some of the rooms here are stuffed full of inscriptions and frescoes: visual clues for lost soundscapes - scenes of banqueting and ceremonies, weddings and prayers - scenes that look Christian, but also perhaps Pagan, Greek, Jewish. In other words, scenes that conjure for us a turbulent melting pot of beliefs and rituals. What they show, too, is that whatever their precise religious persuasion, these early Christians appeared to believe that if sound really was an effective means of repelling troubling spirits, or attracting the attention of the Divine, there was no better way to guarantee access to the holy than by making a bit of racket: dancing, speaking in tongues, creating – quite literally – good vibrations.

The bishops of later centuries would soon put a stop to all this – as later pages of the book will show. But for a brief moment in time, religion has a delicious whiff of Dionysus about it. It remains (to quote the chapter’s title) an ‘ecstatic underground’: a dazzling kaleidoscope of sounds and songs somehow making their presence felt in the damp, still darkness below Rome’s noisy Via Salaria.

Novelty: A History of the New is an historical study of the ways in which people have modeled a quality that is both fundamental and elusive. Common sense and basic philosophy both indicate that even the most astonishing novelty must have come from something and in that sense cannot be absolutely new. In practice, it is easy to debunk the claims of any innovation by showing that it has been preceded or anticipated, and this might lead to a strict judgment that there is never anything new. Science and philosophy are sometimes just this strict, but novelty in common parlance has never meant the eruption from a void of something utterly unprecedented. Some of our most basic models of novelty, such as revolution or renaissance, are cyclical and rely partly on nature and partly on a Christian pattern of return to a better state. Some others, particularly evolution, are based on a pattern of recombination, in which preexisting elements become new when put in a new relation to one another.

Information theory, which is the subject of discussion on page 99, is one of these models. Information theory began as a modest attempt to establish a method for determining the most efficient way of sending signals through a medium, given the inevitable corruption introduced by some sort of interference. From these beginnings, it expanded its influence so that it became the common way of understanding genetic transmission, and it now has such widespread prestige that it is considered by some scientists and philosophers the best way of understanding the universe itself. Thus it is fair to say that the mathematical model behind information theory, which measures the extent to which a set of symbols can generate new combinations, is the dominant model of novelty at the present time.
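For readers curious about the measure itself: the quantity Shannon's theory assigns to a source of symbols is its entropy, the average "surprise" per symbol, which is greatest when every combination is equally possible and zero when the source can produce nothing new. The book contains no code, so the sketch below is merely an outside illustration of the standard formula (the function name is my own):

```python
from collections import Counter
from math import log2

def shannon_entropy(symbols):
    """Shannon entropy in bits per symbol: the standard measure of a
    source's capacity to generate new, unpredictable combinations."""
    counts = Counter(symbols)
    total = len(symbols)
    # Each symbol contributes p * log2(1/p), its probability-weighted surprise.
    return sum((n / total) * log2(total / n) for n in counts.values())

# A source that only ever repeats one symbol yields no novelty at all,
print(shannon_entropy("aaaa"))  # 0.0 bits
# while four equally likely symbols yield the maximum of 2 bits each.
print(shannon_entropy("abcd"))  # 2.0 bits
```

The contrast mirrors the chapter's point: the formula quantifies how much a system *could* vary, while saying nothing about whether any given combination feels new to a human receiver.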

Unfortunately, it seems that measuring the new and defining it are two different things. According to the argument on page 99, the terms used to define the quality that information theory measures, terms like uncertainty or choice, are fatefully ambiguous in that they describe a mathematical quantity in quasi-psychological terms. The subjective condition of a human receiver is ascribed to the purely objective capacity of the system to generate new combinations. Thus it remains unclear whether the novelty defined by information theory is a basic feature of things in themselves or merely an effect of our limited condition as human beings.

Tuesday, November 5, 2013

Hooman Majd was born in Tehran, Iran in 1957, and lived abroad from infancy with his family who were in the diplomatic service. He attended boarding school in England and college in the United States, and stayed in the U.S. after the Islamic Revolution of 1979.

Majd had a long career in the entertainment business before devoting himself to writing and journalism full-time. He worked at Island Records and Polygram Records for many years, with a diverse group of artists, and was head of film and music at Palm Pictures, where he produced The Cup and James Toback's Black and White.

His books include The Ayatollah Begs to Differ (2008) and The Ayatollahs’ Democracy (2010).

Page 99 is probably for many readers somewhat representative of the story my book tells---it is subtitled "An American Family in Iran," after all---and deals with what almost any parent, anywhere in the world, is most concerned with: the health of their child. A regular visit to the pediatrician, a normal and necessary occurrence at home, is viewed somewhat differently when one is abroad, but that is just one small detail, revealing as it might be, of a culture both alien and sometimes familiar.

The book is nominally about exactly that: a family, thoroughly Western, uprooted and living in a country most Westerners know little, or nothing, about. But unlike Westerners living in the East, I am bi-cultural, and although I never experienced living in Iran as an adult (not even as a teenager), my closeness to the culture---along with my spouse's and son's distance from it---is what made me want to experience life among the Persians and to write about Iran from a different perspective. While page 99 is representative of the book in that it is a descriptive entry, the book, at least to me, is much more about the concept of 'home', and what it means to a bi-cultural person. And, of course, it is about my people, and not just the "American family."

Monday, November 4, 2013

Robert Klara is the author of the critically acclaimed 2010 book FDR's Funeral Train, which historian and author Douglas Brinkley called “a major new contribution to U.S. history.” Klara has been a staff editor for several magazines including Adweek, Town & Country and Architecture. His freelance work has appeared in the New York Times, the New York Daily News, American Heritage, and The Christian Science Monitor, among other publications.

I have to thank Ford Madox Ford for his curious maxim; when I turn to page 99 of my book, I find that I could hardly ask for a better representative page for this narrative—with an important qualifier.

I’ll explain the qualifier first. The Hidden White House is a book that details one of the most ambitious and dangerous construction projects in modern American history: the U.S. government’s complete gutting and rebuilding of the White House between 1949 and 1952. In this strict sense, page 99 isn’t much of a big-picture summary. However, my conviction has always been that any story about a structure, however iconic or historic, is necessarily the story of the men and women who struggled to build it. In the sense, then, that my book about the White House is really an intimate visit with the people who worked and fought behind its storied sandstone walls, page 99 nails it.

Page 99 of The Hidden White House introduces a man named Clarence Cannon, a conservative Missouri congressman who, as head of the all-powerful House Appropriations Committee, was one of the most tenacious budget hawks our government has ever seen. Notorious for punching out Congressmen who disagreed with him, Cannon took the position that almost all government spending was suspect or outright wasteful. So when Cannon got wind that rescuing the dangerously frail White House would cost $5.4 million, he vowed not to permit the effort a penny.

The fact that the White House had already been evacuated of the Truman family; the fact that it was in danger of imminent collapse (owing to decades of neglect and dangerous overloading of the mansion’s old wooden beams)—neither moved Cannon in the least. In fact, as page 99 details, Cannon’s only response was to demand that the White House be demolished and a new one—a cheaper one—built to replace it.

Very fortunately, Rep. Cannon would not succeed in his quest to see bulldozers knock in the mansion’s historic walls. But page 99 introduces Cannon and his cold-hearted threat—one that touched off a national debate, and one that very nearly cost the United States its most important landmark.

Saturday, November 2, 2013

Adam D. Shprintzen received his PhD in History with distinction from Loyola University Chicago in May 2011, where his studies focused on nineteenth century America. Currently, he serves as Digital and Archival Historian (see, Digital Encyclopedia of George Washington) at Mount Vernon, where he manages digital history projects as well as the institution's archival holdings.

Cultural history serves as an important avenue of investigation in my new book, The Vegetarian Crusade. To understand the formation of the vegetarian movement in the United States in the nineteenth century, it is necessary to analyze both how vegetarians viewed themselves and how the group was visualized by society at large. Both factors shaped the development of the movement. Page 99 nicely summarizes popular views of vegetarians by the start of the Civil War, when the group was attacked as being both physically and ideologically weak. The rise of vegetarianism in the era directly before the Civil War (the American Vegetarian Society, the first national vegetarian organization, was formed in 1850) was noticed in many corners of American culture and society, celebrated by reformers and mocked by the mainstream press. The growing vegetarian movement even grabbed the attention of the American music industry.

Oh! Wasn't she fond of her greens! was a composition first published in 1860 by the New York firm H. De Marsan, and subsequent versions were in circulation until at least 1869. The song relays a story of courtship from the perspective of the male suitor. The narrator seeks the affections of a young woman named Jane Bell, who surpassed all other women in the suitor's eyes. Unfortunately, he also came to find out that she was a vegetarian. The song notes the peculiarity of the courtship, filled with picnics of garden sorrel and breakfasts of watercress. The last verse exposes the complex relationship between vegetarianism and gender roles:

Now, we are married, and settled in life,
The old gal behaves very kind;
And, when I go home of a night,
There plenty of greens I can find.
Since marriage, she's taken to meat...
How wonderful strange it seems!
And sometimes, by the way of a treat,
She has a little fat meat with her greens.

The song is clear in its implications. Once the woman was tamed and settled, she no longer adhered strictly to her vegetarian diet, long associated with radical politics including women's rights and suffrage. Now a respectable married woman, the "old gal behaves very kind," which includes dissociating herself from her political past.

Vegetarians faced harsh attacks during this time period, though this would not always remain the case. By the late 1890s vegetarianism was embraced by mainstream society. How and why did this shift occur? The Vegetarian Crusade attempts to answer this question.

Friday, November 1, 2013

Richard S. Grossman is Professor of Economics at Wesleyan University and a Visiting Scholar at the Institute for Quantitative Social Science at Harvard University. He is the author of Unsettled Account: The Evolution of Banking in the Industrialized World since 1800.

Wrong tells the story of nine significant economic policy blunders from the last two centuries, including why each was adopted, how it was implemented, and its short- and long-term consequences. A main conclusion of Wrong is that policy goes horribly wrong when it is based on ideology rather than cold, hard economic analysis. For example, Wrong looks at how America's unfounded fear of a centralized monetary authority caused it to reject two central banks, condemning the nation to wave after wave of financial panics during the nineteenth century. It also describes how Britain's blind commitment to free markets, rather than to assisting the starving in Ireland, led to one of the nineteenth century's worst humanitarian tragedies--the Irish famine.

Page 99 comes at the end of the chapter that analyzes Britain’s return to the gold standard in 1925. Before 1914, Britain had been on the gold standard for much of the preceding two hundred years, and had been on it consistently for almost one hundred. World War I and the accompanying risk of gold-carrying ships being sunk by the enemy made it impossible for gold to be shipped internationally—effectively ending the gold standard.

Only a few countries reestablished the gold standard after the end of World War I, but that trickle became a flood after Britain returned in 1925. To the British, the gold standard conjured up an era during which they were the world’s dominant political, military, and economic power. Yet the idea of the gold standard was more appealing than the reality that emerged in the interwar period: exchange rates were misaligned, gold holdings were inadequate and poorly distributed, and the equally unappealing alternatives of leaving the gold standard or maintaining it at the cost of domestic economic prosperity put central bankers in an impossible position.

Britain was not the first country to return to the gold standard after World War I. However, because it had played such an important role in the pre-war gold standard, its return provided the signal for many other countries to reestablish the gold standard as well. The widespread return to gold played an important role in propagating and intensifying the Great Depression, as well as preventing policy makers from undertaking steps to counteract the downturn. Had Britain not been ideologically committed to gold, the interwar period might have been quite different.