Let the world see

When Emmett Till’s body arrived at the Illinois Central train station in Chicago on 2 September 1955, the instructions from the authorities in Mississippi were clear: the casket containing the young boy must be buried unopened, intact and with the seal unbroken. Later that morning, Till’s mother, Mamie Till Bradley, instructed funeral home director Ahmed Rayner to defy this command. He objected, citing the promises he made to the state of Mississippi and the professional obligations they entailed. But again she insisted and, finally, against his better judgment, he agreed.

What Mamie Till Bradley saw next horrified her, but it also steeled her resolve and inspired one of the defining moments of the modern civil rights movement. When she told Rayner she wanted to hold an open-casket funeral for her son, he asked her to reconsider, but when she again insisted, he relented, offering to retouch the corpse to make it more presentable. “No,” she told him. “Let the world see what I’ve seen.”

Mamie Till Bradley’s decision that day—coupled with the photos of her son’s corpse that she allowed Jet magazine and the Chicago Defender to circulate throughout the black press—galvanized a generation of civil rights activists. In calling for the world “to see what I’ve seen” (or as she said in another context, “to let the people see what they did to my boy”), Mamie Till Bradley was not the first to challenge white supremacy by demanding that the nation bear witness to this ideology’s strange and necessary fruit: the wounded black body. Others before her, like the editors of Crisis during the early part of the twentieth century, understood that this witness exposed white supremacy for what it truly was: a hatred not simply dependent upon violence as an occasional tool, but one founded on violence and requiring violence for its very existence. In addition, such witness combatted the willful blindness of a nation that, because it did not want to confront the true nature of its own history, enabled this violence to flourish. What Mamie Till Bradley added to this protest tradition, and what in large part accounts for the unique place that Emmett Till’s lynching holds in our collective racial memory, is that she demanded the nation behold not just any wounded body, but a son’s body, and that it reckon with not just any pain, but a mother’s pain.

It must be remembered, though, that in the immediate aftermath of her son’s death the direct visual witness Mamie Till Bradley called for was limited largely to the African American community. Emmett Till’s open-casket funeral was attended mostly by African Americans, and photos of his battered and disfigured face were rarely seen by whites, and never in the white press, during the civil rights movement. Eventually, however, the larger nation did begin to see what Mamie Till Bradley saw on that horrific day in 1955. Starting with the 1987 documentary Eyes on the Prize (whose role in shaping Emmett Till’s legacy cannot be overestimated), images of her son’s corpse started to circulate more widely, both to a new generation of African Americans and, for the first time, to whites. Since then, a steady stream of scholarly studies and documentary films, as well as works of art and websites, have enabled these images to reach a larger and more diverse audience, and this belated and repeated witnessing, coupled with the courage that Mamie Till Bradley showed in 1955, has profoundly influenced the way much of our nation responds to racial violence. Because of what we’ve seen, we cannot ignore Emmett Till. Nor should we want to.

Whether it’s Trayvon Martin or Tamir Rice, Michael Brown or Philando Castile, Eric Garner or Alton Sterling, Mamie Till Bradley’s son haunts our understanding, and he does so largely because of what his mother demanded of us more than sixty years ago: that we bear witness. In a 2016 New Yorker essay on “The Power of Looking, from Emmett Till to Philando Castile,” Allyson Hobbs makes this connection directly and forcefully, in words worth quoting in full: “Mamie Till Bradley and Diamond Reynolds both shared their sorrow with the world. They asked onlookers to view the bodies of two black men and see a son, a brother, a boyfriend, a loved one. Looking is hard. It shakes us and haunts us, and it comes with responsibilities and risks. But, by allowing us all to look, Bradley and Reynolds offered us real opportunities for empathy. Bradley’s moral courage galvanized a generation of civil-rights activists. We have yet to see how far Reynolds’s bravery will take us.”

We hope one day such bravery will be unnecessary, but as we hope we must also see, and recently in Charlottesville another mother buried a child whose life was taken by a white supremacist, with a dozen more injured and a nation shaken. Although we wish we didn’t have to, we know what we must do. We must bear witness to this wound, and the wounds still to come. And we must do so with a faith as strong as Mamie Till Bradley’s, believing that this witness is worth the risk, that in seeing we will learn to see. This is Emmett Till’s legacy, a mother’s gift to her son and, eventually, to us all.

Image: Sign outside of Bryant’s Grocery in Money, Mississippi. Richard Apple, CC BY-SA 3.0 via Wikimedia Commons.

America’s darkest hour: a timeline of the My Lai Massacre

On the morning of 16 March 1968, soldiers from three platoons of Charlie Company entered a group of hamlets located in the Son Tinh district of South Vietnam on a search-and-destroy mission. Although no Viet Cong were present, the GIs proceeded to murder more than five hundred unarmed villagers. Referencing My Lai: Vietnam, 1968, and the Descent into Darkness, we’ve put together a timeline that marks the events following the mass killing now known as the My Lai Massacre.

Featured image credit: “Monument of the My Lai Massacre” by -JvL-. CC BY 2.0 via Wikimedia Commons.

William Dean Howells and the Gilded Age [excerpt]

Through his writing, novelist and critic William Dean Howells captured the political and social aftermath of the Civil War. Though his own involvement in politics remained limited, Howells focused on the lives of common people over the uncommon, the “successful” men whom Charles Francis Adams dismissed as “essentially unattractive and uninteresting.” In the following excerpt from The Republic For Which It Stands, acclaimed historian Richard White draws on the writings of William Dean Howells as he examines American Reconstruction and the Gilded Age.

Abraham Lincoln, the politician whose memory and legacy dominated the Gilded Age, died as this book begins, but he never really vanished. The novelist and critic William Dean Howells captured part of the reason when he reviewed John Hay’s and John Nicolay’s monumental biography of the president in 1890. Howells wrote that “if America means anything at all, it means the sufficiency of the common, the insufficiency of the uncommon.” Lincoln had come to be both the personification of the American common people and the nation’s greatest—and most uncommon—president. Howells thought it was the nation’s common people and common traits that most mattered.

Howells, famous then and largely forgotten since, knew most everyone, but he always remained detached. He watched, and he wrote. His interventions in politics remained minor. Howells was a Midwesterner, and this was the great age of the Midwest. Originally a committed liberal, he came to acknowledge liberalism’s failures and insufficiencies, and then struggled to imagine alternatives. He did so as a writer, and he and his fellow Realists created invaluable portraits of the age. In his confusion, his intelligence, and his honesty, he reminds us that for those living through the Gilded Age it was an astonishing and frightening period, full of great hopes as well as deep fears. When Howells cryptically embraces the common, it is worth listening to him. Understanding his judgment of the “sufficiency of the common, the insufficiency of the uncommon” provides a lens for assessing the Gilded Age.

“W. D. Howells” by Underwood & Underwood, featured in The North American Review. Public domain via Wikimedia Commons.

The Gilded Age produced uncommon men and women. They abound in this volume, but in Howells’s lifetime, and during the twentieth century, businessmen who amassed wealth on a scale never seen before in American history became the face of the period. Contemporary caricaturists and later historians named them the Robber Barons, but this, as well as their later incarnation as farsighted entrepreneurs, gave them too much credit. They never really mastered the age. When Howells wrote of “the insufficiency of the uncommon,” he probably had them in mind, seeing them as insufficient to the demands of the period for the same reasons as Charles Francis Adams, who had aspired to be one of them and then dismissed them in his Autobiography.

I have known tolerably well, a good many “successful” men—“big” financially—men famous during the last half-century, and a less interesting crowd I do not care to encounter. Not one that I have ever known would I care to meet again, either in this world or the next; nor is one of them associated in my mind with the idea of humor, thought or refinement. A set of mere money-getters and traders, they were essentially unattractive and uninteresting.

In a period that began with such exalted hopes and among a people so willing to proclaim their virtue as were Americans, sufficient seems damning with faint praise, but a sobered Howells writing in the midst of what seemed a prolonged economic, political, and social crisis expressed a restrained optimism. Howells did not romanticize the “common people.” The failure of Reconstruction in the South was, in part, their failure. They often at least consented to the corruption of democratic governance. And for most of the Gilded Age the “common people” questioned whether they really had much in common as race, religion, ethnicity, class, and gender divided the nation. Yet their actions transformed the country, even if they undertook perhaps the most consequential of these actions—the movement into wage labor—unwillingly and under duress.

In judging them sufficient, Howells settled down in between the dystopian and utopian fantasies that marked the age. Millions of ordinary Americans had remade the country with their work, their movements, their agitation, their tinkering, their broad and vernacular intellectualism that neither aspired to nor created a high culture, and even with their amusements. They had not succumbed to the long economic and social crisis that threatened to overwhelm the country. What they accomplished was sufficient. It was a foundation on which to build.

Howells and his contemporaries never escaped the great gravitational pull of the Civil War. The era began with the universal conviction that the Civil War was the watershed in the nation’s history and ended with the proposition that the white settlement of the West defined the national character. Changing the national story from the Civil War to the West amounted to an effort to escape the shadow of the Gilded Age’s vanished twin and evade the failure of Reconstruction. Rewriting the Civil War as a mere interruption of the national narrative of western expansion minimized the traumas and vestiges of the Civil War and downplayed the significance of the transformation of Gilded Age economy and society. But too much had changed, and too much blood had been spilled in the War, for such a simple story of continuity to be fully persuasive. The twin, never born, shadowed the Gilded Age. A vision of a country un-achieved lingered, and quarrels over what should come next remained unresolved.

Rebuilding New York City

In the weeks following the 9/11 terrorist attacks, New York City’s position as the center of the financial world came into question. Now, 16 years after the day that could have permanently changed the course of New York’s history, downtown Manhattan has rebuilt both its buildings and its standing. Lynne B. Sagalyn examines the economic impact of the World Trade Center’s fall and rise in the following excerpt from Power at Ground Zero.

New Yorkers everywhere found themselves at a loss for words that could capture the physical magnitude and emotional trauma of what had transpired, of what hitherto was unbelievable but replayed in continuous loops on every conceivable medium until the incredulity of what had happened became too painfully real: the giant twins, unmistakable in their towering iconic presence on the skyline, seemingly invulnerable, now lay in ruin, smoldering—a profound graveyard for some twenty-eight hundred civilians who perished inside. The losses were incalculable. Everyone seemed to be in a state of shock and disbelief.

There are moments in time when cities become natural experiments and the 9/11 terrorist attack on New York provided one of them. Did corporations and businesses need to be in New York? Was Wall Street finished as the capital of international finance? Would the immediate trauma or anxiety about possible future attacks cause city residents to leave, move to places where terrorism seemed less of a threat? Fears of firms leaving the city in a mass exodus, fears of residents fleeing, fears of tourists staying away, fears of the end of skyscraper development and, by extension, the very self of the city were paramount, and news headlines in the weeks after 9/11 messaged the doubts: “In Wounded Financial Center, Trying to Head Off Defections,” “Reaching for the Sky, and Finding a Limit; Tall Buildings Face New Doubt as Symbols of Vulnerability,” “When the Towers Collapsed, So Did Their Desire to Live Here.”

“It was a time of desperate loss, yet also a time of distinct possibility.”

Like the fiscal crisis of the mid-1970s, the 9/11 attack on the World Trade Center shook New Yorkers’ confidence in the future of their city. Uncertainties existed across the five boroughs and beyond. Was New York still the resilient city that had overcome so many post–World War II crises—deindustrialization, disinvestment and property abandonment, racial and ethnic change, white suburban flight, social and cultural conflict, and a near brush with bankruptcy? Based on well-founded and widespread fears prevalent at the time, no one was able to say for sure that the attack would not have a lasting negative economic impact on the city and the region. The city’s sense of invulnerability had been shattered, yet as historian Mike Wallace reminded readers in a special section of the Times that appeared within a week of the attack, “that sense always rested on a truncated reading of history. While the particular form of the attack was fiendishly novel, New York, over nearly four centuries, has repeatedly been the object of murderous intentions. Through a combination of luck and power, we have escaped many of the intended blows, but not all of them, and our forebears often feared that worse might yet befall them.”

On 9/11, however, New York City’s role as a symbolic target became too painfully apparent. The fantasies of urban destruction in popular culture, Wallace wrote, had been “horribly realized.” If the illusion of invulnerability had been shattered, not so the determination to rebound and reconstruct and emerge stronger and better than ever; that too was part of New York’s cultural history, part and parcel of its grit and ambition.

It was a time of desperate loss, yet also a time of distinct possibility. Tragedy delivered an exceptional opportunity and the promise of an extraordinary amount of dollars from Washington, D.C., with which to plan and rebuild lower Manhattan, its transportation and infrastructure, office inventory, housing supply, and open space amenities to match the new needs of the twenty-first century. At the same time, rebuilding could renew the district’s historic dynamic. Reinventing itself was part of lower Manhattan’s history. And since the Port Authority won the legal challenge to its development in November 1963, the original World Trade Center had shaped that history. Thirty-eight years later came the mission to rebuild these sixteen acres, a site emotionally raw and newly endowed with intense sensitivity for the families of those who died there, patriotic fervor for the nation at large, and profound meaning for the city’s future. History was in the making.

Featured image credit: “A cityscape from Liberty Island in New York City, New York” by Michael Barera, c. 2001. CC BY-SA 4.0 via Wikimedia Commons.

Allen Ginsberg and Ann Coulter walk into an auditorium…

Ann Coulter, a controversial right-wing author and commentator, was tentatively scheduled to speak at the University of California, Berkeley on 27 April until pre-speech protests turned into violent clashes, and her speech was canceled. In response, Coulter tweeted, “It’s sickening when a radical thuggish institution like Berkeley can so easily snuff out the cherished American right to free speech.”

Not long ago, left-wing institutions such as Berkeley, the home of the Free Speech Movement in the mid-sixties, were seen as bastions of free expression. At about the same time as Berkeley’s Free Speech Movement, North Carolina conservatives passed the Speaker Ban Law to prohibit Communists and other questionable figures from speaking on state campuses. Jesse Helms invoked the Speaker Ban Law in 1968 when he charged that Stokely Carmichael ought not to have been permitted to speak, citing “almost unanimous resentment among citizens of this state that Carmichael was given the respectability of a forum at the taxpayers’ expense — on the campuses of the University of North Carolina.”

The debate continues over allowing purportedly offensive speakers on campus. Gail Collins, an op-ed columnist at the New York Times, noted her longtime support for free speech on college campuses. As an undergraduate student at Marquette University in 1967 she protested in the dean’s office, demanding that Allen Ginsberg be allowed to read his poems on the conservative Catholic campus. Ginsberg, who had unleashed his blistering poem “Howl” as an attack on the status quo in a renowned reading in 1955 in San Francisco, was not permitted to speak at Marquette despite Collins’ efforts.

Politically, Ann Coulter and Allen Ginsberg reside in opposing universes. She is a conservative who was raised by an FBI agent father and admires Joseph McCarthy; he was raised by Jewish parents with communist sympathies and celebrated marijuana, homosexuality, and Buddhism. Theirs are not names we might expect to find yoked in a common cause, yet they have in their own ways pushed themselves to the leading edges of freedom of speech issues.

Jesse Helms promoted FCC regulations that prevented Ginsberg, who was born on 3 June 1926 and died in 1997, from reading “Howl” on the airwaves between 6:00 a.m. and 10:00 p.m., and that barred stations from broadcasting a recording of the poem during those hours. That ban remains in effect to this day. Although “Howl” presents a litany of social condemnations, the FCC’s objection technically would come down to a handful of sexual references, among them “cock and endless balls,” “who blew and were blown,” and so on. The fiercest transgression exploded in a phrase that was undoubtedly more shocking in the 1950s than it is now: “fucked in the ass.” When Ginsberg wrote those words into an early draft, he reckoned their presence would prevent the poem’s publication. He did assume, though, that he would be able to recite the poem. Ginsberg went on to specialize in live readings and performance; he complained in a 1994 interview that due to the FCC broadcast restrictions, he was effectively “banned from the marketplace of ideas in my own country.”

The print publication of “Howl” maintains legendary status in the history of free speech. Lawrence Ferlinghetti, co-owner of City Lights Books, published “Howl,” along with other Ginsberg poems, in 1956. After the first printing of Howl and Other Poems sold out quickly, Ferlinghetti negotiated a second printing via an inexpensive London printer. On 25 March 1957, U.S. Customs officers in San Francisco seized the shipment en route from the London printer, citing Section 305 of the Tariff Act of 1930, which bars “any obscene book” from importation. Ironically, the most shocking phrase did not appear in the printed book. The offensive words were replaced by asterisks: ****** in the ***. Although Customs released the book, city police officers arrested Ferlinghetti and his store clerk on 3 June 1957, for selling “obscenity.” Ferlinghetti maintained that Ginsberg’s ideas, more than his language, were responsible for the arrest.

The ensuing trial garnered coast-to-coast publicity. San Francisco Municipal Judge Clayton W. Horn admitted that although some members of society could deem certain words in “Howl” coarse and vulgar, for others these were simply everyday phrases. Since Horn was convinced of the poem’s social importance, he supported the notion that the First Amendment protected the expression of Ginsberg’s ideas, however unorthodox, controversial, and disturbing they might be. Ferlinghetti was free to sell the book, and Ginsberg, though reviled by many, emerged as a voice for freedom of expression and individuality.

For Ginsberg, the key to his poetry is neither the graphic language nor thematic matter per se, but simply straightforward honesty, a trait inspired by his reading of Walt Whitman. Speaking to a Playboy interviewer in 1969, Ginsberg complained, “[My] poems get misinterpreted as promotion of homosexuality. Actually, it’s more like promotion of frankness. . . . When a few people get frank about homosexuality in public, it breaks the ice; then anybody can be frank about anything. That’s socially useful.”

After the cancellation of Coulter’s Berkeley appearance, university professor and commentator Robert Reich shared a discussion with Coulter on ABC’s This Week. Reich stated that “if somebody says something that is offensive, well, that is not per se a violation of any kind of university norm.” Reich maintained that university students learn by talking with people whose views test their own. Ideally, he concluded, universities should host people with “views that some people find to be offensive.”

To many readers, evocation of outrage is the sole purpose of Coulter’s output; rather than promoting honesty via poetry, as Ginsberg aims to do, she provokes indignation in her guise as media-provocateur. Coulter may well owe her celebrity status to her outrageous statements, but even dissenters ought to concede that, like Ginsberg, she should be free to speak her mind. A particular venue is within its rights, though, to refuse to host her appearance. When Marquette refused to allow Ginsberg to appear on campus in 1967, he read his poems instead at nearby University of Wisconsin-Milwaukee, drawing an audience of thousands.

Image: Allen Ginsberg in 1979. Michiel Hendrycks, CC BY-SA 3.0 via Wikimedia Commons.

George Washington’s early love of literature [excerpt]

Unlike his contemporaries Thomas Jefferson, Benjamin Franklin, and Alexander Hamilton, George Washington isn’t remembered as an intellectual. But what he lacked in formal education, Washington made up for in enthusiasm for learning. His personal education began at an early age and continued throughout his adult life. In the following excerpt from George Washington: A Life in Books, historian Kevin J. Hayes gives insight into Washington’s early love of literature.

Cubbyholes are magical spaces. Store something in a cubbyhole, and it just might transform itself into something else. Well, that object doesn’t really change, but the times change, and so does the person who stored it there, either of which amounts to the same thing. Above the topmost bookshelves in his permanent library at Mount Vernon, George Washington had dozens of built-in cubbyholes, or pigeonholes, as they were called in his day, that is, before that word became a verb and took on pejorative connotations. In those dark compartments he stored many personal papers, some dating back to his adolescence. The manuscripts that Washington saved since his school days in the 1740s became something very different by the end of his life. They transformed themselves into a blueprint of his mind.

This set of miscellaneous manuscripts contains essential documents for reconstructing George Washington’s school days. Beyond the exercises that form the lion’s share of these early manuscripts, details of Washington’s education are sparse. In an unfinished biography, his aide-de-camp, personal secretary, friend, and confidant David Humphreys says that a “domestic tutor” took charge of Washington’s education.

Official Presidential portrait of George Washington by Gilbert Stuart, circa 1797. Public domain via Wikimedia Commons.

Humphreys’s precise language offers a good idea of the kind of teacher Washington had as a boy. Typically, affluent families alone could afford domestic tutors for their children. As the job title suggests, the domestic tutor was a live-in teacher who accepted room and board, along with a modest salary. Washington’s tutor taught grammar, logic, rhetoric, geometry and higher mathematics, geography, history, and additional “studies which are not improperly termed ‘the humanities.’” The name of Washington’s domestic tutor has escaped history. Considering the eclectic nature of the surviving school exercises, his editors suggest that Washington had several teachers.

Further evidence shows that at one point in his education Washington did attend school with other boys. Friend and fellow patriot George Mason mentioned to him a man named David Piper, whom he described as “my Neighbour and Your old School-fellow.” Like Washington, Piper would turn to surveying once he left school, becoming surveyor of roads for Fairfax County. He was also something of a bad boy. Piper was repeatedly brought to court on various civil and criminal matters. Together Washington and Piper could have attended school at the Lower Church of Washington Parish, Westmoreland County, where Mattox Creek enters the Potomac River, but there is no saying for sure. The story of Washington’s education is shrouded in mystery.

His school exercises indicate what he studied inside the classroom and out. They show him mastering many different subjects, learning what he would need to make his way through colonial Virginia whether that way took him down a deer track or up Duke of Gloucester Street. Some of the exercises are dated, revealing that this set of school papers as a whole ranges from 1743 to 1748, that is, from the year Washington turned eleven to the year he turned sixteen. Other evidence demonstrates that he continued his studies beyond the latest exercises in the manuscript collection. Altogether the exercises and the books Washington read during his school days reveal his early literary interests, his fascination with mathematics, and the genesis of his career as a surveyor.

Washington became curious about poetry in his youth, as two manuscript poems that survive with the school exercises reveal. When he first read these poems, he transcribed them to create personal copies he could reread whenever he wished. His copies reveal Washington’s ambition to excel in penmanship, and their texts shed light on his state of mind at the end of adolescence.

One is titled “On Christmas Day.” Given its subject, the poem’s imagery is predictable. It is filled with happy shepherds and hymn-singing angels watching over the newborn savior. The speaker of the poem is female; she ends by reminding herself to remember Christmas always.

The bottom corner of the page containing Washington’s transcription is torn, so the last two lines of verse are damaged. His source supplies the missing words:

Oh never let my Soul this Day forget,
But pay in graitfull praise her Annual Debt
To him, whom ’tis my Trust, I shall [adore]
When Time, and Sin, and Deat[h, shall be no more!]

The picture of young Washington that emerges from his copy of this poem is that of a boy confident in his religious beliefs but pleased to have another confirm them. This Christmas poem is akin to a Christmas present, something to treasure for itself and for what it symbolizes. Washington recorded neither his source nor the poet’s name, but “On Christmas Day” comes from the February 1743 issue of the Gentleman’s Magazine. The published poem is signed “Orinthia”—the pseudonym for Elizabeth Teft, a Lincolnshire poet. The transcription also reveals Washington’s fastidiousness. In the Gentleman’s Magazine, editor and publisher Edward Cave capitalized each line of the poem but avoided capitalizing other nouns. Copying the poem for himself, Washington capitalized the nouns Cave had left uncapitalized, giving the written text a more traditional look.

“True Happiness,” the other poem that survives among the manuscripts from Washington’s school days, also appeared in the Gentleman’s Magazine, though in a much earlier issue, that of February 1734. Taken together, the two poems create a vivid picture of Washington’s boyhood reading process. After enjoying “On Christmas Day” in a recent issue of the Gentleman’s Magazine, he wished to read more verse and searched whatever back issues were available. The magazines listed in his father’s estate inventory cannot be identified precisely, but they were most likely bound volumes of the Gentleman’s Magazine, one of the few English periodicals available at the time. Each issue of the Gentleman’s Magazine contains a poetry column. After he encountered “On Christmas Day,” it seems, Washington read a number of back issues until he found another poem he liked, which he transcribed onto the opposite side of the leaf containing “On Christmas Day.”

America’s forgotten war


You probably don’t know it, but we are now in the centennial year of United States entry into World War One. On 2 April 1917, President Woodrow Wilson addressed a joint session of Congress to ask for a declaration of war against Germany. Wilson had narrowly won re-election the year before by campaigning under the slogan “he kept us out of the war.” But by the following spring, after Germany resumed its unrestricted submarine campaign against North Atlantic shipping, Wilson felt he had little choice but to join the conflict that had, for two and a half years, wrought unprecedented carnage across Europe. In announcing his intent to take the US into the war with “no selfish ends to serve,” and “no feeling towards [the German people] but one of sympathy and friendship,” he framed the basic U.S. war aim as simple and magnanimous: “the world must be made safe for democracy.” With this, Wilson set a course for a U.S. foreign policy of liberal interventionism with a global range that continues to this day. Washington was transformed from a sleepy administrative town into the center of a complex and immensely powerful corporate and federal collaboration, the precursor to what Eisenhower would call the Cold War “military-industrial complex.” As the U.S. government drafted four million men, curtailed civil liberties and rigorously suppressed dissent, nationalized the railways, and ran a huge propaganda bureau, the relationship between ordinary Americans and their federal government was drastically changed, in many ways for good. Some 53,000 Americans died in battle, another 63,000 died of non-combat injuries or disease, and over 200,000 servicemen returned with a permanent disability. The racist and xenophobic energies unleashed by the war fed directly into the drastic immigration restrictions of the 1920s; the federal state it bequeathed served, in many ways, as the blueprint for the New Deal and the modern American welfare state. Yet there is a curious indifference to the legacy of this war in the U.S.

Even the website of the United States World War One Centennial Commission calls it “America’s forgotten war.” There is no national memorial to World War One on the National Mall. The highest-ranking government official to attend the official commemoration of U.S. entry to WWI at the National World War One Memorial in Kansas City was not Donald Trump, or even James Mattis, but Secretary of the Army Robert M. Speer. Things are very different in the rest of the Anglophone world; in the United Kingdom, schoolchildren find it hard to avoid studying the so-called “trench poets” of WWI, whose verse prime ministers count among their favorites. The installation “Blood Swept Lands and Seas of Red,” which transformed the Tower of London with nearly 900,000 ceramic poppies planted in the moat, was visited by an estimated five million people. In Canada and Australia, the battle of Vimy Ridge and the Gallipoli campaign, respectively, continue to be widely memorialized events with central places in their stories of national formation.

“Daddy, what did YOU do in the Great War?”, by Savile Lumley. Public Domain via Wikimedia Commons.

So, why is there this gap between the war’s significance to U.S. history and the place it has in the national memory and commemorative landscape? Scholars in the past ten years have thought extensively about that question. For some, the mixed legacy of the war—the problem Americans had in settling on one consensus narrative of its effects and significance—caused it to fade from memory. For others, it was because America’s memorial construction—of extensive military cemeteries and monuments—often happened in France, far from the eyes of successive generations of Americans. And this despite WWI-era America instigating memorial practices such as the repatriation of bodies, the Tomb of the Unknown Soldier, and the Gold Star Mothers that remain significant today. For historian Michael Kazin, in contrast to other national literatures, American war literature was simply not good enough to help sustain the war in American memory; most of the poetry produced about it was ‘doggerel,’ and only one war novel—Ernest Hemingway’s A Farewell to Arms—is still widely read.

Yet perhaps one way to reconnect with the war’s legacy to the U.S. is just to look closer at the American literature we do still read from the period. War veterans populate novels such as F. Scott Fitzgerald’s The Great Gatsby (1925), William Faulkner’s As I Lay Dying (1930), Willa Cather’s The Professor’s House (1925), and Nella Larsen’s Passing (1929). Some of e.e. cummings’ most brutally satirical political poems are about WWI. Many of the preoccupations of 1920s literature—from heartbreaking flappers to horrendous car crashes, or the urbane confidence of the Harlem Renaissance—have deep roots in trench combat and in the social and technological transformations of American war mobilization. Scratch the surface, then, and the literature of one of the great decades of American literary production—the 1920s—is saturated with reflections on the war. And perhaps a bit of such digging would be no bad thing, as the questions these authors and others considered—How does one write the truth about war, or register its horrific violence in ways noncombatants can understand? What is the value of military service to civil society? At what point do the necessary disciplines of fighting a war fatally damage the liberties one is fighting for in the first place? And how do we craft ways of remembering that are capable of including all the people who sacrificed and served?—have an undiminished relevance. Sometimes the U.S. authors who wrote about WWI had answers to these questions; sometimes not, but their struggles to address them remain, in many ways, our own.

Revisiting the My Lai Massacre almost 50 years later


On 16 March 1968, American soldiers entered a group of hamlets located in the Son Tinh district of South Vietnam. Three hours after the GIs entered the hamlets, more than five hundred unarmed villagers lay dead, killed in cold blood.

The My Lai massacre remains one of the most devastating events in American military history. Initially covered up by military authorities, the events of that day slowly came to light and caused international outrage, eventually leading to the prosecution of nearly thirty United States Army officers.

In the following excerpt from My Lai: Vietnam, 1968, and the Descent into Darkness, historian Howard Jones examines the aftermath of one of the darkest days in military history.

How should we look at My Lai now, nearly fifty years after the events? For most Americans, it was a rude awakening to learn that “one of our own” could commit the kind of atrocities mostly associated with the nation’s enemies in war. Even to those who defended the American soldier, his image changed from citizen-soldier to baby killer—from poster boy hero and virtuous protector of the defenseless to cowardly murderer and rapist. It seemed impossible to reconcile My Lai with the concept of the United States as a chosen nation—an exceptional nation—built on republican principles and predestined by God to spread freedom throughout the world. In his memoirs after he had left the presidency, Nixon expressed the opinion of many Americans when he called it an aberration, unrepresentative of our country.

From one perspective, the story of My Lai came full circle on 10 March 2008, when Pham Thanh Cong, director of the Son My War Remnant Site and a survivor of the massacre, met at My Lai with former corporal Kenneth Schiel, a participant in the killings and the first member of Charlie Company to return to the scene. Cong had lost his mother and four siblings that day in My Lai and was surprised at Schiel’s appearance. Less than a week before the proceedings commemorating the fortieth anniversary of the massacre, they spent three hours discussing the events of 16 March 1968. Cong described the meeting as tense, though he appreciated Schiel’s effort to atone for what had happened. At first Schiel did not admit to killing Vietnamese civilians. In the end, however, he apologized, even while continuing to maintain that he had been following orders. In August 2009, Cong would learn that in the United States William Calley had spoken publicly for the first time about his role in the killings.

Unlike Schiel, Calley refused to return to My Lai. Like Schiel, he claimed to have been following orders and felt no personal responsibility. To his friend Al Fleming, Calley still maintained, “I did what I had to do.”

How exceptional was My Lai? In The Guns at Last Light, Rick Atkinson shows that in the closing months of World War II American troops committed a number of horrific crimes against the French populace after landing in Normandy in 1944. Atrocities also took place in America’s other wars, including the Mexican War, the Civil War, the Spanish-American War, World War I, the Korean War, and, most recently, in Iraq and Afghanistan. To many Americans, however, Vietnam seemed to offer more examples, perhaps in part due to the war’s longevity. In Tiger Force, Michael Sallah and Mitch Weiss uncovered a series of atrocities and mass killings of Vietnamese civilians just below Da Nang, committed by an elite army contingent over the course of seven months beginning in May 1967. Nick Turse, in Kill Anything That Moves, argues that US soldiers killed civilians throughout the Vietnam War as a result of government policies that made atrocities acceptable. My Lai was thus one of many.

The mass killings of civilians, Turse argues, were “the inevitable outcome of deliberate policies, dictated at the highest levels of the military,” resulting in a “veritable system of suffering.” These policies established the conditions conducive to atrocities—a war of attrition based on body counts, search-and-destroy missions, free-fire zones, and soldiers trained to see the enemy as subhuman.

Turse draws heavily on thousands of pages of documents collected by the Vietnam War Crimes Working Group, a secret task force working out of the army chief of staff’s office created by the Pentagon in 1970. The documents gathered by this wartime investigation, declassified in 1994, recorded hundreds of atrocities committed by US forces in Vietnam. Eight boxes of these materials, all extracts from the now-open CID and Peers Inquiry files, focused on My Lai, however, making it stand out from the others. General Westmoreland emphasized this point in his report. “The Army investigated every case, no matter who made the allegation,” but “none of the crimes even remotely approached the magnitude and horror of My Lai.” Whereas many of these atrocities in other parts of Vietnam came by air and at night, every victim at My Lai was killed during the day, many of them less than five feet away while facing their killers.

My Lai simply stands out, in part because of the numbers. The marble plaque located near the entrance to the museum at the Son My War Remnant Site in My Lai lists 504 victims: 231 males and 273 females—seventeen of them pregnant. More than half of those killed—259—were under twenty years of age: forty-nine teenagers, 160 aged four to twelve years, and fifty who were three years old or younger. Of the remainder, eighty-four were in their twenties and thirties, and the rest ranged from their forties to the oldest at eighty. The numbers do not tell the whole story, but they say a great deal.

More than forty soldiers apparently took part in killing civilians. Of all the facts that emerged from the many investigations and reports, perhaps the most chilling is that not a single soldier on the ground tried to stop the killing.

My Lai made it imperative that the army institute major changes in training aimed at developing what chief prosecutor William Eckhardt called “professional battlefield behavior.” To understand the importance of restraint in combat, soldiers and officers must learn to disobey illegal orders. The only way to bring this about, Eckhardt insisted, was “to plainly state that the intentional killing without justification of noncombatants—old men, women, children, and babies—is murder and is illegal.” No one prior to My Lai had considered it necessary to teach US soldiers something so “obvious”; My Lai had made the obvious necessary.

Nothing today could ease the pain of what happened at My Lai, but it is crucial that we do not allow this tragedy to slip from memory.

Zebulon Pike’s journey across the Louisiana Purchase


On 15 July 1806, Lieutenant Zebulon Pike departed St. Louis at the head of a military expedition to explore America’s public lands. The recently acquired Louisiana Purchase as yet held no states and almost no private property owners—at least not in the Lockean sense by which the country conferred exclusive individual rights to pieces of land. The expedition’s anniversary offers an opportunity to meditate on public lands through the eyes of early Americans such as Pike and President Thomas Jefferson.

That July, most of the several major western expeditions that Jefferson launched during his presidency had returned, bearing news of tantalizing discoveries. His most famous, headed by Meriwether Lewis and William Clark, was bobbing homeward on the rivers of Montana. The last, commanded by Pike, received its instructions from General James Wilkinson, a notorious shyster who nevertheless gave Pike instructions similar to Lewis and Clark’s: meet the people who live there, persuade them to be our friends, and see what’s useful out there. It was, essentially, an errand to turn public lands claimed into public lands controlled and to help the national government figure out what to do with them.

Pike visited Pawnees on the Great Plains, attempted to climb Pikes Peak, and explored the southern Rockies, before falling into the hands of Spaniards in that disputed space where American public lands met the Spanish crown’s. After a Spanish escort across Texas to the United States, Pike reported to Jefferson and Congress on what he had found. Native peoples, he said, considered the lands theirs. Some were friendly to the United States, and some were resistant. All would have to be wooed before the republic could use the land. Even then, it held little value for development. Missouri looked promising, but the rest reminded him of African deserts. Traveling during one of the West’s episodic droughts, he called it “barren” and “parched.”

Across the middle of the United States, country roads and other byways often follow the old gridlines of the original surveys. Here, near Pike’s route, Jefferson’s brainchild is manifest on the central Kansas landscape. Used with permission of photographer, Jared Orsi.

Its uselessness, however, might be useful. Uninviting, infertile Louisiana might be a brake on the expansion of private land ownership and thus a reserve against the widely feared possibility that a republic could not hold the loyalties of a people flung across a large geographical extent.

Pike, then, envisioned some lands reserved for Indians, some dispersed to settlers, and some to be left alone as a buffer between them.

His ideas mirrored Jefferson’s. Famous for writing the Declaration of Independence and engineering the Louisiana Purchase, Jefferson left an equally important, though less renowned, legacy through the Land Ordinance of 1785.

Conceptualized by Jefferson, the act established a systematic process by which land owned by the national government would be surveyed into a grid of parcels of equal size and congruent shape. Most plots would be dispersed at inexpensive prices to private claimants, but one of every thirty-six would be reserved for a township and another for a school.

Variations of this combination of dispersal and reservation have unfolded ever since. Congress passed, amended, and repassed numerous homestead acts to transfer the public lands into private hands. Railroads secured grants of land, which they sold to pay off the bonds for laying track. Colleges, such as the one where I work, were founded with the proceeds of sales of federal land grants to states, supplying democratic, low-cost, practical higher education in agriculture, engineering, water development, mining, military science, and teacher education.

Arkansas River Valley in western Kansas about the time of year Pike encountered it. Pike traveled at the driest season and in a year of below-average precipitation, leading him to judge the Great Plains of little economic value. Used with permission of photographer, Jared Orsi.

The federal government reserved some lands for water rights. Along international boundaries, a strip of public lands facilitated border enforcement. Tribal reservations were carved out of still other national holdings. In the late nineteenth century, cutover forests, eroded soils, and tawdry tourist traps spurred the federal government to create national forests, national parks, and other landscapes managed toward publicly desired outcomes.

Various as these purposes were, all fit within Jefferson’s original flexible design of surveying public lands, reserving some, dispersing many, and managing all.

Two lessons can be found in these events. First, the architects and implementers of the U.S. public lands system envisioned it as an engine of development, but not only that: national lands might serve the interests of the American people in many ways. The second lesson is simply history. Americans’ objectives for their public lands have changed over time. What made sense in 1785 differed from the republic’s needs during the Civil War. Neither era perfectly anticipated the Progressive Era or the twenty-first century. Jefferson could not have imagined the value of railroad grants or national parks, but he designed a system flexible enough to accommodate the dreams and schemes of subsequent Americans who did.

The political legacy of Andrew Jackson


In response to the elitism of the Founding Fathers, Andrew Jackson shaped his legacy as a political rebel and devoted representative of the common man. Today, that legacy has become a source of controversy. His advocates view him as a hero who promised to maintain democratic tradition and protect American values. But new evidence has revealed the immorality behind his policies and legal regimes.

In the following excerpt from Avenging the People, historian J.M. Opal discusses these interpretations of Jackson’s presidency, and disputes the enduring image that Jackson painted of himself.

Sometime after rising to international fame in 1815, Andrew Jackson lamented that his critics had him all wrong. Whether from ignorance or malice, they spread rumors and lies about his actions and motives. They also smeared his wife, Rachel, with whom he often shared his sense of persecution.

Although most Americans seemed to worship him, these attacks were still painful, for they hit core parts of the general’s self-image. His “settled course” in life, he told a trusted friend, was to honor the rigid code of behavior with which his mother had once entrusted him. He was, he believed, completely devoted to just dealings and always careful to avoid insults. He was, he insisted, especially loyal to the US Constitution. Yet his enemies said that he was “a most ferocious animal, insensible to moral duty, and regardless of the laws both of God and man.”

Image credit: “Portrait d’Andrew Jackson” courtesy of The General Libraries, The University of Texas at Austin. Public Domain via Wikimedia Commons.

I take issue with many things that Jackson wanted people to think about him. In particular, I question his place in America’s democratic tradition, drawing attention to the popular efforts and egalitarian ideas that he and his allies helped to bury. I try to avoid the strong pull of his personal legend and the historical narratives that bear his name, often moving him off the center of analysis to better see the people, places, conflicts, and choices that made him. I do not doubt the sincerity of Jackson’s belief in his own lawfulness, nor even the accuracy of that belief. He really did believe in the law. He certainly wanted justice. And his efforts to inflict his versions of both defined his life and career in ways that his other roles and identities— an Irishman, a southerner, a westerner, a soldier, a slave owner, a Democrat— cannot explain. His life was a mission, the mission was just, and its enemies would be judged.

Jackson was sure that his duties were authorized at the highest levels, and for good reason. His views on civil order and property rights often aligned with those of America’s first national leaders, who were also keen to draw the new republic into a larger society of “civilized” states. In this sense he was a proper nationalist. On the other hand, Jackson took an oath to a European monarch, was implicated in two secessionist plots, accused the federal government of suicidal cowardice, and threatened to incinerate a US government building and official. He chafed at the national terms of the rule of law and had little use for competing forms of American fellowship and sovereignty. A long series of regional traumas and global crises made him a particular kind of hero in 1814–15 and again in 1818–19, a larger-than-life “avenger” with a passionate bond to the American “nation.” His very name evoked a set of feelings and stories that marked Americans in some essential way— in their very blood, as the saying quite appropriately goes.

When America set about claiming its independence, Americans had little choice but to think critically about the rule of law and its relation to natural rights. The king and his minions were waging a “most cruel and unjust war,” the first constitutions of Vermont and Pennsylvania declared. They were pursuing the good people with “unabated vengeance.” South Carolina’s new charter accused the British of conduct that would “disgrace even savage nations,” while New Jersey’s deplored a “cruel and unnatural” hostility that left the people exposed to “the fury of a cruel and relentless enemy.” The king had not just withdrawn his protection, North Carolina reported, but had also declared open season on American persons and property, risking “anarchy and confusion.” Seeking allies in Europe, Benjamin Franklin stunned his British counterparts by accusing the empire of “Barbarities” once associated with frontier scalp hunters.

Stories of imperial savagery lent narrative form and moral purpose to eight years of war, during which some 40% of the free male population over 16 served in either a Patriot militia or the Continental Army. The larger theme of existential peril reappeared for the next 50 years, framing the life of Andrew Jackson, among many others, and shaping almost everything they said about virtue and republics, society and sovereignty, nation and allegiance. Unsure if the British Empire would let them live, they wondered if the rule of law would ever replace the state of nature. Unsure if the law of nations constrained any of the “civilized nations,” especially after the French and Haitian Revolutions set the world aflame, they argued over how and if they should respect the same standard. In so doing, they also debated how and if they were a “nation,” as well as a republic or union.

When and where was vengeance just and lawful, and when and where was it cruel and criminal? Who had the right to take it on behalf of the reinvented people? For Jackson, these questions evoked memories so awful that the usual terms of law and politics did not apply, demanding new bonds of holy wrath and redeeming blood. His arguments thrilled many Americans and disgusted others.

Jackson’s beliefs were often more vehement than popular, and though he never hid them he also learned how to change the conversation. Ultimately, the sort of nationhood Jackson came to embody left Americans with a diminished sense of the law and their right to make it, indeed with less power to be the nation they wanted to be.

That someone else: finding a new oral history ancestor
In April Allison Corbett shared her reaction to Dan Kerr’s article “Allan Nevins Is Not My Grandfather: The Roots of Radical Oral History Practice in the United States,” explaining the roots of her own radical oral history practice. Today we hear from Benji de la Piedra, as he shares another oral history origin story from his research on the Federal Writers’ Project. Enjoy his insights, and check out our call for submissions here, if you’d like to contribute your own reflections.

“Ever since the Federal Writers’ Project interviews with former slaves in the 1930s, oral history has been about the fact that there’s more to history than presidents and generals.” –Alessandro Portelli

Dan Kerr acknowledges in his article, “Allan Nevins Is Not My Grandfather,” that most historians of oral history tend to dismiss the Federal Writers’ Project (FWP) as a mere “prehistory” of the field, because the vast majority of FWP interviews were recorded with pen and paper rather than by machine. However, in the research that I conducted towards my M.A. thesis in oral history, I discovered for myself the untapped potency that the FWP holds for oral historians who seek an origin story more closely aligned with the field’s impulse towards effecting social change.

Started in 1935 as part of the New Deal’s Works Progress Administration, the Federal Writers’ Project put thousands of unemployed writers to work on assignments that served the FWP’s ambitious cultural agenda: to foster a badly needed renewal of the United States’ self-image, and to forge a new American unity through celebration of unrecognized American diversity. As Jerrold Hirsch writes, the cohort of public intellectuals directing the Project—Henry Alsberg (national director), Sterling A. Brown (editor of Negro affairs), Morton Royse (social-ethnic studies editor) and Benjamin A. Botkin (folklore editor)—sought to imbue the nation’s public life with “a cosmopolitanism that encouraged Americans to value their own provincial traditions and to show an interest in the traditions of their fellow citizens.”

FWP writers pursued this pluralistic aim through a practice that I think of as proto-oral-history fieldwork. All across the country, the writers spent much of their workdays conducting interviews with people traditionally excluded from the process of history-writing: the working poor, immigrants, women, and people of color (including those who had been born slaves). The Project intended to use the testimony furnished by the interviews as fodder for both an American Guide Series—a set of guidebooks, one for each state in the Union—and Composite America, a series of cultural anthologies that would reveal overlooked strands and narratives of American culture to the wider public.

If you are an oral historian seeking a new grandfather—one with greater aesthetic concerns, democratic objectives, and community-based ethics than Allan Nevins—I recommend you check out the leading soul and intellect of the FWP’s interviewing program: B. A. Botkin (1901–1975). I first encountered Botkin in the introduction to Ann Banks’ First Person America, a book that curates about eighty extracts from the almost 10,000 interviews produced by FWP fieldworkers, and was the result of Banks’ own pioneering effort to survey and catalogue the entire collection of interviews, which had sat unexamined in a set of file cabinets at the Library of Congress for more than thirty years after the Project was disbanded.

In the introduction to her book, Banks celebrates Botkin’s “unconventional approach to the subject of folklore” as a crucial influence on the Federal Writers’ interview methodology. Botkin “wanted to explore the rough texture of everyday life,” Banks writes, “to collect what he called ‘living lore’…Again and again, he stressed the importance of the process of collecting narratives. The best results, he wrote, were obtained ‘when a good informant and a good interviewer got together and the narrative is the process of the conscious or unconscious collaboration of the two.’”

Banks goes on, “Benjamin Botkin called for an emphasis on ‘history from the bottom up,’ in which the people become their own historians. He believed that ‘history must study the inarticulate many as well as the articulate few.’ The advent of tape recorders in the years following the 1930s has refined the practice of what has come to be called oral history and made it possible for Botkin’s goals to be pursued more easily.”

In other words, Botkin instructed the Federal Writers to approach their interviews dialogically, as intersubjective exchanges built upon a shared authority, decades before these central concepts were so named in the field of oral history. Botkin saw the potential for this interview technique to drive a radically inclusive rehabilitation of American life, decades before the popular education and people’s history movements that Kerr recovers in his article.


Botkin deeply appreciated the pedagogical and integrative function of the work that we now call oral history. His desire to make the archive produced by FWP fieldworkers accessible to an “ever-widening public,” to “give back to the people what we have taken from them and what rightfully belongs to them in a form that they can understand and use,” led him to declare the FWP’s interview program “the greatest educational as well as social experiment of our time.” While the outcomes of this experiment varied in quality, social justice-oriented oral historians will continue to find Botkin’s impressive body of thought a particularly germane touchstone for their work. Why? Because Botkin’s method and theory of interviewing took relationships seriously. Botkin prized the meaningful encounter—the “mutual sighting,” to use Portelli’s phrase—as the foundation for not only a successful interview, but also a healthy democracy.

Botkin refined this ideology in the years following his tenure with the FWP, when he elaborated a public-facing research practice that he called “applied folklore.” Botkin used this term broadly, “to designate the use of folklore to some end beyond itself…into social or literary history, education, recreation, or the arts.” He identified the basic impulse of applied folklore as “the celebration of our ‘commonness’—the ‘each’ in all of us and the ‘all’ in each of us…an interchange between cultural groups or levels, between the folk and the student of folklore.” And anticipating the highest aims of contemporary historical dialogue work, Botkin writes, “The ultimate aim of applied folklore is the restoration to American life of the sense of community—a sense of thinking, feeling, and acting along similar, though not the same, lines—that is in danger of being lost today. Thus applied folklore goes beyond cultural history to cultural strategy.”

In my recent work as Project Trainer for the DC Oral History Collaborative, I have constantly recalled Botkin as a personal guide. I have encouraged my interviewers to be themselves in the encounter; to relax their impulse to control the dialogue and instead follow, as Botkin instructed his Federal Writers, “the natural association of ideas and memories”; and to practice framing their narrators as valuable witnesses of their neighborhood, school, and migration histories. I have done this in the spirit of fostering what the Federal Writers’ Project aimed for nationally—“an inter-regional synthesis”—within the densely diverse and still too segregated scope of our nation’s capital.

Featured image credit: “Federal Writers’ Project presentation of Who’s who at the zoo” by unknown, Public Domain via Wikimedia Commons.

A dangerous mission: loyalty and treason during the American Revolution

In September 1776, Nathan Hale and Moses Dunbar set out to support opposing forces in the American Revolution. Hale, a spy for the Continental Army, had volunteered to gather intelligence against the British. Dunbar had enlisted in the King’s Army and was commissioned to convince other young men to turn against the United States.

Both men were caught and executed before completing their missions—one remembered as a martyr and the other as a traitor to the American cause.

In the following excerpt from The Martyr and the Traitor, Virginia DeJohn Anderson compares the lives of these two men, and explores the differences that led them to a similar fate.

The American Revolution was at once a national, a continental, and an imperial phenomenon. It produced a new American republic, rearranged power relations and territorial claims across North America, and altered Europeans’ global empires. It inspired stirring statements about universal rights and liberties even as it exposed disturbing divisions rooted in distinctions of class, ethnicity, race, and gender. It affected—directly or indirectly, and often adversely—not only American colonists and Britons, but also French, Spanish, and even Russian colonists, Native Americans, and Africans and African Americans. The more we learn about it, the more complicated the Revolution appears.

For people who lived through it, the Revolution was even more confusing. Political upheaval and warfare intruded upon their households and communities, causing unprecedented disruptions and forcing them to take actions with unpredictable consequences. Driven by high-minded principles, self-interest, or a mixture of both, participants reacted to a multitude of factors—many of them local and highly personal—that loomed large for them but barely registered in subsequent grand narratives of the Revolution. This was as true for Indian peoples weighing the relative merits of neutrality versus alliance with one of the contending sides, African slaves pondering British invitations to seek their freedom, and French and Spanish officials along the Gulf Coast tracking developments on distant battlefields as it was for British Americans in the 13 rebellious colonies.

Statue of Nathan Hale, the first American executed for spying for his country. This statue is a copy of the original work created in 1914 for Yale University, Hale’s alma mater. Image credit: image provided by the Central Intelligence Agency. Public Domain via Wikimedia Commons.

Anxiety led many of those Americans to look beyond their British adversaries and detect secret enemies closer to home, thereby transforming the War for Independence into a civil as well as an imperial conflict. Internecine strife erupted in such places as the southern backcountry and parts of Pennsylvania, New Jersey, New York, and Long Island, fracturing communities and even families. It also broke out in Connecticut, perhaps the least likely setting for such internal discord. Yet even there, neighbors who shared similar backgrounds in terms of religion, race, ethnicity, and economic status found occasion during the revolutionary tumult to fear and hate one another.

The stories of Moses Dunbar and Nathan Hale were deeply rooted in that Connecticut countryside and those unsettled times. Both men started out in life as sons of striving farmers laboring in agrarian villages whose inhabitants took for granted their membership in Britain’s empire. Although Hale and Dunbar never met, Connecticut was a small enough place that they had common acquaintances. Neither man’s choice of allegiance during the Revolution was foreordained; rather, it developed fitfully in the context of preexisting social relationships that initially had nothing to do with politics. Those personal connections became politicized as armed conflict neared, driving Dunbar to oppose American independence and Hale to support it. The challenge of balancing private responsibilities toward friends and family against the public demands of politics and war during such perilous times vexed many—if not most—colonists, no matter which side they were on. For very few of them, however, did engagement with that struggle lead to the gallows, as it did with these two men.

The deaths of Nathan Hale and Moses Dunbar might have been exceptional, but their lives were not. Their tragic stories offer a particularly dramatic demonstration of a common experience, showing how a welter of personal and political factors could confound people’s efforts to exert control over their lives in the midst of the Revolution.

Matters of timing were especially crucial to Hale and Dunbar and their posthumous reputations. Each man undertook the action that led to his death at a moment when nearly everyone believed that Britain stood poised to win the war. Had that happened, their respective roles as martyr and traitor would have been reversed. Posterity often takes America’s victory for granted; neither of these men—nor others in their communities—dared to do so.

The War for Independence is often seen as a “good war” with righteous patriots pitted against misguided, if not evil-minded, Britons and loyalists. Such an oversimplified popular version of events distorts what was a far more tangled history and ignores the participation of a far larger cast of characters, many of them living well beyond the bounds of the 13 original colonies. It does not even apply to the experiences of revolutionaries and loyalists in a small place like Connecticut, where no faction held a monopoly on principle. Each man met his death for acting in accordance with his beliefs. Nathan Hale and Moses Dunbar are both worth remembering because their tragic fates represent two sides of the same coin. They are equally part of America’s Revolutionary story.

Conquering distance: America in the Pacific War

Following a wave of Japanese attacks, the American, British, Canadian, and Dutch forces entered the Pacific War on 8 December 1941. As American forces moved across the Pacific they encountered a determined and desperate enemy and a harsh inhospitable environment. By early 1944, armed with new fast carriers, the Americans stepped up the pace of operations and launched the campaigns that would bring them to the doorstep of the Japanese homeland. But every step closer to Japan was a step farther from the United States.

The Americans confronted a host of obstacles in waging war across the Pacific, but of these, perhaps the most elemental and intractable was distance. The vast expanses of the Pacific mocked American efforts to bring the war to a speedy conclusion. As Hanson Baldwin, the New York Times military analyst, noted, “[I]n the Pacific, distance is against us, and the problems of supply are mammoth.” The accompanying map, “Comparative areas of the United States and Southwest Pacific,” vividly illustrates the scope of the problem.

Distance strained American productive capacity and tied up shipping. A round trip voyage from San Francisco to Manila took on average sixteen weeks, three times as long as a circuit from the East Coast to France. That meant that the U.S. needed three ships in the Pacific to do the work of one in the Atlantic.
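To make the logistics arithmetic explicit, here is a minimal worked check using only the figures above, under the simplifying assumption (not stated in the original) that each ship shuttles continuously and both theaters must receive cargo at the same rate:

\[
\frac{N_{\text{Pacific}}}{N_{\text{Atlantic}}} = \frac{t_{\text{Pacific round trip}}}{t_{\text{Atlantic round trip}}} = \frac{16 \text{ weeks}}{16/3 \text{ weeks}} = 3.
\]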

Hanson Baldwin was the military editor of the New York Times. He won a Pulitzer Prize for his work during WWII. Credit: “Hanson Baldwin” via the US Army Pictorial Center. Public domain via Wikimedia Commons.

When the Allies launched the invasion of Normandy, they had the advantage of gathering their forces in England, an industrially advanced nation, roughly fifty miles across the channel from their destination. Luzon, from which American forces would stage in preparation for the invasion of Japan, was approximately 1,400 miles from Kyushu, the southernmost of Japan’s main islands.

War in the Pacific was a war of bases, air and naval, but the Americans had to build them as they advanced, bringing with them the necessary construction materials, equipment, and manpower as they moved forward. In the Pacific, once the Allies moved beyond Hawaii and Australia, there were no modern ports capable of handling the large quantities of men and materials needed to sustain an advancing army. Often, supplies had to be shuttled in from ships anchored off the coast of an island and then hauled across the beaches on tracked vehicles. Scarcity of shipping and shortages of trained construction and engineer battalions hampered movement and became increasingly serious as American forces closed in on Japan in the final year of the war.

As Americans broke into the inner ring of Japan’s island defenses they prepared to redeploy millions of men and enormous amounts of equipment from Europe to the Pacific. The Army’s head of the Service of Supply succinctly captured the herculean scope of this effort when he compared the transfer of men and equipment from Europe to the Pacific to “moving all of Philadelphia to the Philippines.”

Comparative Areas of the United States and the Southwest Pacific. Image Credit: Office of the Chief Engineer, General Headquarters, Army Forces Pacific, Engineers of the Southwest Pacific, 1941-1945, vol. 1, Engineers in Theater Operations, Reports of Operations, United States Army Forces in the Far East, Southwest Pacific Area, Army Forces Pacific, Historical Reports (Washington, D.C.: U.S. Government Printing Office, 1947).

In the summer of 1944, the war entered a new phase with the capture of the Marianas Islands. In September, only a month past the end of organized resistance in the Marianas, the Navy seized a new fleet anchorage, Ulithi, an atoll in the middle of the Philippine Sea large enough and deep enough to hold over 600 ships. Fleet service vessels—oilers, store ships, water tankers, ammunition ships, vessels for spare parts and temporary repairs—now moved forward to Ulithi from Eniwetok, thereby positioning themselves 1,400 miles closer to the enemy. When the fleet remained at sea, refueling took place at rendezvous points on the water just out of range of the enemy. Tankers also transferred mail, and escort carriers transferred replacement planes and pilots. Seagoing tugs were added to tow disabled ships to floating dry docks at Seeadler Harbor in the Admiralties or Guam. With an extended and extending supply system, the fleet had extraordinary reach.

So too did the Army Air Forces, although the limitations imposed by the Pacific’s vast expanses continued to tax American resources. In late November, the first B-29 bombers were attacking Japan from the Marianas. The Marianas were 6,000 miles from San Francisco and 1,500 miles from Tokyo. An enormous effort was required to build airfields and satisfy the B-29’s insatiable appetite for bombs, parts, and fuel. Americans also struggled to provide the crews needed to keep the B-29s flying. A round trip flight to Japan took twelve hours and crews were expected to fly no more than 75 hours per month. Even after that ceiling was raised to about 100 hours, the shortage of trained crews threatened to limit the effectiveness of the Twentieth Air Force.
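The crew-hour ceiling converts directly into a maximum sortie rate. As a rough illustration from the figures above (ignoring training, ferry, and weather-abort time, which the original does not quantify):

\[
\frac{75 \text{ hours/month}}{12 \text{ hours/mission}} \approx 6 \text{ missions per crew per month}, \qquad \frac{100 \text{ hours/month}}{12 \text{ hours/mission}} \approx 8.
\]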

Hanson Baldwin rightly described some of the numbers mentioned above as “the dry but eloquent statistics of the Pacific war.” The American war in the Pacific was a matter of establishing a functioning fighting system, of taking hold of all the complex and intricate war components and moving thousands of miles across the Pacific to the shores of Japan. It was all put together and transformed, again and again, in the crucible of battle.

Curing (silent) movies of deafness?

Conventional wisdom holds that many of the favorite silent movie actors who failed to survive the transition to sound films—or talkies—in the late-1920s/early-1930s were done in by voices in some way unsuited to the new medium. Talkies are thought to have ruined the career of John Gilbert, for instance, because his “squeaky” voice did not match his on-screen persona as a leading male sex symbol. Audiences reportedly laughed the first time they heard Gilbert’s voice on screen. And in the case of the late silent era’s most popular female performer, the original “It girl” Clara Bow, a voice sometimes described as a “honk,” a strong Brooklyn accent, and careless diction are often said to have forced her into retirement at the relatively young age of twenty-eight.

The real issue, however, was less the voices than the essence of the art. Despite our early-twenty-first century use of the word “movie” to refer to any cinematic production, silent movies and talkies differed substantially. Where talkies relied upon spoken words to communicate plot, ideas, and emotions, silent movies communicated visually using, as the name suggests, pictures that moved. In short, as seeing differs from hearing, so too movies differed from talkies.

The essence of silent movies was visual. As one observer at the time put it, in silent movies “People are doing something. We see them do it; even if they are only thinking or feeling…, we still see it.” Writing a movie column for a Chicago newspaper in the 1920s, Carl Sandburg marveled especially at actor Charlie Chaplin’s ability to convey complex ideas and emotions “with shrugs, smiles, solemnities, insinuations, blandishments.” Chaplin’s visual “sentences” were so “alive with gesture and intonation,” that Sandburg—one of the great American writers—could not imagine “reproduc[ing] any story Charlie Chaplin tells verbally.”

The visual nature of silent cinema made it particularly interesting to deaf Americans, for whom visual forms of communication were more natural than audible ones. In fact, historian John Schuchman argues that the silent movie era “represents the only time in the cultural history of the United States when deaf persons could participate in one of the performing arts with their hearing peers on a comparatively equal basis.” Alice T. Terry, a leading figure promoting a uniquely Deaf cultural viewpoint in the early-twentieth century, explained why: “[T]he movies [are] pre-eminently the place for pantomime or signs;” they have “no use for speech and lip-reading.”

Lon Chaney and Mary Philbin in the 1925 silent movie The Phantom of the Opera— although the “grisly mask” he wore limited his facial expressiveness, “Chaney’s best acting, with his hands, is to be seen in this picture,” according to reviewer Carl Sandburg. Public Domain via Wikimedia Commons.

On the other hand, deaf Americans in the late-1920s understood better than anyone the vulnerability of visual forms of communication—non-verbal and non-audible—when confronted by the forces of normality. Beginning after the American Civil War, a number of individuals interested in deaf education—people such as telephone inventor Alexander Graham Bell—stressed the need to eliminate sign language and other visual/manual forms of communication from the nation’s deaf schools in favor of teaching verbal/audible communication in English. At the risk of over-simplification, their reasons—which included evolutionary theory, eugenics, and assimilation—boil down to an assertion that oralism could effectively cure deafness, enabling deaf people to live more effectively in “the normal world.”

By the end of the 1920s—a decade famously proclaimed an era of “normalcy” by then-presidential candidate Warren G. Harding—the oralist triumph in deaf education was nearly complete. According to data compiled in 1928, over 90 percent of deaf children received some or all of their education via oral methods and just two exclusively manualist schools remained (both were segregated institutions for the African American deaf). Oralists were much less successful reaching people outside the classroom, however. Sign language stayed alive in the Deaf community until the pendulum swung back in the manualist direction later in the twentieth century.

The normalizing attitudes and beliefs that underpinned oralist efforts in deaf education also afflicted ideas about modes of communication at the cinema. Even before talkies became a reality, the world depicted in silent movies was being described as “a dessicated [sic] and dehumanized world, from which all intrinsic worth has departed.” Similarly, the premiere of The Jazz Singer (1927)—widely considered the key film in the transition to talkies—was hailed at the time as “an upward step in [human] racial development which is dependent, absolutely, on the arts of communication.” The silent pictures that continued to appear during the next few years soon began to be referred to as “dummies” or “dumbies.” This, of course, recalls a common slur directed at deaf people, implying that a lack of oral and audible communication was connected to a lack of intelligence.

As in deaf education, the shift from movies to talkies met resistance. Charlie Chaplin was the most prominent resister, insisting that “I can say far more with a gesture than I can with words” and vowing “never [to] use dialogue in my pictures.” “It stands to reason,” added “It girl” Clara Bow, “that you can’t act as well when you have lines to think about, particularly those of us who have never been trained to talk while we act.” No less a movie authority than inventor Thomas A. Edison concurred, lamenting that because talkies required actors to “concentrate on the voice” they had “forgotten how to act.”

Nevertheless, within three years of The Jazz Singer’s appearance, virtually all new production of silent movies in Hollywood ceased. Eventually even Chaplin relented and began using audible dialogue to tell his filmed stories—though not until 1936. In the years since, only The Artist (2011) stands out as an attempt to make a true movie, in which the pictures and actions of the performers, rather than their spoken dialogue, tell the story; but even the success of that Academy Award-winning picture has not spawned copycats.

The movies had been cured of their deafness—or at least taught to “talk like living people,” as an advertisement for one theater sound system put it—but at what cost?

Featured image: Joyce Compton and Clara Bow in The Wild Party. Public Domain via Wikimedia Commons.

The perils of political polarization

Political polarization in the United States seems to intensify by the day. In June 2016, surveys conducted by the Pew Research Center revealed that majorities in both parties held highly unfavorable opinions of their opponents. Many Democrats and Republicans even admitted to fearing the rival party’s political agenda. Such strong feelings have scarcely dissipated—and likely escalated—since those surveys were completed.

This is hardly the first time in American history when polarization has plagued the nation. Divisions over slavery sparked a civil war in the 1860s. A century later, the war in Vietnam and the civil rights movement generated fierce dissension. Less well known (except to historians), the 1790s witnessed such ferocious discord between Federalists and Democratic-Republicans regarding the French Revolution and domestic policy that some contemporaries feared the United States might not survive into the nineteenth century.

America’s experience with partisanship actually goes back even farther than that, right to the birth of the nation. The Revolution was not only a struggle against Britain; it was also a civil war, dividing colonists who supported independence from those who did not. Then, as now, one of the greatest dangers of extreme partisanship is the way it drives people to question the motives of their adversaries and even to contemplate violence against them. At the time of the Revolution, anyone who advocated compromise or tried to remain neutral was suspected of being a secret ally of one of the rival groups. For one Connecticut farmer who sought to distance himself from both patriots and loyalists, failure in this endeavor to find a middle ground meant death.

Newgate Prison was used to house prisoners of war during the American Revolutionary War. Credit: “Old Newgate Prison, Connecticut.” by Paul Gagnon. CC BY 3.0 via Wikimedia Commons.

Moses Dunbar made no secret of his allegiance to the king, but that did not mean that he endorsed the Parliamentary taxes and other measures that drove patriots into opposition to Britain. What he rejected, as the imperial crisis escalated in 1774, was the patriots’ eagerness to resort to arms instead of seeking a peaceful resolution to what Dunbar called the “Unhappy Misunderstanding” between Britain and the colonies. Once the war began in April 1775, such calls for moderation had even less chance of being heard.

Fear of British military retaliation drove Connecticut patriots to try to identify suspected loyalists and subdue them by any means possible before they could aid the enemy. Some were confined deep underground in the infamous Newgate prison, an old copper mine. Others—including Moses Dunbar—suffered beatings from patriot gangs. After such an attack and repeated attempts to imprison him, Dunbar beseeched authorities to let him retreat to his farm and avoid the political fray. Neutrality, however, was no longer an option.

Dunbar’s decision to go to British-occupied New York in September 1776 and enlist in a loyalist regiment had far less to do with his allegiance to Britain than with a beleaguered man’s quest for safety for himself and his family. While he was in New York, Connecticut’s legislature declared such an enlistment to be a capital crime. When Dunbar returned home in late December 1776 to fetch his wife and children, a neighbor—likely induced by a desire to deflect patriot concerns about his own political allegiance—alerted authorities to Dunbar’s presence. Dunbar was arrested, convicted of treason, and executed on 19 March 1777.

As it turned out, Moses Dunbar was the only loyalist executed for treason by the state of Connecticut. The authorities clearly wished to make an example of him, but having done so were willing to retreat from using extreme measures against suspected British sympathizers. Perhaps unnerved by the deadly consequences of turning neighbor against neighbor, officials exercised greater restraint in assessing true threats and acknowledged that some people genuinely wished to remain neutral.

Passions inflamed by war typically exceed those generated by peacetime partisanship. Yet the fate of Moses Dunbar in Revolutionary Connecticut offers an unusually vivid example of what can happen when fear and suspicion lead people to translate a far more complex situation into a stark opposition of friends versus foes. At the same time, what happened in the aftermath of Dunbar’s execution demonstrates how a fractured society could begin to heal its wounds.

Once the War for Independence ended, many loyalists fled the United States to live in England or one of its remaining colonial possessions. But many more loyalists—including would-be neutrals mis-characterized by their opponents—stayed in the new nation. Dunbar’s descendants belonged to this group, and some of them continued to reside among the very people who had harassed and betrayed him. Although they never recorded their reasons for doing so, former adversaries exhausted by years of conflict evidently made a decision to cease making enemies of their neighbors. Concentrating on what they had in common rather than what drove them apart promised to be a far more productive way to escape a divisive past and head into the future.

Why can mailboxes only be used for U.S. mail?

Because it is against Federal law to put anything in a mailbox “on which no postage has been paid”: a person caught doing so can be fined up to $5,000, and an organization up to $10,000. Known as the Mailbox Restriction Law, this rule has no counterpart in most countries. But there is more. In the US, people receiving mail must pay for mailboxes, which have to meet government specifications, or provide slots in their front doors through which postal carriers deliver mail. The Postal Service also “owns” our mailboxes and sets all the regulations involving them. Why?

If you go to the USPS website, the answer you get is that mailboxes could become so full of other items and papers that there would be no room to put mail into them. Second, the USPS says it wants “to ensure the integrity of our customer’s mailbox,” meaning only postal workers are allowed to place or remove mail from our mailboxes. History teaches us that while all this is true, there is always more to the story.

Use of First Class mail and package delivery expanded sharply in the early 1900s. Commercial users of postal services found the expense of postage higher than if they delivered their own mail, such as utility companies delivering water and telephone bills, newspapers the daily paper, and department stores their advertisements. So they began using their own carriers to deliver what otherwise would be largely First Class Mail, avoiding paying US postage. At the time, the biggest source of revenue for the Post Office was First Class Mail, and so private carriers were reducing the revenue coming into the postal agency. The US Post Office went to Congress and asked for a law to constrain this competition by making it against the law for anyone else to use a mailbox. In 1934, the New Deal Democratic Congress complied, as the postal system had enormous political power within the Democratic Party because every town and city had postal employees and they voted! The “mailbox restriction” law, as it is often called (18 U.S.C. 1725), gave the Post Office what one government official observed was “a virtual monopoly over mailboxes.” Also, if it found any flyer or other item in the mailbox without postage, the Post Office could force the person putting it in there to pay postage for it even if not delivered by postal carriers.

Did it work? Yes, more or less. It seemed the Post Office had crushed its competition—at least for a while. Flyers, advertisements and newspapers continued to be delivered, just now stuck inside front doors, underneath welcome mats, and left on stoops and front yards. First Class mail still had to go through the Post Office.

Image credit: “Mailbox” by ms.akr via CC BY-SA 2.0 via Flickr.

Then came e-mail in the 1980s, followed by online shopping and banking in the 1990s and early 2000s. The volume of First Class mail dropped every year and, as the quantity dropped, the US Postal Service increased the price of a First Class stamp, which then motivated people to use more e-mail and to start paying their bills online, which further reduced the demand for First Class stamps. The utility companies that had created the problem for the Post Office in the first place increasingly made it possible for customers to pay online. Package delivery services, which offered better service and often cost less than the Post Office, became widely available in the 1990s. This took further business away from the Postal Service.

Until the arrival of the Internet and e-mail, the American postal system was the nation’s largest, most sophisticated information delivery infrastructure. Its role remained central to the movement of facts and all manner of paper-based reading materials, so its power was enormous. Its legacy is, too. For example, a postal employee in 1915 designed those round “tunnel type” mailboxes used in front of homes and businesses; a century later they are still the most widely used. Today, the Postal Service still delivers to over 150 million addresses, but on average American adults have nearly two e-mail addresses, outnumbering their physical addresses. So the US Postal Service remains an important part of the nation’s information infrastructure.

In 2016, it handled 154 billion pieces of mail, employed 600,000 people, and operated out of over 31,000 post offices. The total revenue from the US mailing industry was $1.4 trillion. Of that, $71.4 billion came from the US Postal Service. First Class mail brought in $27.3 billion, still the biggest part of its revenues. As the Postal Service likes to point out, “If it were a private sector company, the US Postal Service would rank 39th in the 2016 Fortune 500.” The humble mailbox continues to be an integral part of our twenty-first century information infrastructure, even if the Post Office no longer has a lock on mail delivery. And you still cannot stuff mailboxes with neighborhood garage sale flyers or other things you want to conveniently leave a neighbor.

Should we still care about the War of 1812?

This summer marks 205 years since the United States declared war on the British Empire, a brief, but critical, conflict that became known as the War of 1812. This is a good opportunity to pause and take stock of its historical significance and relevance today.

The explosion in historical studies prompted by the bicentennial rehabilitated the War of 1812, moving it from a widely disregarded conflict studied by a handful of specialists into the mainstream. The War of 1812 has received a modern makeover: scholars probed the conflict from every angle, considering the roles of race, gender, religion, technology, sectionalism, public opinion, nationalism, Atlantic and global contexts, and more. Included in these studies is some of the best historical scholarship of our young century, and historians and their students unquestionably have a better understanding of the complexities and significance of the war and the era as a whole than ever.

But will the War of 1812 slip back into historical irrelevance in the decades to come?

It might, but it should not. For starters, the War of 1812 provides useful lessons about the relationships between military power, public opinion, and wars’ outcomes. Britain was unquestionably the superior power in 1812, yet it failed to achieve a decisive victory primarily due to the constraints of domestic politics and public opinion. Even tied down by ongoing wars with Napoleonic France, the British had enough capable officers, well-trained men, and equipment to easily defeat a series of American invasions of Canada. In fact, in the opening salvos of the war, the American forces invading Upper Canada were pushed so far back that they ended up surrendering Michigan Territory. The difference between the two navies was even greater. While the Americans famously (shockingly for contemporaries on both sides of the Atlantic) bested British ships in some one-on-one actions at the war’s start, the Royal Navy held supremacy throughout the war, blockading the U.S. coastline and ravaging coastal towns, including Washington, D.C. Yet in late 1814, the British offered surprisingly generous peace terms despite having amassed a large invasion force of veteran troops in Canada, naval supremacy in the Atlantic, an opponent that was effectively bankrupt, and an open secessionist movement in New England.

Why did Britain quit while it was ahead? The reigning Liverpool ministry in Britain held a loose grip on power and feared the war-weary, tax-exhausted public. The War of 1812 had never been popular, particularly in the central and northern manufacturing regions of England, which relied heavily on American markets. Following peace with France, the government feared the true cost of the war with America would be exposed. So the British abandoned their initially harsh terms (which included massive forfeiture of land to Canada and the American Indians) in favor of a quick peace.

The War of 1812 also debunks long-held suppositions that freely elected governments and economic partners do not go to war against each other. The United States and Britain both had elected governments (with the very large caveat that women, slaves, and the poor were excluded from formal participation) that were acutely sensitive to public opinion. The British colonies that would become Canada also enjoyed elected colonial assemblies—some of whose members opted to fight on the side of the United States! The two nations were tied by a common culture and kinship, with the vast majority of Americans tracing their roots to the British Isles and many of Canada’s inhabitants tracing their roots to the United States, and they were tightly bound economically. The United States was Britain’s overseas market and breadbasket, acting as the main supplier of grain for Britain’s forces in Europe and its West Indian slave colonies, even after the War of 1812 started. Meanwhile, Canada was an economic satellite of the U.S. The bulk of Upper Canada’s (now Ontario’s) settler population were ‘late loyalists’—Americans seeking economic opportunities who emigrated in the decades following American independence. Such ties partly explain the widespread assumption of an easy conquest, or what Thomas Jefferson boasted would be ‘a mere matter of marching’.

Perhaps most importantly, the War of 1812 is a poignant reminder that the subjectivity of ‘facts’ has a long history. Then, as now, public perception could trump reality. The war’s conclusion and immediate legacy are a clear example. The Treaty of Ghent, which brought peace between the U.S. and the British Empire, declared no formal winner and called for a reinstatement of borders to their prewar status. Technically, this meant British victory, because the U.S. failed to achieve the aims listed in its declaration of war. Contemporaries, however, saw it otherwise. Few in Britain declared the war a success; the London Times, the most popular newspaper of the day, reflected popular sentiment in a long series of editorials that bitterly lamented Britain’s defeat. In Canada, savvy colonists sought to boost their standing by propagating the false notion that Canada’s survival was owed to the inhabitants’ loyalty, unity, and stoic endurance of great hardships—forging the heart of Canadian founding mythology. In reality, many Canadians fought alongside the Americans, militia turnout was abysmal, and colonists often resented British forces, whose presence disrupted trade and resulted in forced requisitioning of food from hard-pressed farmers, as much as they resented the American ‘invaders’.

In the U.S., President Madison and his supporters declared victory with celebrations that embraced the War of 1812 as a second war of independence. The interpretation of the war as an American success had significant consequences. Andrew Jackson became the hero of these victory legends. A popular Boston broadside greeted the news of peace by calling Jackson “a second Washington”. The populist Jackson personified many qualities of the new American spirit; President Trump, eager to draw similarities between his paradigm-shifting agenda and America’s past, has recently embraced Jackson as a kindred spirit. To Jackson’s supporters, and perhaps to himself, he was a no-nonsense, messiah-like outsider who would cleanse the capital of corruption and lead the U.S. to its ‘manifest destiny’ of dominating North America. To Jackson’s opponents and victims, he was a crass bully who violently imposed his beliefs on his political opponents, African Americans, American Indians, and Spanish colonists, whom he insulted, enslaved, killed, and dispossessed both during the War of 1812 and afterwards as the seventh President of the United States. Unlike Trump, Jackson won the popular vote; his victory left the opposition in tatters and his own party supplicant, enabling him to easily secure a second term and a lasting legacy.

Featured Image credit: Action between U.S. Frigate Constitution and HMS Java, 29 December 1812, Painting in oils by Charles Robert Patterson. Official U.S. Navy Photograph, now in the collections of the National Archives. Public Domain via Wikimedia Commons.

The United States in World War I podcast series
2017 marks the centennial of the United States joining World War I. To commemorate this historic occasion, Oxford University Press put together a podcast series discussing various aspects of America’s involvement in the war. Each episode is hosted by Thomas Zeiler, editor of Beyond 1917: The United States and the Global Legacies of the Great War, and features a wide variety of Oxford authors including Christopher Capozzola, David Ekbladh, Andrew Huebner, David M. Kennedy, Susan R. Grayzel, Pellom McDaniels III, Tammy Proctor, Benjamin Coates, Julia F. Irwin, Charlie Laderman, Erez Manela, and Michael Neiberg. We look forward to sharing these discussions with you.

Episode One: Domestic Politics and WWI
Episode Two: Gender and Race in WWI
Episode Three: U.S. Involvement Overseas During WWI
Headline image credit: RMS Lusitania coming into port, possibly in New York, 1907-13 by George Grantham Bain. Public Domain via Wikimedia Commons.
Pride 2017: a reading list
Happy Pride Month from the OUP Philosophy team! To celebrate LGBT Pride 2017, happening in cities across the world, including the New York City and London Prides this summer, OUP Philosophy is shining a spotlight on books that explore issues in LGBTQ rights and culture. Our selection is diverse and multidisciplinary, covering philosophy, politics, history, biography, linguistics, and cultural studies, with titles to interest every audience.

Philosophy, Politics and History

Debating Religious Liberty and Discrimination
John Corvino, Ryan T. Anderson, and Sherif Girgis
Can religious liberty justify the right to discriminate against others? How can society strike a balance between building a positive and fair society and respecting conscience and belief? This point-counterpoint book brings together the leading voices of John Corvino, a longtime LGBT-rights advocate, opposite Ryan T. Anderson and Sherif Girgis, prominent young defenders of the traditional view of marriage. It thus provides an overview of the main issues in cases concerning religious liberty. Much of the debate concerns same-sex marriage: may a county clerk refuse to authorise a same-sex marriage, or may bakers, photographers, and printers refuse to provide same-sex wedding services? Should such people be exempted from anti-discrimination laws? How far can religious liberty extend before it becomes unlawful discrimination? Moving beyond the LGBT rights debate, the book also addresses wider questions about religious belief, such as the value of religion, the role of government, and the challenges of living in a diverse and free society.

Debating Same-Sex Marriage
John Corvino and Maggie Gallagher
This book takes up the ever-important and contentious issue of same-sex marriage in the form of a debate between John Corvino and Maggie Gallagher. Corvino, a philosopher and writer well known for his work on LGBT equality, argues eloquently and persuasively that allowing same-sex marriage is good both for couples and for society at large, because society has an interest in supporting loving and committed relationships among all its citizens; marriage need not be exclusively between a man and a woman. Gallagher, on the other hand, takes the traditional view that marriage is the union of a husband and wife, ensuring the natural link between father, mother, and children.

Listen, We Need To Talk: How to Change Attitudes about LGBT Rights?
Brian F. Harrison and Melissa R. Michelson
This book examines how support for same-sex marriage among Americans grew sharply between 1988 and 2016, from 11% to a 60% majority. The authors show that by priming common social identities or focusing on shared interests, opponents who hold strong views on contentious issues such as same-sex marriage can become more receptive to listening and to changing their attitudes.

Bodies of Evidence: The Practice of Queer Oral History
Nan Alamilla Boyd and Horacio N. Roque Ramírez
This book provides insight into the methodological practices of queer oral historians and examines themes such as desire, sexuality and gender, sexual self-disclosure, and voyeurism in documenting LGBTQ lives and experience. It presents raw transcribed interviews followed by commentary and analysis. It also surveys the historiography of 1950s and ’60s lesbian bar culture; social life after the Cuban revolution; the organization of transvestite social clubs in the US midwest in the 1960s; Australian gay liberation activism in the 1970s; San Francisco electoral politics and the career of Harvey Milk; Asian American community organizing in pre-AIDS Los Angeles; lesbian feminist “sex war” cultural politics; 1980s and ’90s Latina/o transgender community memory and activism in San Francisco; and the wars in Iraq and Afghanistan.

You’ve Changed: Sex Reassignment and Personal Identity
Part of the OUP Studies in Feminist Philosophy series, this book looks at the transgender experience from philosophical and conceptual perspectives, asking whether we have a true sex, whether sex and gender are alterable characteristics, and whether the old self disappears when a person’s sex assignment changes. It is a rich and thoughtful collection of essays exploring personal identity from various standpoints: queer theory, gender and sexuality, feminism, disability, and science studies.

Jane Crow: The Life of Pauli Murray
Rosalind Rosenberg
In this sensitive and thoughtful biography, Rosalind Rosenberg does justice to Pauli Murray, a key figure in the civil rights and women’s movements. Murray was a remarkable woman who overcame obstacle after obstacle and fought valiantly against prejudice throughout her life. Coming from an unhappy family, she earned a college degree in New York City but was rejected for graduate study at the University of North Carolina because of her mixed-race heritage. After graduating at the top of her class from Howard Law School, she was rejected by Harvard University on account of her sex. Undaunted, she carved out a successful career in law, in particular through anti-discrimination work that helped end school segregation and shaped landmark cases extending constitutional protections against discrimination to women and other minority groups. Murray considered herself queer in terms of sexuality and believed she was male by birth; in today’s terms she might be identified as transgender.

This fascinating biography follows Alan Turing, the mathematical genius and WWII codebreaker who was prosecuted for being gay and chemically castrated. It is a must-read for anyone interested in the life of one of the most accomplished scientists of the twentieth century.

Charity and Sylvia: A Same-Sex Marriage in Early America
Rachel Hope Cleves
Through primary sources such as diaries, letters, and poetry, Rachel Hope Cleves tells the compelling and extraordinary history of two women who were in love with each other and took on the roles of husband and wife in nineteenth-century Vermont, overturning social conventions. They were also philanthropists, revered by their local community. As the book reveals, same-sex marriage is not a 21st-century innovation; it originated in America much earlier than we imagine.

For anyone interested in linguistic and cultural studies, this insightful title examines how language is used in various gay male subcultures (drag queens, radical faeries, bears, circuit boys, barebackers, and leathermen) to construct gay male sexual identities and desires. For example, the word ‘bear’, adopted in gay culture in the late 1980s to classify certain types of gay men (heavyset and hairy), appropriates linguistic stereotypes of Southern masculinity, while the term ‘leatherman’ signifies militaristic masculinity, patriotism, and BDSM sexual practice.

How does rock culture, with its music, performance, fashion, and imagery, provide platforms for artists and consumers to experiment with gender and alternative identities? This collection brings together perspectives from queer and feminist studies, performance studies, cultural studies, and linguistics.

The author asks how queer dance shifts choreographic practice and challenges gender binaries and normative conventions, and explores how queer dance relates to gender, identity, the body, and the realm of affect and touch.

The Nixon tapes and Donald Trump
Since President Trump’s inauguration, and even before, there have been countless comparisons between the 37th and 45th presidents of the United States. Some of the comparisons make sense, while others do not.

For this reason, when I was called upon to ask a question at the 16 May 2017 CNN town hall debate between Governor John Kasich and Senator Bernie Sanders, I chose to ask about Richard Nixon and Donald Trump. For more than a decade now, I have been transcribing the secret White House tapes of President Richard Nixon. To date, I have transcribed more than anyone else, and have made the audio available as a free public service online.

(My question was never really answered.)

Perhaps the most important similarity between Nixon and Trump is that each trusts no one more than he trusts himself. That led Nixon to secretly record more than three thousand hours of his conversations and telephone calls, so that he alone had the complete record of what was said in his presence. I suspect that if Nixon’s tapes had not played a starring role in his downfall, he would have retired to California as planned and written his Churchillian multi-volume memoirs. Or, if he had not taped at all, odds are he would have completed his second term. It was not until the revelation that he had taped that we could satisfy Senator Howard Baker Jr.’s admonition to discover “what did the president know and when did he know it.” Taping was the key; the tapes showed what Nixon knew and when he knew it.

In the case of Trump, discussion is now swirling over whether he taped, and whether he is still taping. I, for one, hope he is. In fact, the historian in me wishes that all presidents did – not voluntarily, but by statute. Wouldn’t it be great if we could one day know the truth? Even if the tapes were sealed for 50 years, or until everyone who was recorded is dead? And if the president acted up, taped conversations could be the ultimate act of accountability to the people.

As long as Trump avoids his ‘Watergate’, his tapes will remain secret. Of course, that is if he is disciplined enough not to share them when it suits his interests. That could be a big if. The Trump Tapes would then be processed some decades from now for eventual public release by the National Archives and Records Administration according to the Presidential Records Act.

On the other hand, if Trump has tapes, or is taping, doing so would be a high stakes gamble. As Nixon said to David Frost in 1977, “I gave them a sword, and they stuck it in and twisted it with relish.” Nixon’s tapes were the sword, a sword that could be used against Trump, too, if he is not careful. Nixon and Trump could end up having even more in common.

When Nixon’s White House staff discovered they had been secretly taped, their reactions varied. Some believed the president was entitled to a verbatim record of their communications, even if it meant being secretly recorded. Others felt betrayed. Still others went on to have political careers, never fully independent of their old boss. With each new opening of Nixon tapes – approximately 700 hours have yet to be released – the same old fears return. “What did I say? What did he say about me?”

For a president, the appeal of taping is multifaceted. It is the chance to nudge history, to shape it. A president dares not leave his historical legacy in the hands of those who opposed him, or even to benign neglect. Also, as a general rule, a president cannot expect to be treated fairly during his lifetime. Tapes can be very helpful to overcome this – a way of leaving your unfiltered thoughts to future generations.

Every president desires to create some degree of mythology about himself, his words, and his actions. Presidents do this in a number of ways. They invite selected historians in, on their terms, to record what they see and hear. Although the group and the moment must be chosen carefully, a president must not be afraid of the fact that the historians might not all be friendly to him. The president will still control the rules of engagement, and can invite them to see and hear things that serve his purpose.

Presidents have also established a council of historians, whether formally or informally. President Obama did this, and recent historians have called for the role and membership of this group to be more formalized. Meetings can be formal or informal, and as frequent or infrequent as the president desires. Going beyond the group, a president sometimes takes a historian into their confidence, and uses them to nudge history. Leaders have been doing this since at least the time of Alexander the Great.

If there is one thing President Trump seems to appreciate, it is that it is better to be hated than forgotten. But he needs some mythology. The Kennedy White House created its own with Camelot. Richard Nixon, despite the fact that many of his advisors recommended that he burn his tapes, remains an active topic of debate – arguably more so than any other modern president. Even President Reagan’s premature departure from public life, due to Alzheimer’s during his final decade, helped create the Reagan mythology that is with us today.

I do not think Trump will ever be universally loved, but that does not bother him as much as we think. A president can still be consequential, significant, momentous, and history-making without being liked. But Trump will need a little mythology. Taping could help.

Just don’t get caught.

Featured image credit: President Nixon, with edited transcripts of Nixon White House Tape conversations during broadcast of his address to the Nation by National Archives & Records Administration. Public domain via Wikimedia Commons.

The history of American burlesque [timeline]
Burlesque is an exotic dance style that draws on theatrical and often comedic performance elements. First introduced by a visiting British dance troupe in the 1860s, burlesque took off in America even as its popularity dwindled in England. The American style of burlesque evolved and spread across the country, with an increased emphasis on the exotic elements that had been more subtle in British performances. The following timeline highlights key moments in the history of American burlesque.

Featured image provided by Matilda Temperley. Please do not re-use without permission.

The imaginative power and feminism of Harriet Prescott Spofford
“A Flaming Fire Lily Among the Pale Blossoms of New England” poignantly points to the paradoxical nature behind the imaginative power of notable American author Harriet Prescott Spofford. No, she is no longer with us today, having produced most of her work from the mid nineteenth to the early twentieth centuries, but those of us lucky enough to have encountered her tales of romance and the supernatural can only conclude that Gothic genius came much earlier than Anne Rice and Joyce Carol Oates. During her lifetime, Spofford published continuously in periodicals, offering short stories, serialized novels, poetry, and articles for adults and children. Despite her long career (over 60 years) and impressive list of publications, Spofford has been neglected by critics, and only recently has she been resurrected from the footnote. To overlook this writer—who challenged stereotypical depictions of women, blended the colors of romance with the realities of her New England environment, and introduced us to the very first female-authored serial detective, a Mr. Furbush—is to shortchange our literary history.

Born in Calais, Maine, in 1835, Spofford moved with her parents in 1849 to Newburyport, Massachusetts, which would be her eventual resting place. She attended the Putnam Free School in Newburyport and Pinkerton Academy in Derry, New Hampshire, both prestigious schools that gave her a very high level of formal schooling, quite a feat for a woman of her time. At the age of 16, she won a prize for the best essay on Hamlet, which drew the attention of the well-known editor Thomas Wentworth Higginson, who soon became her friend and gave her counsel and encouragement. Economic hardship, however, dating back to Spofford’s ship-owning grandfather, who lost his fortune in the War of 1812, propelled her toward writing for money, especially when her father returned from a Western prospecting trip with little to show for it. Spofford began writing short stories to augment her father’s meager income from running a boarding house, publishing anonymous pulp fiction in various Boston papers, sometimes laboring fifteen hours a day and making as little as $2.50 a story. One day in 1859, however, she caught a break. She sent “In a Cellar,” a dark story about Parisian life, to the venerable Atlantic Monthly, which was edited by the well-known James Russell Lowell. Believing the story to be a translation, he at first rejected it, but after being assured by Higginson of its originality—“I had to be called in to satisfy them that a demure little Yankee girl could have written it”—he sent Spofford a check for $100, commending her work. From then on, her stories were gobbled up by a reading public that secretly enjoyed the many tales of the supernatural she penned.

Although Spofford was often held up as a model of behavior for other “lady authors,” she was also often criticized for excessive description that lacked realistic precision. Her fiction had very little in common with what was regarded as representative of New England life. Her Gothic romances were set apart by luxuriant description: she had a gift for creating atmosphere, using setting and object to capture character, and playing with the passionate and often amoral aspects of human behavior. And yet she was an artist of Romance living in the age of literary Realism. Her dedication to descriptions and fancies drew scathing criticism from male contemporaries who felt she needed to rein in her emotions and write with a more realistic temperament.

So, she did what any God-fearing early feminist would do—she continued to write and publish more Gothic fiction. In fact, Emily Dickinson once wrote to Higginson on 25 April 1862, “I read Miss Prescott’s ‘Circumstance,’ but it followed me in the Dark—so I avoided her.” To her sister-in-law, Sue Gilbert, Dickinson conveyed, “Sue, it is the only thing I ever read in my life that I didn’t think I could have imagined myself,” and despite her comment to Higginson, she begged Sue to “send me everything she writes.” This story, based on an incident that actually happened to Spofford’s maternal great-grandmother, describes a woman who spends an entire winter night pinned motionless in a tree by a panther, known as the “Indian Devil.” The heroine must continue to sing, hoping to lull the beast to sleep so that she can escape. She eventually does (after her husband finds her and shoots the beast) only to return to a home burned by Native Americans.

There have been calls of late for critical attention to Spofford’s long-neglected works, and most of these calls have cited “Circumstance” (1860), a story concerned with issues of women’s voice. More recently, editors have called attention to “Her Story” (1872), a tale of rivalry between two women, both unnamed, one the wife of a wealthy minister and the other his ward. The wife narrates her story from the madhouse to which she has been committed – “bur[ied] . . . alive” in “this grave” by her husband Spencer. “Her Story” in many ways anticipates the themes and critical interests of Charlotte Perkins Gilman’s “The Yellow Wallpaper” (published two decades later). Like “The Yellow Wallpaper,” “Her Story” offers a critique of the social conventions that oppress women, but unlike Gilman’s later tale, it is not really concerned with the divisions between men and women so much as with the origins and significance of divisions among women. Specifically, it explores the way the oppositions common to male- and female-authored fictions (blond-passive-chaste; dark-aggressive-sexual) serve to divide women from each other. Spofford’s adaptation of the Gothic mode demonstrates how the concept of the “other woman” objectifies the status of woman as “other” in the service of intra-gender warfare. For all their apparent differences, the two rivals are shown to be co-conspirators of a sort, both equally subject to a male authority continuously perpetuated by the myths of our culture.

Indeed, Spofford’s work did not always deal with feminist concerns; following her love of the Gothic, she explored detective fiction as well. In fact, Harriet Spofford is the first woman writer to introduce us to a serial detective in crime fiction, anticipating later detectives created by such authors as Metta Victor and Anna Katharine Green. “In a Cellar” presents a quasi-detective—“[i]t is not often that I act as a detective”—who would be given full form in “Mr. Furbush” (1865), where he works with the New York police and is “a man of genteel proclivities, fond of fancy parties and the haut ton, curious in fine women and aristocratic defaulters and peculators.” He reappears in “In the Maguerriwock” (1868), where he works as a private detective. There are many more stories to be found, perhaps leaving us to wonder if Mr. Furbush does somehow live on in the stacks of old libraries.

The works collected in A Scarlet Poppy (1894) are light satire, quite different from the more somber collection Old Madame and Other Tragedies (1900). Spofford’s days in Washington with her husband provide the basis for the sentimental stories in Old Washington (1906), and her final collection, often considered her best, The Elder’s People (1920), returns to New England. These stories reflect the dry humor, realities, believable dialect, and even restraint of New England life, all of which earned her high praise. Yet the women in these narratives remain as strong as ever. They face the realistic necessities of life, live with the limited perceptions of their men, and triumph through the art they create. Spofford always finds the cerulean and azure threads woven into the browns and grays of New England life and the women she knew so well.

Image credit: Portrait drawing of Harriet Prescott Spofford, published in Appletons’ Cyclopædia of American Biography, v. 5, 1900, p. 633, by Jacques Reich. Public Domain via Wikimedia Commons.

Loving and before
This year marks the 50th anniversary of the Supreme Court case that ruled prohibitions on interracial marriage unconstitutional. The decision, and the brave couple, Richard and Mildred Loving, who challenged the Virginia statute denying their union because he was deemed a white man and she a black woman, deserve celebration. The couple had grown up together in a small rural town where racial tensions and segregation persisted but were softened by familiarity. As adults, Richard and Mildred fell in love and chose to formalize their relationship. They took a trip to nearby Washington, DC, where they secured a marriage license. However, Virginia was one of 16 states, mostly in the American South, that held firm to anti-miscegenation statutes, and soon after the couple returned, an overly enthusiastic sheriff barged into their bedroom in the middle of the night and arrested them. After an uncomfortable stay in jail—a then-pregnant Mildred was detained longer than Richard—the couple were released. Ordered to depart the state for 25 years, the Lovings reluctantly relocated to Washington, DC, where they would raise their children. Mildred, in particular, lamented city life and wished to return to rural Virginia. As much as their longing for home and family, it was the arguments and energy of the era’s black freedom struggle, and their faith in the rightness of their course, that persuaded the couple to seek the legal support of the ACLU and file a lawsuit. In 1967, a unanimous court ruled in favor of the Lovings, determining that anti-miscegenation statutes violated the equal protection clause of the 14th amendment.

The victory marked the end of anti-miscegenation statutes that had proliferated and persisted in the United States because Americans regularly romanced across color lines, and those who depended upon those lines to protect their authority worked feverishly to reinforce them wherever and whenever possible. Within their respective North American empires, the Spanish and French had selectively discouraged certain types of marriages, but it was the British who most disdained and would actively restrict interracial marriages. This antipathy reflected both the settler nature of the British colonies—the British sought to populate the land with families who would clear forests, build farms, and facilitate trade—and a desire to protect racial slavery. The colony of Maryland debuted the first anti-miscegenation statute in 1684, banning marriages between free English women (presumed to be white) and black slaves. Seven years later, neighboring Virginia passed a more punitive and comprehensive limitation, criminalizing marriages between white women or men and blacks, mulattos, and Indians. Violators would suffer banishment or removal from the dominion. All southern and many northern colonies, including Pennsylvania and Massachusetts, would follow suit. Soon after the American Revolution, the movement to abolish slavery would prompt northern states to repeal anti-miscegenation statutes, but southern states, which worried about the collapse of segregation and white supremacy after the Civil War, recommitted to them.

As the United States added western territory by force and negotiation through the 19th century, Americans and anti-miscegenation statutes moved west, too. Refusing to legally recognize love between races proved an integral part of confirming the American conquest and incorporating new land. Thus, western states prevented white Americans from marrying not only African Americans and Native Americans, as had their counterparts in the Midwest and on the East Coast, but also those of Chinese, Japanese, Korean, and Filipino descent. The maturation and popularization of pseudo-scientific ideas about racial divisions within the human population at the end of the 19th and beginning of the 20th centuries helped justify this violation of civil rights.

However, American couples regularly defied or circumvented the laws. Indeed, the triumph of the Lovings built upon the struggles of many other interracial couples who similarly formed intimate partnerships and defied the idea that individuals could be classified and divided by something as capricious as race. Among the most prominent of these forerunners were Andrea Perez and Sylvester Davis, who met in 1940s Los Angeles. Sylvester, the son of black migrants from the South, and Andrea, the daughter of Mexican immigrants, took an immediate liking to one another. Unfortunately, as Andrea remembered, her father did not share her affection for Sylvester, worrying about the social and economic consequences of his daughter dating a black man in the United States. Although separated during World War II (Andrea worked at a local shipbuilding operation while Sylvester served in the United States Army), the couple rekindled their relationship and decided to marry after the war’s end. Soon after their marriage ceremony in a Catholic church, which honored their union, they asked for the legal help of Daniel Marshall, a leader of the Catholic Interracial Council. Like other individuals of Mexican descent, Andrea enjoyed the legal status of white, if not always the lived privileges of whiteness, and her marriage thus violated California’s ban on marriages between whites and African or Asian Americans. Changing ideas about race, accelerated by the era’s democratic rhetoric and the discovery of Nazi atrocities, likely shaped the court’s interpretation. In 1948, California became the first state whose highest court struck down an anti-miscegenation statute.

Without the particular love story of Richard and Mildred and the battle they waged, a formal barrier to equality would have stood longer. Yet we should also remember that the right to marital freedom was asserted by so many Americans. These men and women chose to love whom they loved despite state restrictions and thus became unexpected agitators in the long contest for equal rights in the United States.

Featured Image credit: US Supreme Court building in 2011. Picture by Architect of the Capitol, Public Domain via Wikimedia Commons.

https://blog.oup.com/2017/06/net-neutrality-information-crossroads/Net neutrality and the new information crossroads
Fri, 02 Jun 2017 08:30:47 +0000
Despite rapidly expanding collections, the nation’s information is at risk. As more of it comes in digitized form and less in printed or spoken formats, it can be corralled and viewed more easily by groups or institutions concerned only with their own interests.

Despite rapidly expanding collections, the nation’s information is at risk. As more of it comes in digitized form and less in printed or spoken formats, it can be corralled and viewed more easily by groups or institutions concerned only with their own interests. Software makes it easier both to charge for access and to control the flow of information. Today, lobbyists can influence laws and regulations to an extent not seen since the wild political days of the Gilded Age. Regardless of where information physically sits, if it is in digital form it can easily be changed, added to, or constrained. That capability never existed with printed materials or shared conversations, which were always dispersed across the vast North American continent. The benefits of having so much information in digital formats are enormous and will not be denied, but as a nation Americans need to recognize that they are confronting a new world requiring informed action.

Today’s information crossroad is about the future of the Internet. It is the base upon which so much of the information people create and use will continue to rest. The massive deployment of sensors communicating with one another, with people, with computers, and with other devices, already under way, will add another layer: what is cutely called the Internet of Things (IoT). There are many issues before the American public, but the central ones concern security, the privacy of information, and the extent to which the Internet should be an “Information Highway” on which everyone can travel.

But why involve the public? Google knows what it wants to do. The Federal Communications Commission has a solid record of making information accessible as each new technology has come along. From time to time, Congress updates copyright and patent laws to account for new technologies, most notably software. Even the slow-moving federal court system has supported the free flow of information, and every new information-handling technology seems to end up generating litigation for it to adjudicate. Most recently, in 2014, the U.S. Supreme Court ruled that police could not search data on a private cell phone without a warrant. Academics in many disciplines are extensively engaged in the broader conversation, producing a flood of books and articles that is difficult to keep up with. So why not let the experts deal with these crossroad issues in their own ways?

The problem with relying solely on these various groups is that the issues before Americans involve 100 percent of the public and (for those roughly over the age of 7) activities they perform many times each day. These issues concern the fundamental right to access and use information, essentially the same right the public confronted in the last quarter of the eighteenth century when it set up the United States. The central issues involving the Internet—data security, privacy, and net neutrality—have now been before the public for over two decades, but Americans have not engaged with them as fully as they should. They ought to do so now, for these issues will need to be addressed head-on over the next several years. Recent evidence of the Trump administration putting them at risk marks an intensification of the debate rather than the introduction of a new conversation. But that intensification teaches us that to ignore these issues is to invite a moment when it may be too late to reverse course. There is always someone who wants to deny you the freedom to create, collect, and use information. Today is no different from 1917 or 1817.

Americans could well see information denied to them; they will probably see a sharp increase in what they are charged for access; and they may become sufficiently misinformed as to threaten, or possibly weaken, their democratic way of life. That is why everyone needs to form opinions on these three issues and to take individual and collective action in line with their conclusions. Crowdsourcing technologies combined with social media demonstrate that individuals can influence the course of events to a far greater extent than in earlier times. That was a lesson of the Arab Spring of 2011, which has frequently been called the Twitter Revolution or the Facebook War.

American history is full of examples of threats to data security, privacy, and—the newest, at a quarter century old—net neutrality. The security of banking and medical records stored in computers has been the subject of controversy, vigilance, and protections granted and denied since the 1960s. After a period of relative calm in the early 2000s, we are now going through a period of unprecedented attacks on data security through hacking. Privacy concerns have been with us since the dawn of the United States; even in the Constitution, public leaders had to include language protecting American homes and papers from invasion. Net neutrality—the idea that everyone should have equal access to the Internet—is another manifestation of ideas such as freedom of the press, access to newspapers and books, and the ability to send mail to anyone, uninhibited by special prices, censorship, or prejudice against targeted groups. The history of American information suggests that openness and access have to be constantly earned and, in fact, re-earned. The historical record is full of ugly fights, but it holds more successes than failures. It also tells us that we cannot leave these questions to experts or officials. American society as a whole must engage in the debate, take sides, and vote for its positions.

https://blog.oup.com/2017/05/fundamentalism-us-mexico-border/Law and order fundamentalism and the US-Mexico border
Wed, 31 May 2017 08:30:57 +0000
Today, the United States is experiencing a surge of law and order fundamentalism in the US-Mexico borderland. As it pertains to the international divide, law and order fundamentalism as a political ideology has a long genealogy that stretches back to the late nineteenth century. It is grounded in anti-Mexicanism as well as the abiding conviction that the border is inherently dangerous and “needs” to be policed.

The history of violence is distinct from the history of crime. This is partly because violent acts are not typically classified as crimes when they are committed by the state, and because many individuals who are categorized as criminals have not committed violent acts. When we think about the US-Mexico border in light of these crucial distinctions, it becomes clear that the way policing agencies and politicians have portrayed the international divide—as a criminal zone—does not match the history of violence in the borderland. The majority of the bloodshed, incarceration, and forced expulsion has been triggered or perpetrated by the state itself, directed against some of the most politically weak, economically vulnerable, and historically excluded people on the continent.

Today, the United States is experiencing a surge of law and order fundamentalism in the US-Mexico borderland. As it pertains to the international divide, law and order fundamentalism as a political ideology has a long genealogy that stretches back to the late nineteenth century. It is grounded in anti-Mexicanism as well as the abiding conviction that the border is inherently dangerous and “needs” to be policed. In recent years, these ideas have been rapidly amplified. Law and order fundamentalism is rooted in the allure of security for all. Precisely because the fantasy of perfect safety has such a deep human appeal, it is easily captured by the very few and converted into a tool to dominate the many. The borderland is a particularly ripe target in this regard because it is tied up with notions of territorial sovereignty, nationalism, and patriotism, each of which is freighted with notions of purity, separatism, and exceptionalism. Law and order fundamentalism is uninterested in the broader social good or in any sort of social contract. To the contrary, it eliminates the very concept of unjust laws, and thereby forecloses a broader debate about equality and humanitarianism. It invests the US code with an absolute authority, as if immigration laws and drug prohibition regimes had been on the books since time immemorial.

Original Treaty of Guadalupe Hidalgo: last page of the Treaty, with signatures and seals. From the Nicholas Philip Trist Papers, via the Hispanic Reading Room, Library of Congress. Public domain via Wikimedia Commons.

Belief in such authority reflects a complete indifference to social and historical context. Immigration laws, for instance, have been radically overhauled many times since the US-Mexico War. During the first half of the twentieth century, there were no quotas on immigration from Mexico, yet immigration from China was almost completely banned. Similarly, anti-drug laws follow a meandering and inconsistent logic, as demonstrated by the fact that alcohol was once criminalized and cocaine once legal. Nevertheless, law and order fundamentalists insist that there is a “right” way to immigrate and comport oneself within society, despite the fact that the legal landscape of the borderland has been remarkably unstable.

This inflexibility connects law and order fundamentalism to the most illiberal and undemocratic traditions in American history. It equates illegality and immorality, thereby eroding the rights of those accused of crimes and upending an “innocent until proven guilty” paradigm. It also casts aspersions on entire groups that are associated with crime, even as it disregards the extent to which some groups are policed far more heavily than others. It obsesses over the infractions of the poor, but ignores the crimes of the rich. And although it draws its authority from anecdotal acts of violence, it leads border police overwhelmingly to sweep up nonviolent offenders. More than any other law enforcement tradition in the United States, border policing is connected to profiteering. This is another way in which it is divorced from the public good. Especially since the end of the Cold War, weapons manufacturers have found lucrative markets in selling high-tech surveillance equipment to border policing agencies, and in recent years private prison corporations have reaped enormous profits from incarcerating non-citizens, including asylum seekers.

These phenomena are highly resistant to critique because failure is politically ambiguous in border policing. To law and order fundamentalists, increases in unauthorized border crossing or bigger seizures of drugs fail to indicate that criminal justice responses do not work, or that non-coercive solutions to social problems should be sought. Rather, border problems trigger greater funding and expansion of police, guided by no consistent metric of what constitutes success. This way of developing policy is uninterested in structural, root causes, but instead understands “crime” simplistically and intuitively as the result of lax enforcement.

The border is now the crown jewel of law and order fundamentalism in the United States. It is also the site where the liberal democratic traditions of transparency, government accountability, due process, equal protection, nondiscrimination, and the right to privacy are under the greatest threat. It remains to be seen whether the political ideology responsible for the massive police buildup in the borderland will seep inward and permeate other aspects of American life, or whether the police apparatus on the border will come to be understood as too extreme.

Featured image credit: US border patrol car on the US-Mexico border by Steve Hillebrand, U.S. Fish and Wildlife Service. Public Domain via Wikimedia Commons.
