by Sheldon M. Stern

HNN October 24, 2014

For many weeks, on television, in newspapers and magazines, and especially on the Internet, there has been a steady drumbeat of fear, panic, paranoia, misinformation, and ignorance about an Ebola “outbreak” in the United States. Perhaps most regrettable has been the effort to politicize the illness and blame the “outbreak” on President Obama. Some have gleefully called the virus “Obola”; others have even charged that the administration deliberately introduced this African disease as a covert form of reparations, to punish white America for the crime of slavery. There have also been much more serious allegations that the government is covering up the frightening truth that Ebola is transmitted through the air as well as through contact with a patient’s bodily fluids. This view has been espoused by Senator Rand Paul, who told college students that “This thing is incredibly contagious” and that the Obama administration “has downplayed how transmissible it is,” and echoed by political analyst George Will, who warned ominously against believing the medical and scientific experts. New Hampshire Senate candidate Scott Brown insists that an Ebola epidemic is coming, carried by arrivals from Africa and even by illegal immigrants with ties to ISIS crossing the Mexican border. It is common sense, he contends, for the public to reject the views of so-called experts.

Of course, there is no Ebola “outbreak” in the United States. Thus far three people have been diagnosed with Ebola in Texas and one in New York. There has been just one fatality, Thomas Eric Duncan, a Liberian national visiting his family in Dallas. The two nurses who came into direct contact with his bodily fluids have recovered. The lax medical protocols in Dallas that contributed to the illness of the two nurses have already been tightened so that the New York case is very unlikely to lead to additional infections.

Meanwhile, during the same period, some seven hundred American children and adolescents in 45 states and D.C. have been diagnosed with the Enterovirus, and at least two have died. That virus is easily transmitted, much like the common cold or the flu. As the director of the CDC’s National Center for Immunization and Respiratory Diseases put it, “When people are anxious about a threat like Ebola, it doesn’t necessarily matter if they look at numbers, facts and probabilities. Because of the way our brains work, something rare and exotic is much scarier than something that’s familiar.”

Is it just “the way our brains work,” or is something else at work here that has deep roots in the American past? Duncan’s family, with whom he stayed for several days after becoming ill, has now gone through the 21-day quarantine period. None contracted Ebola. But an online news site immediately questioned whether the incubation period might really be longer than 21 days, and another proclaimed that Duncan’s family would still be burdened with the stigma of having been exposed to Ebola—a claim notably absent in the cases of the three white American doctors who contracted the disease in Africa but were successfully treated after returning to the United States.

Anyone familiar with America’s racial past will detect some disturbing parallels in these irrational if not hysterical fears: bringing to mind, for example, the 17th-century conviction that blackness was a physical and moral curse that consigned Africans to the status of diseased outcasts who must remain permanently under white control; the 19th-century defense of slavery as a “positive good” for blacks and whites alike; the popular early 20th-century “scientific” literature which proclaimed “The Negro, a Beast”; and the defense of segregation as the last hope for preserving a pure and separate white America.

The response to the thus far much more serious Enterovirus has generally been muted and reasonable. But what if that virus, rather than Ebola, had been brought to the United States from Africa? Would it now be politically exploited as the Onterovirus and linked to fears of a conspiracy at the highest levels of the government? If you look at the photos of Thomas Eric Duncan’s family in Texas, what you see is a group of poor, scared, and isolated people who are learning that the stigma associated with Ebola is, at least for them, irreversible, in part because it is embedded in attitudes long associated with blackness itself in our nation’s history.

Sheldon M. Stern is the author of numerous articles and Averting ‘the Final Failure’: John F. Kennedy and the Secret Cuban Missile Crisis Meetings (2003), The Week the World Stood Still: Inside the Secret Cuban Missile Crisis (2005), and The Cuban Missile Crisis in American Memory: Myths vs. Reality (2012), in the Stanford University Press Nuclear Age Series. He received his Ph.D. from Harvard in 1970 and was historian at the JFK Library in Boston from 1977 to 2000.

Phantom Menace: The Myth of American Isolationism

In an op-ed last year in The Washington Post, former Sens. Joe Lieberman and Jon Kyl warned of “the danger of repeating the cycle of American isolationism.” That summer, Post columnist Charles Krauthammer heralded “the return of the most venerable strain of conservative foreign policy: isolationism.”

What makes these warnings odd is that in contemporary foreign policy discourse, isolationism—as the dictionary defines it—does not exist. Calling your opponent an “isolationist” serves the same function in foreign policy that calling her a “socialist” serves in domestic policy. While the term itself is nebulous, it evokes a frightening past, and thus vilifies opposing arguments without actually rebutting them. For hawks eager to discredit any serious critique of America’s military interventions in the “war on terror,” that’s very useful indeed.

TO GRASP HOW little basis today’s attacks on “isolationism” have in reality, it’s worth understanding what the term “isolationism” actually means. Merriam-Webster defines it as “the belief that a country should not be involved with other countries.” The Oxford dictionaries call it “a policy of remaining apart from the affairs or interests of … other countries.”

When critics decry isolationism today, they usually map that dictionary definition onto a particular historical period: the 1920s and 1930s. Warnings about isolationism almost always come with the same historical morality tale: America turned inward in the interwar years, and the world went to hell. That’s what makes “isolationism” scary. Like “socialism,” it’s a euphemism for “Hitler and Stalin are coming.”

The problem is that isolationism—as commonly understood—not only doesn’t fit American foreign policy today, it doesn’t even fit American foreign policy in the 1920s and 1930s. There are plenty of valid critiques of how the United States comported itself on the world stage between World War I and World War II. But the claim that America detached itself from other countries is simply not true. In 1921, for instance, President Harding summoned the world’s powers to the Washington Naval Conference and pushed through what some have called the first disarmament treaty in history. In 1924, after Germany’s failure to pay its war reparations led French and Belgian troops to occupy the Ruhr Valley, the Coolidge administration ended the crisis by appointing banker Charles Dawes to design a new reparations-payments system, which Washington muscled the European powers into accepting. American pressure helped to produce the 1925 Treaty of Locarno, which guaranteed the borders between Germany and the countries to its west (though not, fatefully, to its east). In 1930, President Hoover played a key role in the London Naval Conference, which placed further limits on naval construction.

Dr. Seuss drew many anti-isolationism cartoons during the early 1940s. (PM Magazine/Dr. Seuss)

Again and again during the interwar years, the U.S. deployed its newfound economic power to shape politics in Europe. And this overseas engagement wasn’t limited to America’s government alone. Although the United States severely limited European immigration in the 1920s, Americans built the avowedly internationalist institutions that would help guide the country’s foreign policy after World War II. The Council on Foreign Relations was born in 1921. The University of Chicago created America’s first graduate program in international affairs in 1928. And during the interwar years, American travel to Europe expanded dramatically. To be sure, the U.S. in the interwar years was more comfortable intervening economically and diplomatically than militarily. But despite the Neutrality Acts meant to keep the U.S. out of another European war, the Roosevelt administration began sending warplanes and warships to Britain two years before Pearl Harbor. By early 1941, long before America officially entered the war, its ships were already hunting German vessels across the Atlantic.

The only sense in which the United States in the interwar years truly remained apart from other nations lay in its refusal to make binding military commitments, either via the League of Nations or through alliances with particular nations. America wielded power economically, diplomatically, and even militarily, but it jealously guarded its sovereignty. That’s why one influential history of the era dubs U.S. foreign policy between the wars “independent internationalism.” (The last prominent spokesperson for that form of independence was Sen. Robert Taft of Ohio, who during the early Cold War opposed NATO because it required that America pledge itself to Europe’s defense, but who endorsed an all-out war with China to reunify Korea under Western control.) The popular “characterization of America as isolationist in the interwar period,” argues Ohio State University’s Bear Braumoeller in a useful review of the academic literature on the period, “is simply wrong.”

IF CALLING AMERICA isolationist in the 1920s and 1930s is wrong, calling America isolationist today is absurd. The United States currently stations troops in more than 150 countries. Its alliances commit it to defend large swaths of Europe and Asia against foreign attack. Recent presidents have dropped bombs on, or sent troops to, Kuwait, Iraq, Afghanistan, Bosnia, Kosovo, Somalia, Sudan, Syria, Libya, Pakistan, and Yemen. Last month, President Obama sent 3,000 American troops to battle an Ebola outbreak in West Africa. And while Americans fiercely debate particular military interventions and foreign-aid programs, the general presumption that the United States should play a leading role in solving problems far from our shores is largely uncontested in the American political mainstream.

Just how uncontested becomes clear when you examine the foreign policy evolution of Rand Paul, the man frequently held up as the leader of his party’s isolationist wing. As a Senate candidate in 2009, Paul mused about reducing America’s military bases overseas. In 2011, soon after entering the Senate, he suggested eliminating foreign aid. He has also repeatedly insisted that only Congress, and not the president, can declare war (a position that Barack Obama championed when he was in the Senate as well).

Even these views did not make Paul an isolationist. He has never questioned America’s membership in NATO, for instance, or its security alliance with Japan, the cornerstones of America’s post-World War II global role. But in Paul’s early days on the national political stage, his foreign policy instincts did diverge substantially from the ones that held sway in official Washington.

What has happened since shows just how hegemonic America’s globalist consensus actually is. For starters, Paul’s efforts to dial back American interventionism went nowhere. His Senate bill to end foreign aid to Egypt, Pakistan, and Libya got 10 votes. A later bid to reduce America’s overall aid budget from $30 billion to $5 billion garnered 18 votes. This at a time when, according to Bill Keller, America was in “a deep isolationist mood.”

Moreover, Paul’s own views have become markedly more conventional. After first saying that the U.S. should not “tweak” Russia for its aggression in Ukraine, Paul later called for imposing harsh sanctions on Moscow, reinstalling missile-defense systems in Poland and the Czech Republic, and boycotting the Winter Olympics in Sochi. On ISIS, Paul has followed a similar path. After expressing initial skepticism about the value of air strikes, he now says, “If I had been in President Obama’s shoes, I would have acted more decisively and strongly against ISIS.”

Were Paul really an isolationist, his approach to the Middle East would be straightforward: Extricate America from the region and stop giving its people reasons to hate us. But he has explicitly repudiated that view. “I don’t agree that absent Western occupation, that radical Islam goes quietly into that good night,” he said in a speech last year. “Radical Islam is no fleeting fad but a relentless force.” Paul has even attacked Obama for “disengaging diplomatically in Iraq and the region.”

(PM Magazine/Dr. Seuss)

Instead, over the last year, Paul has developed an approach patterned on the internationalist thinking that influenced foreign policy elites during the Cold War. In a speech last February, Paul said the United States should contain jihadist Islam the way George Kennan envisioned containing Soviet Communism. For Kennan, containment represented an alternative to both isolationism and war. It required buttressing partners that could halt the expansion of Soviet power without trying to roll it back, since that would risk war. Whether one can usefully transfer the concept of containment to the current “war on terror” is questionable. But in invoking Kennan, Paul was expressing a preference for steady, cautious, long-term American engagement in the Middle East—hardly what you’d expect from an isolationist.

Besides containment, Paul’s other watchword is “stability.” “What much of the foreign policy elite fails to grasp is that intervention to topple secular dictators has been the prime source of that chaos,” he said last month. “From Hussein to Assad to Qaddafi, we have the same history. Intervention topples the secular dictator. Chaos ensues, and radical jihadists emerge. … Intervention that destabilizes the region is a mistake.”

Against both liberal interventionists and “neoconservatives” who support intervention to produce more democratic, pro-Western regimes, in other words, Paul wants the United States to support the Arab world’s traditional, comparatively secular autocrats, because at least they keep the region under control. His core argument with hawks such as John McCain and Lindsey Graham is not over whether America should withdraw from the Middle East. It’s over whether America should use its influence there to prop up the old order or usher in something new. That’s why Paul now peppers his speeches with quotes from Colin Powell, Robert Gates, and Dick Cheney circa 1991, policymakers who cut their teeth in the more risk-averse but still undoubtedly internationalist Republican Party of Henry Kissinger and George H.W. Bush. As Jason Zengerle recently pointed out in The New Republic, Paul’s foreign policy has become a fairly standard brand of realism, with some anxiety over unchecked presidential power thrown in.

Critics see this as cynical. Paul, as numerous articles have noted, has grown more hawkish as he’s courted the donors he needs to fund his likely presidential campaign. But the fact that Paul is, by necessity, drawing closer to a foreign policy consensus he once challenged is evidence not of that consensus’s weakness, but of its strength.

THAT CONSENSUS WITHIN the political class is not built upon big-dollar donations alone. There are certainly differences between how party elites want the United States to behave around the world and what ordinary citizens desire. But contrary to much media commentary, isolationism is not only largely absent from foreign policy discourse in Washington. It’s also largely absent from foreign policy discourse among the public at large.

Last December, a poll by the Pew Research Center found that, by 52 percent to 38 percent, Americans wanted the U.S. to “mind its own business internationally,” the largest gap in a half-century. The poll sparked a torrent of journalistic anxiety. “American isolationism,” fretted a Washington Post headline, “just hit a 50-year high.”

But upon closer examination, it becomes clear that Americans don’t actually want their country to “mind its own business” overseas at all. The same Pew poll that supposedly revealed Americans to be isolationists also found that, by a margin of more than 40 percentage points, they believe that “greater U.S. involvement in the global economy is a good thing.” Fifty-six percent of respondents told Pew the United States should “cooperate fully with the United Nations.” Seventy-seven percent agreed that, “in deciding on its foreign policies, the U.S. should take into account the views of its major allies.” And a clear majority opposed the idea that “since the U.S. is the most powerful nation in the world, we should go our own way in international matters.” In that same vein, a recent study by the Chicago Council on Global Affairs found that 59 percent of Americans want the U.S. to maintain its overseas military deployments at current levels. It also found that when told how much the U.S. spends on defense and foreign aid, Americans urge cutting the former but want the latter to go up.

(PM Magazine/Dr. Seuss)

How can a public that endorses greater economic globalization, far-flung military bases, extensive coordination with American allies and the United Nations, and higher foreign aid also say it wants the U.S. to “mind its own business” internationally? The answer lies in the way Washington elites have defined America’s international “business.” In recent years, America’s highest-profile overseas behavior has been its military interventions, either directly or via proxies, in Afghanistan, Iraq, Libya, Syria, and, at one point, potentially Ukraine. When Pew conducted its poll in late 2013, it was those interventions that Americans rejected, not international engagement, or even military action, per se.

The Chicago Council poll teased out the distinction. Like Pew, it uncovered an ostensibly high level of isolationism: Forty-one percent of respondents said it would “be best for the future of the country” if “we stay out of world affairs.” But when the council dug deeper, it found, “Even those who say the United States should stay out of world affairs would support sending U.S. troops to combat terrorism and Iran’s nuclear program. However, many of the conflicts in the press today—for example, in Syria and Ukraine—are not seen by the public as vital threats to the United States.” It’s no surprise, therefore, that since September, when the ISIS beheadings convinced many Americans that the chaos in Iraq and Syria might threaten them, the percentage supporting military action in those countries has shot up.

In important ways, in fact, the standard claim that elites must overcome the ingrained isolationism of ordinary Americans gets things backward. When it comes to working through the U.N. or paying heed to America’s allies, the public is more sympathetic to international cooperation than are many Beltway insiders. In official Washington, for instance, it is virtually taken for granted that America must remain the world’s lone superpower. By contrast, ordinary Americans, according to Pew, overwhelmingly want America to play a “shared leadership role” with other countries. Only 12 percent want America to be the “single world leader,” the same percentage who want America to play “no leadership role” at all.

GIVEN THE OVERWHELMING evidence, both from politicians and the public, that isolationism in America today is virtually nonexistent, why do so many high-profile commentators and politicians depict it as a grave threat? One clue lies in a word that these Cassandras use as a virtual synonym for isolationism: “retreat.” If the subtitle of Bret Stephens’s forthcoming book is The New Isolationism and the Coming Global Disorder, its title is America in Retreat. In their op-ed warning of a new “cycle of American isolationism,” Lieberman and Kyl employ variations of “retreat” or “retrench” six times.

But “isolationism” and “retreat” are entirely different things. Isolationism has a fixed meaning: avoiding contact with other nations. Retreat, by contrast, only gains meaning relatively. The mere fact that a country is retreating tells you nothing about the extent of its interactions overseas. You need to know the position it is retreating from.

(Ed Hall)

Herein lies the rub. In general, the isolationism-slayers are far more comfortable bemoaning American retreat than defending the military frontiers from which America is retreating. That’s because those frontiers, which reached their apex under George W. Bush, were both historically unprecedented and historically calamitous.

To realize how historically unprecedented they were, it’s worth remembering how much more circumscribed America’s military ambitions were under Ronald Reagan. He could not have imagined sending ground troops to invade Afghanistan or Iraq. For one thing, both countries were clients of the Soviet Union. For another, the bitter legacy of Vietnam made sending hundreds of thousands of troops to overthrow a government half a world away inconceivable. During his eight years in office, Reagan invaded only one foreign country: Grenada, whose army boasted 600 troops. In his final year in the White House, when some administration hawks suggested he invade Panama, Reagan adamantly refused. The idea struck him as far too risky.

Equally inconceivable was the idea of deploying American troops on former Soviet soil. One of the disputes that initially led hawks to label Rand Paul an isolationist was the Kentuckian’s 2011 opposition to admitting the former Soviet republic of Georgia into NATO, an issue that put him in conflict with fellow GOP rising star Marco Rubio. But if Paul is an isolationist because he opposes an American military guarantee to defend Georgia, what does that make James Baker, who in 1990 reportedly promised Mikhail Gorbachev that if Moscow allowed Germany to reunify, NATO would not expand “one inch” further east: not even into East Germany, let alone the rest of Eastern Europe, let alone the former Soviet Union itself?

Between Reagan’s presidency and Obama’s, America’s military frontier advanced to fill the gap left by the collapse of Soviet power. Aspects of that expansion turned out well. George H.W. Bush reestablished Kuwait’s sovereignty in the first Persian Gulf War; Bill Clinton helped stabilize southeastern Europe by waging war to stop Slobodan Milosevic’s rampage through Bosnia and later Kosovo; countries such as Poland, Hungary, and the Czech Republic have prospered under NATO protection.

But in Afghanistan and Iraq, America’s forward march turned catastrophic. More than twice as many Americans have died in those two wars as in the September 11 attacks that justified them. A 2013 study by Linda J. Bilmes of Harvard’s Kennedy School of Government estimates that they will ultimately cost the United States between $4 trillion and $6 trillion. As a result, she argues, their financial legacy “will dominate future federal budgets for decades to come.”

Obama has made mistakes in his retreat from those wars. (I’ve been particularly critical of him for disengaging diplomatically from Iraq while Nuri al-Maliki was pushing his country’s Sunnis into the arms of ISIS.) But the notion that Obama should not have retreated—that he should have defended a historically unprecedented military frontier in wars that were causing America debilitating long-term fiscal damage and snuffing out thousands of young American lives, against insurgencies that posed no direct or imminent threat to the United States—is hard to forthrightly defend. Which is why hawks rarely defend it. Instead, they equate retreat with isolationism and isolationism with a fictionalized account of the 1920s and 1930s. And, presto, Obama becomes a latter-day Neville Chamberlain while they become heirs to Winston Churchill rather than to a guy named Bush.

Hawks worried that Barack Obama, or Rand Paul, or the American people have not defended American interests forcefully enough in Iraq, Syria, Ukraine, or Iran can make plenty of legitimate arguments. Calling their opponents “isolationists” isn’t one of them. It’s time journalists greet that slur with the same derision they currently reserve for epithets like “socialist,” “fascist,” and “totalitarian.” Then, perhaps, we can have the foreign policy debate America deserves.

Peter Beinart is an associate professor of journalism and political science at the City University of New York.

The Importance of Being Exceptional: From Ancient Greece to Twenty-First-Century America

By David Bromwich

The origins of the phrase “American exceptionalism” are not especially obscure. The French sociologist Alexis de Tocqueville, observing this country in the 1830s, said that Americans seemed exceptional in valuing practical attainments almost to the exclusion of the arts and sciences. The Soviet dictator Joseph Stalin, on hearing a report by the American Communist Party that workers in the United States in 1929 were not ready for revolution, denounced “the heresy of American exceptionalism.” In 1996, the political scientist Seymour Martin Lipset took those hints from Tocqueville and Stalin and added some of his own to produce his book American Exceptionalism: A Double-Edged Sword. The virtues of American society, for Lipset — our individualism, hostility to state action, and propensity for ad hoc problem-solving — themselves stood in the way of a lasting and prudent consensus in the conduct of American politics.

In recent years, the phrase “American exceptionalism,” at once resonant and ambiguous, has stolen into popular usage in electoral politics, in the mainstream media, and in academic writing with a profligacy that is hard to account for. It sometimes seems that exceptionalism for Americans means everything from generosity to selfishness, localism to imperialism, indifference to “the opinions of mankind” to a readiness to incorporate the folkways of every culture. When President Obama told West Point graduates last May that “I believe in American exceptionalism with every fiber of my being,” the context made it clear that he meant the United States was the greatest country in the world: our stature was demonstrated by our possession of “the finest fighting force that the world has ever known,” uniquely tasked with defending liberty and peace globally; and yet we could not allow ourselves to “flout international norms” or be a law unto ourselves. The contradictory nature of these statements would have satisfied even Tocqueville’s taste for paradox.

On the whole, is American exceptionalism a force for good? The question shouldn’t be hard to answer. To make an exception of yourself is as immoral a proceeding for a nation as it is for an individual. When we say of a person (usually someone who has gone off the rails), “He thinks the rules don’t apply to him,” we mean that he is a danger to others and perhaps to himself. People who act on such a belief don’t as a rule examine themselves deeply or write a history of the self to justify their understanding that they are unique. Very little effort is involved in their willfulness. Such exceptionalism, indeed, comes from an excess of will unaccompanied by awareness of the necessity for self-restraint.

Such people are monsters. Many land in asylums, more in prisons. But the category also encompasses a large number of high-functioning autistics: governors, generals, corporate heads, owners of professional sports teams. When you think about it, some of these people do write histories of themselves and in that pursuit, a few of them have kept up the vitality of an ancient genre: criminal autobiography.

All nations, by contrast, write their own histories as a matter of course. They preserve and exhibit a record of their doings; normally, of justified conduct, actions worthy of celebration. “Exceptional” nations, therefore, are compelled to engage in some fancy bookkeeping which exceptional individuals can avoid — at least until they are put on trial or subjected to interrogation under oath. The exceptional nation will claim that it is not responsible for its exceptional character. Its nature was given by God, or History, or Destiny.

An external and semi-miraculous instrumentality is invoked to explain the prodigy whose essence defies mere scientific understanding. To support the belief in the nation’s exceptional character, synonyms and variants of the word “providence” often get slotted in. That word gained its utility at the end of the seventeenth century — the start of the epoch of nations formed in Europe by a supposed covenant or compact. Providence splits the difference between the accidents of fortune and purposeful design; it says that God is on your side without having the bad manners to pronounce His name.

Why is it immoral for a person to treat himself as an exception? The reason is plain: because morality, by definition, means a standard of right and wrong that applies to all persons without exception. Yet to answer so briefly may be to oversimplify. For at least three separate meanings are in play when it comes to exceptionalism, with a different apology backing each. The glamour that surrounds the idea owes something to confusion among these possible senses.

First, a nation is thought to be exceptional by its very nature. It is so consistently worthy that a unique goodness shines through all its works. Who would hesitate to admire the acts of such a country? What foreigner would not wish to belong to it? Once we are held captive by this picture, “my country right or wrong” becomes a proper sentiment and not a wild effusion of prejudice, because we cannot conceive of the nation being wrong.

A second meaning of exceptional may seem more open to rational scrutiny. Here, the nation is supposed to be admirable by reason of history and circumstance. It has demonstrated its exceptional quality by adherence to ideals which are peculiar to its original character and honorable as part of a greater human inheritance. Not “my country right or wrong” but “my country, good and getting better” seems to be the standard here. The promise of what the country could turn out to be supports this faith. Its moral and political virtue is perceived as a historical deposit with a rich residue in the present.

A third version of exceptionalism derives from our usual affectionate feelings about living in a community on the scale of a neighborhood or township, an ethnic group or religious sect. Communitarian nationalism takes the innocent-seeming step of generalizing that sentiment to the nation at large. My country is exceptional to me (according to this view) just because it is mine. Its familiar habits and customs have shaped the way I think and feel; nor do I have the slightest wish to extricate myself from its demands. The nation, then, is like a gigantic family, and we owe it what we owe to the members of our family: “unconditional love.” This sounds like the common sense of ordinary feelings. How can our nation help being exceptional to us?

Teacher of the World

Athens was just such an exceptional nation, or city-state, as Pericles described it in his celebrated oration for the first fallen soldiers in the Peloponnesian War. He meant his description of Athens to carry both normative force and hortatory urgency. It is, he says, the greatest of Greek cities, and this quality is shown by its works, shining deeds, the structure of its government, and the character of its citizens, who are themselves creations of the city. At the same time, Pericles was saying to the widows and children of the war dead: Resemble them! Seek to deserve the name of Athenian as they have deserved it!

The oration, recounted by Thucydides in the History of the Peloponnesian War, begins by praising the ancestors of Athenian democracy who by their exertions have made the city exceptional. “They dwelt in the country without break in the succession from generation to generation, and handed it down free to the present time by their valor.” Yet we who are alive today, Pericles says, have added to that inheritance; and he goes on to praise the constitution of the city, which “does not copy the laws of neighboring states; we are rather a pattern to others than imitators ourselves.”

The foreshadowing here of American exceptionalism is uncanny and the anticipation of our own predicament continues as the speech proceeds. “In our enterprises we present the singular spectacle of daring and deliberation, each carried to its highest point, and both united in the same persons… As a city we are the school of Hellas” — by which Pericles means that no representative citizen or soldier of another city could possibly be as resourceful as an Athenian. This city, alone among all the others, is greater than her reputation.

We Athenians, he adds, choose to risk our lives by perpetually carrying a difficult burden, rather than submitting to the will of another state. Our readiness to die for the city is the proof of our greatness. Turning to the surviving families of the dead, he admonishes and exalts them: “You must yourselves realize the power of Athens,” he tells the widows and children, “and feed your eyes upon her from day to day, till love of her fills your hearts; and then when all her greatness shall break upon you, you must reflect that it was by courage, sense of duty, and a keen feeling of honor in action that men were enabled to win all this.” So stirring are their deeds that the memory of their greatness is written in the hearts of men in faraway lands: “For heroes have the whole earth for their tomb.”

Athenian exceptionalism at its height, as the words of Pericles indicate, took deeds of war as proof of the worthiness of all that the city achieved apart from war. In this way, Athens was placed beyond comparison: nobody who knew it and knew other cities could fail to recognize its exceptional nature. This was not only a judgment inferred from evidence but an overwhelming sensation that carried conviction with it. The greatness of the city ought to be experienced, Pericles imagines, as a vision that “shall break upon you.”

Guilty Past, Innocent Future

To come closer to twenty-first-century America, consider how, in the Gettysburg Address, Abraham Lincoln gave an exceptional turn to an ambiguous past. Unlike Pericles, he was speaking in the midst of a civil war, not a war between rival states, and this partly explains the note of self-doubt that we may detect in Lincoln when we compare the two speeches. At Gettysburg, Lincoln said that a pledge by the country as a whole had been embodied in a single document, the Declaration of Independence. He took the Declaration as his touchstone, rather than the Constitution, for a reason he spoke of elsewhere: the latter document had been freighted with compromise. The Declaration of Independence uniquely laid down principles that might over time allow the idealism of the founders to be realized.

Athens, for Pericles, was what Athens always had been. The Union, for Lincoln, was what it had yet to become. He associated the greatness of past intentions — “We hold these truths to be self-evident” — with the resolve he hoped his listeners would carry out in the present moment: “It is [not for the noble dead but] rather for us to be here dedicated to the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion — that we here highly resolve that these dead shall not have died in vain — that this nation, under God, shall have a new birth of freedom.”

This allegorical language needs translation. In the future, Lincoln is saying, there will be a popular government and a political society based on the principle of free labor. Before that can happen, however, slavery must be brought to an end by carrying the country’s resolution into practice. So Lincoln asks his listeners to love their country for what it may become, not what it is. Their self-sacrifice on behalf of a possible future will serve as proof of national greatness. He does not hide the stain of slavery that marred the Constitution; the imperfection of the founders is confessed between the lines. But the logic of the speech implies, by a trick of grammar and perspective, that the Union was always pointed in the direction of the Civil War that would make it free.

Notice that Pericles’s argument for the exceptional city has here been reversed. The future is not guaranteed by the greatness of the past; rather, the tarnished virtue of the past will be scoured clean by the purity of the future. Exceptional in its reliance on slavery, the state established by the first American Revolution is thus to be redeemed by the second. Through the sacrifice of nameless thousands, the nation will defeat slavery and justify its fame as the truly exceptional country its founders wished it to be.

Most Americans are moved (without quite knowing why) by the opening words of the Gettysburg Address: “Four score and seven years ago our fathers…” Four score and seven is a biblical marker of the life of one person, and the words ask us to wonder whether our nation, a radical experiment based on a radical “proposition,” can last longer than a single life-span. The effect is provocative. Yet the backbone of Lincoln’s argument would have stood out more clearly if the speech had instead begun: “Two years from now, perhaps three, our country will see a great transformation.” The truth is that the year of the birth of the nation had no logical relationship to the year of the “new birth of freedom.” An exceptional character, however, whether in history or story, demands an exceptional plot; so the speech commences with deliberately archaic language to ask its implicit question: Can we Americans survive today and become the school of modern democracy, much as Athens was the school of Hellas?

The Ties That Bind and Absolve

To believe that our nation has always been exceptional, as Pericles said Athens was, or that it will soon justify such a claim, as Lincoln suggested America would do, requires a suppression of ordinary skepticism. The belief itself calls for extraordinary arrogance or extraordinary hope in the believer. In our time, exceptionalism has been made less exacting by an appeal to national feeling based on the smallest and most vivid community that most people know: the family. Governor Mario Cuomo of New York, in his keynote address at the 1984 Democratic convention, put this straightforwardly. America, said Cuomo, was like a family, and a good family never loses its concern for the least fortunate of its members. In 2011, President Obama, acceding to Republican calls for austerity that led to the sequestration of government funds, told us that the national economy was just like a household budget and every family knows that it must pay its bills.

To take seriously the metaphor of the nation-as-family may lead to a sense of sentimental obligation or prudential worry on behalf of our fellow citizens. But many people think we should pursue the analogy further. If our nation does wrong, they say, we must treat it as an error and not a crime because, after all, we owe our nation unconditional love. Yet here the metaphor betrays our thinking into a false equation. A family has nested us, cradled us, nursed us from infancy, as we have perhaps done for later generations of the same family; and it has done so in a sense that is far more intimate than the sense in which a nation has fostered or nurtured us. We know our family with an individuated depth and authority that can’t be brought to our idea of a nation. This may be a difference of kind, or a difference of degree, but the difference is certainly great.

A subtle deception is involved in the analogy between nation and family; and an illicit transfer of feelings comes with the appeal to “unconditional love.” What do we mean by unconditional love, even at the level of the family? Suppose my delinquent child robs and beats an old man on a city street, and I learn of it by his own confession or by accident. What exactly do I owe him?

Unconditional love, in this setting, surely means that I can’t stop caring about my child; that I will regard his terrible action as an aberration. I will be bound to think about the act and actor quite differently from the way I would think about anyone else who committed such a crime. But does unconditional love also require that I make excuses for him? Shall I pay a lawyer to get him off the hook and back on the streets as soon as possible? Is it my duty to conceal what he has done, if there is a chance of keeping it secret? Must I never say what he did in the company of strangers or outside the family circle?

At a national level, the doctrine of exceptionalism as unconditional love encourages habits of suppression and euphemism that sink deep roots in the common culture. We have seen the result in America in the years since 2001. In the grip of this doctrine, torture has become “enhanced interrogation”; wars of aggression have become wars for democracy; a distant likely enemy has become an “imminent threat” whose very existence justifies an executive order to kill. These are permitted and officially sanctioned forms of collective dishonesty. They begin in quasi-familial piety, they pass through the systematic distortion of language, and they end in the corruption of consciousness.

The commandment to “keep it in the family” is a symptom of that corruption. It follows that one must never speak critically of one’s country in the hearing of other nations or write against its policies in foreign newspapers. No matter how vicious and wrong the conduct of a member of the family may be, one must assume his good intentions. This ideology abets raw self-interest in justifying many actions by which the United States has revealingly made an exception of itself — for example, our refusal to participate in the International Criminal Court. The community of nations, we declared, was not situated to understand the true extent of our constabulary responsibilities. American actions come under a different standard and we are the only qualified judges of our own cause.

The doctrine of the national family may be a less fertile source of belligerent pride than “my country right or wrong.” It may be less grandiose, too, than the exceptionalism that asks us to love our country for ideals that have never properly been translated into practice. And yet, in this appeal to the family, one finds the same renunciation of moral knowledge — a renunciation that, if followed, would render inconceivable any social order beyond that of the family and its extension, the tribe.

Unconditional love of our country is the counterpart of unconditional detachment and even hostility toward other countries. None of us is an exception, and no nation is. The sooner we come to live with this truth as a mundane reality without exceptions, the more grateful other nations will be to live in a world that includes us, among others.

The Civil War’s Most Famous Clown

David Carlyon

The New York Times September 18, 2014

A clown ran for public office – and no, that’s not the beginning of a joke. On Sept. 15, 1864, America’s most famous circus clown, Dan Rice, accepted the Democratic nomination for the Pennsylvania State Senate. And it was just his first foray into politics: Even while he continued his career as a clown, a state convention later considered him as a candidate for Congress, and, in 1867, he made a brief but legitimate run for president.

Dan Rice, ca. 1870. (Credit: David Carlyon)

While the idea of a clown running for office sounds like a gimmick, in the 1860s it was taken seriously — because circus itself was taken seriously, as adult fare. Long before it was relegated to children’s entertainment, early circus in this country combined what appealed to grown-up tastes: sex, violence, political commentary and, in a horse-based culture, top-notch horsemanship. George Washington attended the first circus in 1793 in Philadelphia not for family-friendly amusement — a notion that didn’t emerge until the 1880s — but as a horseman keen to see animals and humans working together at a peak level.

Sex and violence enhanced the appeal. Like later burlesque comedians, talking clowns told dirty jokes in a titillating whirl of the scantily clad: Circus acrobats and riders showed more skin — or flesh-colored fabric that seemed to be skin — than could be seen anywhere else in public life.

Walt Whitman approved. Reviewing a circus in 1856 in Brooklyn, he wrote: “It can do no harm to boys to see a set of limbs display all their agility.” (In a favorite mind-plus-body theme, Whitman added: “A circus performer is the other half of a college professor. The perfect Man has more than the professor’s brain, and a good deal of the performer’s legs.”) Meanwhile, fights were a daily occurrence, drawing attention the way fights at soccer matches do now. Violence was so common that Rice’s journal from 1856 noted the rare days when no fight occurred.

And while nostalgia portrays early circus as small and quaint, antebellum tents were some of the largest structures on the continent, seating thousands, while over the winter, circuses played major city theaters.

Dan Rice stood in the center of this lively public arena. Born in New York City in 1823, he burst onto the circus scene in the 1840s with a lightning-quick wit and sharp topical instincts that made him a national favorite. Proclaiming himself “the Great American Humorist,” he combined ad-libs, jokes ancient and new, sexual allusions, comic and sentimental songs, clever parodies of Shakespeare and quips on current events. (He did little physical comedy, which was the specialty of knockabout clowns and acrobats.)

Scholars believe that Mark Twain, who later adopted that Great American Humorist label, used Rice as his model for the clown described in “Huckleberry Finn,” “carrying on so it most killed the people,” as “quick as a wink with the funniest things a body ever said.” Though obscure when he died in 1900, Rice had probably been seen by more Americans than any other public figure. Nor was his renown restricted to the United States: Imitators in England and Germany appropriated his famous name in their own acts.

As the country tumbled toward war, Rice expanded his “hits on the times.” Instead of Bozo, think Jon Stewart or Rush Limbaugh. Or Robin Williams, who shared the same quick wit, verbal virtuosity, and sharp political humor. (In fact, Williams toyed with the idea of playing Rice in a movie.) Rice’s expanded approach extended to his costumes, as he alternated between traditional clown garb decorated in stripes and stars, and a new look of tailcoat, vest, and pants, the Great American Humorist as respectable gentleman, a man with serious opinions on the events of the day.

Once the Civil War erupted, Rice pushed directly into politics, a Peace Democrat condemning Abraham Lincoln and “Black Republicans” from the circus ring. By 1864, it was a natural step for the Democrats of Erie, Pa., near his winter quarters in Girard, to choose the nationally prominent “Col. Dan Rice” as their candidate for the state senate. (The title was self-granted, matching the times’ martial mood.)

Writing from his tour on Sept. 15 to accept the nomination, Rice denied that he worshipped “at the shrine of any political dogma,” but did declare that his “proclivities were formerly with the Whigs.” He condemned Lincoln for violating the Constitution and creating an imperial presidency. Rice wrote: “When I see the great principles of personal liberty and the rights of property being cloven down by the men now running the machine of Government, ‘the ancient landmarks’ of the Constitution ‘which our fathers set’ removed, I feel like crying, in the language of the Holy Writ, ‘cursed be he that removeth them.’”

Historians, adopting the later family-friendly image of circus, assumed that a clown’s campaign for office had to be a publicity stunt. But Rice’s nomination was no joke. Chicago newspapers took it seriously: On Sept. 23, the Republican Tribune opened a two-day attack in its headline, “Dan Rice and Disloyalty.” It complained that Rice filled “his ring talk with disloyal utterances and flings at Lincoln and the war. A trimmer so cautious as this personage who once, it is said, actually gave a performance under the confederate flag, should understand that this style of thing will not pay in loyal communities.” (The “Confederate flag” jab was political spin, because Rice presented his circus in New Orleans when Louisiana seceded.)

Next the Tribune claimed that no one laughed at Rice’s “quips and pasquinades persistently leveled at the President, the war, the government, and the anti-slavery sentiment of the north.” That Rice could make these jokes and still attract customers is another indication that late into 1864, discontent about the war remained strong. The Tribune, in an allusion to Southern sympathizers known as Copperheads, concluded by urging the press on his route to guard that his jokes did not “resemble a certain kind of soda — ‘drawn from copper.’” (Rice, visiting his friend Morrison Foster, Stephen Foster’s brother, apparently met the notorious Copperhead Clement Vallandigham there.)

Even as criticism of abolitionists continued, the crucible of war was burning away belief that the nation’s “peculiar institution” of slavery was acceptable. And as the country changed, so did Rice. In a July 4 speech in Elmira, N.Y., he had declared that blacks “are God’s creatures, and shouldn’t belong to Jeff. Davis, or any other man,” for they “were not made for southern planters to vote on, nor northern fanatics to dote on.” He added a folksy variation on Lincoln’s theme of equality: “Let every tub stand on its own bottom.”

Rice ran an abbreviated campaign. He was still a businessman with a show to troupe. He also knew he faced an uphill battle, running against a Republican incumbent, Morrow Lowry, in a heavily Republican district. Whatever advantage his national renown gave him was offset by the leading families of Girard, who harbored the distaste of small-town gentry for “the show business.” That distaste increased when Rice married into one of those families against their objections, to a woman the same age as his daughters.

Despite such handicaps, in November Rice ran ahead of the Democratic ticket. He attracted 40 percent of the district’s vote, while the presidential candidate Gen. George McClellan got only 36 percent.

Later, like others who had criticized the war, Rice sought to shore up his reputation for patriotism. In 1865 in Girard he erected what was said to be the first Civil War monument, with a ceremony featured on the front page of the Nov. 25 Harper’s Weekly.

He also began peddling a claim that he’d been Abraham Lincoln’s pal, dropping by the White House to cheer up war-weary Abe and advise him on the mood of the country. Blatantly false, the tale thrived thanks to Rice’s national stature and the postwar urge to paper over the bitter divide of the war. The Lincoln fiction survived intact into the 20th century, as a bit of trivia about the president, because it fit a new sentimentality about clowns as sweetly innocuous. It was easier to believe in a clown consoling Lincoln than one attacking him as a tyrant.

Another claim, though one that Rice didn’t make himself, said he’d been the model for Uncle Sam. At first glance it’s unlikely. Thomas Nast, the cartoonist who completed the evolution of that image to the icon we know today, was a fervent Republican who wouldn’t have knowingly based anything on a fervent Democrat like Rice. But it wouldn’t have been unusual to be unconsciously influenced by one of the most famous Americans of the era. In any case Nast drew a cartoon that echoed Rice perfectly, combining the famous clown’s democratic irreverence, his trademark goatee, the top hat he often wore, and a mash-up of his two primary costumes, a clown’s stars and stripes and the fancy wardrobe of a middle-class gentleman. If anyone could be said to have been the model for Uncle Sam, it was Dan Rice, circus clown and political candidate.

A detonation over the Marshall Islands in 1952 was the first test of a hydrogen bomb. (Credit: Underwood Archives, via Getty Images)

At the height of the McCarthy era, J. Robert Oppenheimer, the government’s top atomic physicist, came under suspicion as a Soviet spy.

After 19 days of secret hearings in April and May of 1954, the Atomic Energy Commission revoked his security clearance. The action brought his career to a humiliating close, and Oppenheimer, until then a hero of American science, lived out his life a broken man.

But now, hundreds of newly declassified pages from the hearings suggest that Oppenheimer was anything but disloyal.

Historians and nuclear experts who have studied the declassified material — roughly a tenth of the hearing transcripts — say that it offers no damning evidence against him, and that the testimony that has been kept secret all these years tends to exonerate him.

“It’s hard to see why it was classified,” Richard Polenberg, a historian at Cornell University who edited a much earlier, sanitized version of the hearings, said in an interview. “It’s hard to see a principle here — except that some of the testimony was sympathetic to Oppenheimer, some of it very sympathetic.”

J. Robert Oppenheimer. (Credit: Associated Press)

A crucial element in the case against Oppenheimer derived from his resistance to early work on the hydrogen bomb. The physicist Edward Teller, who long advocated a crash program to devise such a weapon, told the hearing that he mistrusted Oppenheimer’s judgment, testifying, “I would feel personally more secure if public matters would rest in other hands.”

Richard Rhodes, author of the 1995 book “Dark Sun: The Making of the Hydrogen Bomb,” said the records showed that making fuel to test one of Teller’s early H-bomb ideas would have forced the nation to forgo up to 80 atomic bombs.

“Oppenheimer was worried about war on the ground in Europe,” Mr. Rhodes said in an interview. He saw the need for “a large stockpile of fission weapons that could be used to turn back a Soviet ground assault.”

Robert S. Norris, a senior fellow at the Federation of American Scientists and the author of “Racing for the Bomb,” a biography of Lt. Gen. Leslie R. Groves, the military leader of the World War II project to develop the atomic bomb, said a reading of the formerly secret testimony showed it had little or nothing to do with national security.

“In many cases, they deleted material that was embarrassing,” he said in an interview. “That’s pretty obvious.”

The Energy Department, a successor to the Atomic Energy Commission, offered no public analysis of the 19 volumes and no explanation for why it was releasing the material now, six decades after the hearings. Sidestepping questions of guilt or innocence, it referred to the 1954 hearing as a federal assessment of Oppenheimer “as a possible security risk.”

Steven Aftergood, director of the Federation of American Scientists’ project on government secrecy, called the release “long overdue” and added, “It lifts the last remaining cloud from the subject.”

Priscilla McMillan, an atomic historian at Harvard and author of “The Ruin of J. Robert Oppenheimer,” applauded the release but also expressed bafflement at its having taken six decades, saying her own research suggested that the transcripts held “zero classified data.”

An eccentric genius fond of pipes and porkpie hats, Oppenheimer grew up in an elegant building on Riverside Drive in Manhattan, attended the Ethical Culture School and graduated from Harvard in three years. After studies in Europe, he taught physics at the University of California, Berkeley.

As a young professor, he crashed his car while racing a train, leaving his girlfriend unconscious. His father gave the young woman a painting and a Cézanne drawing.

In the 1930s, like many liberals, Oppenheimer belonged to groups led or infiltrated by Communists; his brother, his wife and his former fiancée were party members.

[Photo: The physicist Edward Teller, who told the secret 1954 hearings that he did not trust Oppenheimer’s judgment. Credit: Associated Press]

In the 1940s at Los Alamos in New Mexico, in great secrecy, he led the scientific effort that invented the atomic bomb. Afterward, as chairman of the Atomic Energy Commission’s main advisory body, he helped direct the nation’s postwar nuclear developments.

Oppenheimer’s downfall came amid Cold War fears over Soviet strides in atomic weaponry and Communist subversion at home. In 1953, a former congressional aide charged in a letter to the Federal Bureau of Investigation that the celebrated physicist was a Soviet spy.

Troubled by the allegation, President Dwight D. Eisenhower ordered “a blank wall” erected between Oppenheimer and any nuclear secrets.

No evidence came to light that supported the spy charge. But the security board found that Oppenheimer’s early views on the hydrogen bomb “had an adverse effect on recruitment of scientists and the progress of the scientific effort.” He died in 1967, at 62.

Experts who have looked at the declassified transcripts say they cast startling new light on the Oppenheimer case. Dr. Polenberg of Cornell, for example, expressed bewilderment that 12 pages of testimony from Lee A. DuBridge, a friend and colleague of Oppenheimer’s who discussed the atomic trade-offs and the European war situation, had remained secret for 60 years.

“A difference of opinion doesn’t mean disloyalty,” he said. “It’s hard to see why it was redacted.”

Dr. Polenberg also pointed to 45 pages of declassified testimony from Walter G. Whitman, an M.I.T. engineer and member of the Atomic Energy Commission’s advisory body. “In my judgment,” Mr. Whitman said of Oppenheimer, “his advice and his arguments for a gamut of atomic weapons, extending even over to the use of the atomic weapon in air defense of the United States, has been more productive than any other one individual.”

Asked his opinion of Oppenheimer as a security risk, he called him “completely loyal.”

Alex Wellerstein, an atomic expert at the Stevens Institute of Technology, said in a comment on the secrecy blog of the Federation of American Scientists that years ago he had asked the government to declassify the secret Oppenheimer testimony.

The department’s public silence on his request, he said, made the unveiling look like “the result of an internal interest in the files rather than prodding from an outside historian.”

A few of the declassifications cast new light on what were already famous moments in Oppenheimer’s downfall.

Isidor I. Rabi, a Nobel laureate and veteran of the Manhattan Project who staunchly defended the beleaguered physicist, told atomic investigators that he found the hearing “most unfortunate” given what “Dr. Oppenheimer has accomplished.”

The restored transcript adds a deleted phrase in which Dr. Rabi mentioned the hydrogen bomb, then also known as the Super. It underscored the depth of his fury.

“We have an A-bomb,” he told the hearing, as well as “a whole series of Super bombs.” He added: “What more do you want, mermaids?”

William J. Broad is a science journalist and senior writer at The New York Times. He shared two Pulitzer Prizes with his colleagues, as well as an Emmy Award and a DuPont Award.

Michael H. Ebner

HNN October 12, 2014

The obituary a few weeks ago of a former major league baseball player – George Shuba – has furnished a useful lesson about race.

Shuba, a journeyman outfielder, played parts of seven seasons with the Brooklyn Dodgers (1948-1950 and 1952-1955). He never appeared in more than one hundred games in a single season, although he did have the distinction of playing in the World Series of 1952, 1953, and 1955. Brooklyn won its first world championship in the latter year. On the field Shuba is best remembered as a dependable pinch hitter – lifetime batting average of .259, with twenty-five home runs (one of them against the Yankees in the World Series of 1953) – for a team that was regularly in contention for the National League pennant.

Largely forgotten until the publication of his obituary, Shuba is now celebrated for breaking an interracial taboo. He did so in 1946 by extending his hand to congratulate teammate Jackie Robinson, who had just hit a home run for the Montreal Royals of the International League. The late Jules Tygiel, a peerless researcher of baseball’s integration, made no mention of the handshake. Arnold Rampersad, in his biography of Robinson, mentions it – and includes a photograph of it – but does not make much of the incident.

Next we turn to the obituary of Steve Gromek (1920-2002), a pitcher for the Cleveland Indians and later the Detroit Tigers. Over a seventeen-year career (1941-1957), he compiled a respectable win-loss record of 128-108. He won nineteen games in 1945 and eighteen in 1954. Gromek also won a World Series game in 1948, filling in for baseball legend Bob Feller, who required an extra day of rest.

When Jackie Robinson arrived in the major leagues in 1947, he experienced the sting of racial hostility. A handful of his Brooklyn Dodger teammates circulated a petition seeking, unsuccessfully, to prevail on the team’s management to drop Robinson from the roster. Pee Wee Reese – the team’s captain and shortstop, later elected to the Baseball Hall of Fame – refused to sign it. It is well known that the ringleader of the petition effort, Dixie Walker, ultimately found himself traded away by the Dodgers.

When the Dodgers played in Cincinnati, a fan hurled vicious epithets at Robinson. Reese – a native Kentuckian – quietly walked across the infield to Robinson and gently placed his arm on his teammate’s shoulder. The hecklers ceased. While this moment remains much remembered, no known photograph of it exists.

This brings us to Larry Doby, the first African American to play in the American League. The rookie outfielder hit a key home run in the World Series of 1948, securing Gromek’s winning pitching effort. Afterwards Gromek enthusiastically hugged Doby in the clubhouse, an image that made its way into newspapers. Margaret Mackenzie wrote about the episode for the Pittsburgh Courier, a widely read African American newspaper: “That picture of Gromek and Doby has unmistakable flesh and blood cheeks pressed close together, brawny arms tightly clasped, equally wide grins. The chief message of the Doby-Gromek picture is acceptance.”

Years later Gromek, a native of Hamtramck, Michigan – a largely white working-class enclave surrounded by Detroit – experienced ostracism but quickly shrugged it off. Today the Gromek-Doby embrace remains an iconic image in the history of American race relations.

These episodes – each of them situated in the immediate aftermath of World War II – reflect changing racial sensibilities. The Swedish social scientist Gunnar Myrdal, in his landmark book – An American Dilemma (1944) – anticipated the shifting tableaux of race relations in post-war American culture. What is remarkable is that major league baseball – its game played before crowds numbering in the tens of thousands – represented an agent of social change. The re-integration of professional baseball occurred seven years in advance of Brown v. Board of Education.

Michael H. Ebner is professor emeritus of American history at Lake Forest College. He can be reached at ebner@mx.lakeforest.edu.