As the twentieth century ended and the specter of the Cold War appeared to be fading into history, political scientists pondered the question of how a new world order would take shape under the direction of a victorious superpower. As G. John Ikenberry stated, victors try “to find ways to set limits on their powers and make it acceptable to other states.”[1] The United States, having spent a century building its image as a military power determined to protect the world from evil and, in doing so, to spread democracy, found itself in a dilemma. While talking heads and braggarts proclaimed U.S. superpower greatness, diplomats faced the harsh reality that yesterday’s protector can quickly become today’s bully and tomorrow’s enemy. Additionally, the economic strain military spending places on a society can become politically detrimental once victory occurs. In the past it was said that to the victor go the spoils, but in modern times, with plundering frowned upon, the victor tends to win a headache both at home and abroad without seeing any real benefit. Without a change in policy, particularly policy pertaining to its military superiority and status, a victorious nation discovers that military superiority can lead to economic and political decline.

Of the many headaches the United States developed as the sole superpower in the years following the end of the Cold War, probably the most contentious was being asked to intervene in conflicts great and small. Seldom was there a clear right side and wrong side to support. In many cases the crises that prompted the debate over intervention occurred in regions that had previously been under the political, economic, and military supervision of the Soviet Union. Even when acting under the umbrella of the United Nations, U.S. intervention could stir conflicting emotions in the crisis region. The United States had been both an enemy and the possessor of enviable commodities for fifty years. Envy and distrust were not feelings easily eradicated simply because the war was over. In a world that seemed to be rupturing in the absence of Cold War superpower dominance, the United States struggled with its expanded role of policeman, banker, and social worker.

Military dominance, which had spurred the U.S. economy in the years following World War II, became a burden following the end of the Cold War. Amid international cooperation and the perception of peace, nations could shift away from military technology as a basis of economic growth. Nations which remained entrenched in military development became economically dependent on wars that seldom required Cold War technology. Furthermore, Cold War technology had been all about fighting a war from a distance, and the conflicts of the twenty-first century required boots on the ground. When President Truman and President Eisenhower put their support behind the development of nuclear technology, and behind the technology to deliver nuclear weapons from a distance, part of their justification was that it would save U.S. casualties and, hypothetically, shorten if not prevent war. Their reasoning was based predominantly on the notion that nations would fight nations, and that the days of tribal warfare were becoming part of the past. When the theories and perception of modern war shifted after the attacks on the United States in 2001, the world powers seemed taken by surprise. When the Second Gulf War did not produce the results predicted, when peace did not flourish, and when terrorism spread rather than diminished, the United States seemed not only surprised but confused. The U.S. war strategy and military development, so honed during the twentieth century, did not work in the twenty-first. A nation which had grown powerful through military superiority found itself the targeted enemy rather than the celebrated hero. Furthermore, it found itself struggling to justify increasing national debt, made larger by wars that seemed to have no end. Like many great powers which had come before, the United States faced decline despite continued military superiority. In fact, it could be argued, the United States faced decline because of its military superiority.

Endnotes

[1] G. John Ikenberry, After Victory: Institutions, Strategic Restraint, and the Rebuilding of Order after Major Wars (Princeton: Princeton University Press, 2001), xi.

Further Reading

Hixson, Walter L. The Myth of American Diplomacy: National Identity and U.S. Foreign Policy. New Haven, CT: Yale University Press, 2008.

Kennedy, Paul. The Rise and Fall of the Great Powers. New York: Vintage Books, 1989.

There is a belief held by many that a strong nation can ensure stability and promote prosperity by developing a strong military presence in a region. It is not a new theory, nor is it difficult to validate, for history is full of examples of empires formed by military strength which then added to their own prosperity by quelling regional conflict and instability. In fact, it is much easier to cite examples of empires made strong by force than by diplomacy; therefore, it should be no surprise that the United States followed a similar path as it sought to expand its economic interests during the late nineteenth and early twentieth centuries.

What might be surprising, especially given that the United States went on to flex its military might for the greater part of the twentieth century, is that there had been fierce opposition within the United States to the notion of militarizing, taking on the role of stabilizer and protector, and pursuing the status of empire.[1] Even during the years following the Monroe Doctrine, many argued that the United States needed simply to concentrate on the lands of North America and leave the affairs of Europe to the Europeans. However, these well-intended notions of independence and isolation failed to take into consideration that sea trade could not be ‘free’ or ‘secure’ unless someone policed the waters. The United States was comfortable leaving the job to the British Navy even though the British posed the greatest threat to U.S. interests at the time. However, by the end of the nineteenth century more U.S. voices were calling for a change. One of these voices was that of Alfred Thayer Mahan, who wrote, “All men seek gain and, more or less, love money; but the way in which gain is sought will have a marked effect upon the commercial fortunes and the history of the people inhabiting a country.”[2] He argued that for economic gain to increase, sea trade must be protected, and rather than relying on the naval strength of others, the United States must get into the game and become a naval power. A few short years after he made his argument, the United States acquired territories and increased its markets overseas. A larger navy was required.

When faced with questions and criticism concerning the appearance of imperial objectives, President Theodore Roosevelt responded, “When the Constitution was adopted, at the end of the eighteenth century, no human wisdom could foretell the sweeping changes, alike in industrial and political conditions, which were to take place by the beginning of the twentieth century.”[3] A few years later he would assure the critics, “All that this country desires is to see the neighboring countries stable, orderly, and prosperous.”[4] Whether Roosevelt was genuine in his assurances or fully aware that the nation was heading down an imperial path is debatable, but one thing has been clear from that point forward: the United States was no longer a regional power in theory but had become one in reality. During the next two decades, the United States would transition from regional power to world power, and the transition would happen through the use of military might.

[1] Many will argue that the United States never pursued or achieved the status of empire. They will claim that the United States assimilated and incorporated territories rather than acquired colonies and that the peoples of the territories were treated as citizens rather than as subjugated peoples. The debate on the question of whether the United States is or was an empire can be quite interesting to follow.

In 1820, the Congress of Troppau was convened. The great powers of the day determined that they held the right to intervene in the revolutionary conflicts of neighboring states. Maintaining the status quo and preventing the spread of nationalism and revolution was viewed as vital in the quest to quell the type of conflict that had erupted in Europe during the French Revolution and the Napoleonic Era. While the beginning of the century had been fraught with what some called the first worldwide war, the remainder of the century saw only regional conflicts, most of which were harshly quelled before they could spread outside their borders. However, the policy of intervention did not quell nationalism. During the twentieth century nationalism would be at the heart of many conflicts, and the notion that great nations had the right to intervene to protect the status quo would be at the center of international policy for many nations, including the United States.

In the case of the United States, intervention became a tool to either protect or disrupt the status quo in a region, depending on which was most beneficial to the interests of the United States. Intervention often placed the nation at odds with its own revolutionary history and patriotic rhetoric. Despite seeming hypocritical in nature, the United States was not forging new diplomatic patterns but rather following the patterns established by the great powers of the past. The U.S. Founding Fathers may have wanted to distance themselves from the politics and practices of Europe, but their descendants embraced those policies as the United States rose to international supremacy during the twentieth century.

During the rise to superpower status, the United States benefited economically and politically. The right to intervene allowed the United States to protect economic markets, and in some cases add new markets and resources to its growing stockpile. While the nation doggedly denied that it was an empire, by the end of the twentieth century the problems associated with empires began to plague the nation. Most prominently, it could be argued, the United States faced the growing international expectation that it would intervene when conflict threatened a region’s status quo. After a century of gaining prominence and wealth through international intervention, often with the sole goal of protecting resources and markets, the United States found that the right to intervene had transformed into an obligation to intervene.

In the modern world of minute-by-minute news coverage, it is easy to assume that history is being recorded both comprehensively and accurately. One may even think that the role of the historian is passé, and that all that is needed for the modern world is the analyst who will try to make sense out of current events. Even in a world where the events of the day are documented, and where social media can turn our most mundane activities into a historical sketch that we can share with all of our cyber friends, the role of the historian is crucial. It may even be more crucial than ever before because of the sheer volume of data that must now be sifted through in order to create a comprehensive, yet pertinent, story.

Accuracy in historical record has always been important to historians, but it has not been nearly as important as the story. In the days in which history was borrowed from others in order to bolster a rising nation’s image, accuracy was often less important than fostering the image that a new empire was ancient and eternal in origin. A good example of this is found with the Roman Empire, which, having risen to power, desired an historical record that would magnify its greatness rather than highlight its youth. Throughout history, political entities as well as powerful individuals have sought to bolster their images by creating histories that connect them to other prestigious historical events, periods, and reigns. By likening themselves to others who were dynamic, successful, dominant, and strong, they create an image of grandeur that is only partially based on the events of their own time and of their own making.

As technology and the availability of the written record evolved over the centuries, it became harder to use the history of others as a means by which one’s own history could be created. Even before the printing press, some historians began comparing their own region’s historical journey with that of their neighbors. In some cases, as with the historian Tacitus, the neighbor was heralded for its purity and simplicity in contrast to the corruption at home. In other cases, the neighbor was villainized in an attempt to deflect attention away from unpopular home-state policy. In either situation, the history of others was borrowed, no longer as a means to explain where a political state had come from, but rather to explain how the home-state compared to others. This trend created an interesting phenomenon in the writing of history. No longer was it enough simply to extol the greatness of one’s own past; it was now acceptable, and even expected, to criticize the neighbor as a means of exalting oneself. By making the neighbor seem less noble or even villainous, the historian could create an even more illustrious history of the home-state.

In the not-so-distant past, historians were at the whim and will of powerful men who could finance the historical pursuits of the scholar. Modern technology has changed this to some extent. Scholarly history may still be contingent on research grants made possible by powerful institutions and individuals, but technology has made everyone who uses the internet a historian, or at least a historical participant. No longer is it only the famous, powerful, or well-connected who get recorded. Some individuals may only be contributors of data, whereas others may add more significantly to the record of daily events. In this world of high-speed technology and vast data collection, history is being recorded more thoroughly than ever before, but that doesn’t mean that the record is any more accurate. Often, history is being recorded in very inaccurate ways and by people with little to no understanding of the ramifications this has for both the people of today and the historians of tomorrow. In the race to beat the ‘other guys’ to the best story, accuracy, once again, is secondary to the story being told.

Modern historians, bound by ethical parameters of historical accuracy, try to share a story that is comprehensive and as unbiased as possible. They are taught to question sources and present a full picture of an event, a person, or a period of time. In some cases, they are even taught to use good writing skills in order to make the story enjoyable to read. They are taught to recognize that history is not always pleasant, but that it can always be of value, if even to only a few. At times, history can be a story of valor, bravery, and patriotic glory. At other times, history can be just the opposite. The modern historian may write a tale that makes some readers uncomfortable, but the job of the historian is to write a comprehensive and pertinent story rather than the myths and propaganda so many others are determined to write.

During war, even a war fought in far flung lands, the civilian public is not guaranteed the comforts of peacetime. Rationing of food and clothing can be expected as a nation directs its energy and material goods toward the war effort. Additionally, one can expect taxation to increase as the nation’s war debt mounts. However, when one’s liberty becomes a cost of war, the nation faces a crisis that is much more difficult to overcome with patriotic slogans. Fear, spread through propaganda campaigns and doom-inspiring rhetoric, becomes the tool that convinces a nation that the loss of constitutionally protected liberty is a price worth paying for the ultimate goal of winning the war.

In the mid-to-late 1700s, the cost of war was felt most heavily in the form of taxation. Colonial Americans were opposed to the new taxes despite the fact that the taxes helped pay for the military support the colonists benefited from each time a frontier war erupted. Their argument, in simple terms, was that if they were to be taxed like regular English subjects, then they should have all the rights and privileges afforded to regular English subjects. In particular, they should have the right to political representation. When their demands for equality were not heeded, the colonists decided that rebellion was the solution. War weariness and the costs of war played a large role in the final outcome. Endless war was not a good national policy, and even the powerful British Empire had a difficult time arguing against that truth.

During the American Revolution, the colonists who supported rebellion and sought independence were willing to sacrifice personal comfort for their cause, but that dedication was challenged when the new nation found itself sacrificing economic prosperity due to the Embargo Act of 1807. In an ill-conceived attempt to force France and Great Britain into dealing with the United States with greater respect, President Thomas Jefferson and Congress passed an embargo that resulted in great hardship for the New England merchants. Fortunately, the War of 1812 concluded just as the anger in New England was reaching a boiling point, and President James Madison was not faced with the daunting task of suppressing a homeland rebellion.

When homeland rebellion did finally erupt years later, as the national argument concerning the issue of slavery boiled over, President Abraham Lincoln did not hesitate to suspend certain constitutionally guaranteed rights in an effort to settle the conflict more quickly. His justification was that those who were trying to separate from the Union, and those who were a direct threat to the Union, were not necessarily protected by the Constitution. He was not alone in his evaluation that during war certain liberties might need to be curtailed. The remnants of Congress agreed and passed the Habeas Corpus Suspension Act of 1863.

Economic hardship and the forfeiture of liberty seemed justifiable when the nation was at war, especially if the forfeiture of liberty was directed at those who seemed set on disrupting the nation’s ability to fight the war. It should not come as a surprise that when the nation went to war after the bombing of Pearl Harbor, those who seemed too closely tied to the enemy would find themselves stripped of their constitutionally protected liberty. It mattered little that their ties were familial in nature as opposed to political. The nation had to be protected in order for the United States to prevail. In the end, the war lasted only a few short years. The rights and liberty of the interned were restored, everyone went on their merry way, and the nation flourished as it helped rebuild the free world. Or so the propagandists proclaimed.

Yet another enemy lurked and another war loomed. Constitutionally protected rights were no longer sacred in the face of an enemy. A nation at war, even a cold one, had to protect itself from enemy sympathizers and subversives. If this meant spying on its own citizens, then that is what the nation would do. When the truth of this violation became publicly known after the burglary at the FBI office in Media, Pennsylvania in 1971, Congress acted to halt such a travesty, but it was questionable even at the time whether its actions would hold up during the ongoing Cold War.

War, it seemed, would always be a justification for a temporary loss of freedom and liberty, but as the twentieth century ended and the twenty-first century began, war shifted away from the traditional conflicts that often erupted between two political enemies. Instead, war became a conflict with phantoms and ideologies. First there was the War on Drugs and then the War on Terror, both eroding the protections guaranteed in the Constitution, and both without any end in sight. The cost of these wars continues to be great, and it seems that rather than causing economic hardship and the sacrifice of personal comfort, these wars demand a greater price: liberty.

History is a required subject in schools throughout the United States, but is history simply a subject to be covered, crammed, tested, and forgotten? How much do we really know and understand about our own history? Historian Tony Williams asked, “do we really understand the difference between Jamestown and Plymouth? Or between the Declaration of Rights and the Declaration of Independence?”[1] Do we remember more about our elementary Thanksgiving pageants than we do about the actual people and events that shaped our nation and the world in which we live?

Recently, I saw a meme pop up on the internet that counseled readers not to believe revisionist historians, and implied that they lie in order to strip away the moral fiber of the nation. Clearly, the intent of the statement was to cause distrust of accounts of history that challenge particular points of view, and to breed distrust of academic sources of history as opposed to sensationalized, patriotic versions of history that tend to leave out the controversial bits. Sadly, too many people avoid academic histories because they distrust the historian’s motivation or because they think scholarly history is boring. Contrary to what many believe, scholarly history is not monolithic in nature, and most historians are not set on convincing the public that celebrated historical characters are all villainous. Rather, academic historians work hard to replace fiction with fact and to separate myth from history. Historian Carol Berkin wrote, “They write about what interests them… [and] firmly reject collective agendas no matter what group suggests them and no matter what pressing problems those agendas might promise to resolve.”[2] The result is that rather than only providing a timeline of the events and peoples of the past, historians have provided greater access to and understanding of real people and of their lives beyond the grand events of their day. Instead of data to be memorized the night before a test and then quickly forgotten, scholarly history provides a journey back in time, introducing the reader to a diverse world that is much more fascinating than might have ever been discovered in the days when cramming for the test was all that seemed to matter.

Prior to the chaos of the French Revolution and Napoleon’s meteoric rise to power, three great powers balanced the Western World: Great Britain, France, and the Ottoman Empire. The Far East and the Americas were still peripheral, with only the United States disrupting the colonial empire system in any fundamental way during the eighteenth century. Throughout the nineteenth century, the three great empires faced ever-growing challenges as nationalistic zeal spread worldwide. In response to the chaos created by both the French Revolution and the Napoleonic era, the great powers of Great Britain, Austria, Prussia, and Russia chose to form an alliance that they hoped would prevent a repeat of the decades of war. They also redoubled their efforts to contain and control their own territories. The great threat to political stability came from two entities: empire seekers and nationalistic zealots. Control and contain both, it was believed, and chaos could be avoided. Yet as well conceived as the Concert of Europe was for the age, there was an inherent flaw in the concert system. The very nature of forming alliances to prevent imperial expansion or nationalistic revolution also entangled the great nations, and would, in the early twentieth century, lead them into another great international conflict. Fear became the demon; fear of what would happen if a nation chose not to honor the treaties and pacts.

The twentieth century saw the rupture of empires and of the colonial system that had made the empires great. While the rupture was often bloody and chaotic, there remained a level of control because, as the great empires of the past declined, two even greater powers replaced them. Historians and political scientists argue over whether these two great nations ever became empires in the true sense, or whether they were only empires of influence during the second half of the twentieth century. They do, however, agree that the influence of the United States and the Soviet Union during the Cold War suppressed a great deal of the chaos that might have erupted as colonial shackles were lifted and fledgling states emerged as independent nations. As fifty years of Cold War ended, rather unexpectedly and abruptly, the world faced the daunting task of answering the ultimate question: what would come next?

One political scientist suggested an answer to the question: “The great divisions among humankind and the dominating source of conflict will be cultural… the clash of civilizations will dominate global politics.”[1] Unlike the independence movements that plagued international stability in the eighteenth, nineteenth, and twentieth centuries, the twenty-first century has seen a greater surge of culturally driven conflicts, some contained to rhetorical mudslinging, and some violent, bloody, and devastating to the peoples who get in the way of power-seeking individuals who achieve dominance through the spread of chaos. Cultural conflict has grown during the last decade, and it threatens stable and weak nations alike. It is not limited to the traditionally war-torn regions of the world, and it will take cooperation to counter it. Like the great nations that faced the chaos of the French Revolution and the Napoleonic Wars, the nations of today must find a way to combat this growing crisis; a way that recognizes that chaos is the goal of the enemy and not simply a byproduct.

Further Reading

Huntington, Samuel P. The Clash of Civilizations and the Remaking of World Order. New York: Simon & Schuster, 2011.

Seventy years ago, the United States unleashed a new weapon with the aim of ending the Pacific theater of World War II. President Truman addressed the nation: “With this bomb we have now added a new and revolutionary increase in destruction to supplement the growing power of our armed forces… It is an atomic bomb. It is a harnessing of the basic power of the universe. The force from which the sun draws its power has been loosed against those who brought war to the Far East.”[1] This new weapon was horrifying in its destructive capability, and the United States hoped that destruction on such a momentous scale would finally bring Japan to its knees. Many historians and scholars of military strategy argue that bombing campaigns, even ones as devastating as the bombings of Hiroshima and Nagasaki, are less effective than their architects anticipate.[2] In the case of the surrender of Japan, it is argued that the Soviet entrance into the Pacific War had a greater impact on the Japanese decision than the U.S. bombs.[3] It has also been argued that the United States chose to use its new weapon with the clear intention of ending the war before the Soviet Union made its decision to enter the Pacific War public. The Japanese did not surrender until after the Soviet declaration of war on August 9, a date the Soviets had chosen to coincide with their military movements on the continent, but also a date that coincided with the second U.S. bombing of a Japanese city.

Whether Japan surrendered due to the bombs or due to the threat of Soviet involvement, “Stalin managed to join the war in the nick of time,” and thwarted the efforts of the United States to reduce Soviet influence in the region.[4] Ending World War II was the primary objective of both the United States and Soviet Union, but it was not the sole objective of the two nations. It has been argued that this maneuvering, both by Truman and Stalin, was the first action of the Cold War. As one war ended, another was emerging from the shadows. While the United States believed itself to have a clear and comfortable head start in the nuclear race, Soviet espionage had already undermined the U.S. lead. It would take only a few short years before the realities of Hiroshima and Nagasaki became the nightmares of the worldwide community.

Memory is a tricky thing that tends to filter events by removing the negative aspects from our recollection. When current events are not to our liking, we look to the past and remark on how much better the past was in comparison to the present. While it is also true that the positive aspects of an event or period of time can be filtered out, leaving us with only a bleak recollection of the time, it is more often the case with collective memory that we glorify rather than demonize the past. History, the record and the study of that record, helps remove the myth that memory creates.

For many who came to maturity during the 1980s, the decade has come to represent a better time, or in other words, The Good Old Days. The decade is viewed as one in which U.S. power and culture were strong and celebrated. The music and clothing were distinctive and memorable. Soft power was used in conjunction with traditional methods of political power, and the influence of the United States was felt worldwide. The notion that the Cold War was won by forceful rhetoric and the exportation of McDonald’s and MTV has resonated with those who now view the 1980s as the glorious decade of U.S. supremacy. While few will argue against the notion that the United States reached a superpower zenith as the twentieth century neared its end, historians will be quick to note that there was more to the decade than glory and power. There was fear: fear of nuclear destruction, fear of the pandemic spread of disease, and fear of ever-increasing drug use in mainstream society. However, in a decade where politicians could harness the media, or at least greatly influence the script, and where social media was yet unborn, it was easy for the general public to hear the strong rhetoric and believe the message. Embedded in the rhetoric was the notion that war was the answer to all the ills that plagued the nation. Whether an ideological war with an evil enemy, a hot war often conducted in secrecy, or a war on drugs that often impinged on civil rights but had a moral justification, war was the solution. War was also the solution to a lagging economy. Investment in the machines of war burdened the nation with debt, but it also put people to work and made a select group wealthy in the process. War and power went hand in hand, and those who viewed power as the ultimate evidence of success sought to encourage and perpetuate the notion that only through the constant demonstration of strength could the fears of a nation be quelled. Decades later, their efforts have caused many to look back in longing for a better time, a time of strength.

Memory is a tricky thing. Few in the public participated directly in the world-changing events of their youth, and fewer still have found a need to crack open the history books to learn more about the period of time in which they lived. Historians seek to delve beyond collective memory and search for the data that reveals a greater image of the people and events of a period of time. For those who seek to understand the history rather than the myth of the 1980s, The Good Old Days were days of rhetoric and war, a nation recovering from an economic recession, and a time when money equaled political power. So, in a way, those days are not so dissimilar to the present.

Further Reading

Chollet, Derek, and James Goldgeier. America Between the Wars: From 11/9 to 9/11; The Misunderstood Years Between the Fall of the Berlin Wall and the Start of the War on Terror. New York: PublicAffairs, 2008.

Gaddis, John Lewis. We Now Know: Rethinking Cold War History. Oxford: Oxford University Press, 1997.

In June of 1987, U.S. President Ronald Reagan stood at the Berlin Wall and demanded that Soviet leader Mikhail Gorbachev “Tear down this wall!” When just a few years later the wall was breached and then torn down by the people, many in the United States credited Reagan with a victory. While the specific role of the United States in the collapse of the Soviet Union is a hotly debated topic, what is clear to historians is that Reagan’s rhetoric was not the cause of the Fall of the Berlin Wall. However, his dedicated efforts to work diplomatically with Gorbachev, even to the point of becoming friends, can be viewed as integral to the end of the Cold War. Normalization of relations was not something that either leader took lightly, especially after the near disaster that was only narrowly avoided during the Able Archer exercises in 1983.

While some historians argue that Reagan did not dramatically change his policy after learning of the near disaster, others believe that he became more open to diplomatic discourse in a desire to avoid nuclear war. In either case, the notion that Reagan’s big talk was key to a campaign of intimidation that directly led to the end of the Berlin Wall and the ultimate end of the Soviet Union is, on the whole, founded on myth rather than reality. Unfortunately, it is a myth that became firmly rooted in a generation that now views diplomacy as weak and shouting as effective. Big talk may have a place in foreign policy, but it is not the key to success that so many believe it to be. Quiet diplomacy, on the other hand, while seldom making the news, has a more lasting impact on current affairs.

Further Reading

Fischer, Beth A. The Reagan Reversal: Foreign Policy and the End of the Cold War. Columbia, MO: University of Missouri Press, 1997.

Gaddis, John Lewis. The United States and the End of the Cold War: Implications, Reconsiderations, Provocations. New York: Oxford University Press, USA, 1992.

Hutchings, Robert L. American Diplomacy and the End of the Cold War: An Insider’s Account of US Diplomacy in Europe, 1989-1992. Baltimore, MD: The Johns Hopkins University Press, 1997.