All posts by Jessie Hagen, M.A. (PioneerLady)

What started out as rambling thoughts, somehow grew into something more. Now in addition to pithy ponderings, I find myself writing pithy essays - history essays. Thus, two blogs: pithyponderings.com and pithyhistory.com.

Seven decades ago, the United States emerged from World War II relatively unscathed compared to other great nations of the world. It found itself in the position to help rebuild, and in doing so it prospered. This prosperity was evident in its purchasing power. A look through the cupboards and attics of our aging population will unearth the evidence of that purchasing power. Crystal and silver tea services, porcelain and fine china, flatware of the highest quality, and linens too lovely to ever have been used except for the most special occasions. These imported items, often the gifts associated with marriage and new life, were made in recovery zones, and helped reestablish the war-torn markets and industries vital to the lives of those fortunate enough to have survived a horrific war. These items also confirmed to the U.S. populace that they had fully become a great power, much like the empires that had dominated the century before.

The purchasing power and abundance of the post-war era in the United States also provided a balm for the hardship from which so many had suffered. The world had been either at war or suffering financial depression for over three decades when WWII ended; an entire generation felt the burden of despair lifted when the United States reached its industrial and economic potential after the war. The youth, those too young to have felt the full brunt of hardship, reached adulthood in the glow of economic world domination. This glow was only slightly dimmed by the threat of nuclear war, a threat that increased as they aged but did little to blunt their earning power. War machines equaled economic growth as long as the nation continued to view such development as vital. Once that view shifted, however, the realities of overextension and taxation created an ever-growing sense of waste and loss. The greatness of their youth seemed to have slipped away and in its place, only a sense of uncertainty remained. The Cold War, with all its ills, provided secure jobs and a sense of proactive security. When it ended, a new generation faced the aftermath of war. For them, the balm came in the form of a technology boom, of rapidly falling interest rates, and open borders; these changes provided the American Dream the youth had heard about, but had worried would be outside their reach.

As the twenty-first century dawned, rumblings of change and challenge emerged: first with the Y2K fears and then with the market crash following the September 11 terror. A nation which had for so many years found economic stability in military development and distant wars, once again turned to war as a means to unify and solidify a shaken populace. However, unlike during the Cold War, the United States had lost its standing as the sole superpower. Politically, economically, and militarily – others had risen from the ashes and emerged powerful equals. No longer was the United States seen as the great protector; rather, many saw the United States as a threat to peace. Others questioned if the political system, which had weathered over two hundred years of challenge, would survive the challenges of the new century. Unlike in recent history (the last three hundred years or so), the new century had seen a return of conflict dominated by non-state actors, creating a longing for the seemingly stable nature of the Cold War, stable despite its harsh suppression of ethnic conflict, damaging political interference, and costly proxy wars.

This longing for the stability and prosperity of the Cold War provided fuel for the fear, anger, and desperate hope which motivated many as they voted yesterday. The new century has not secured the American Dream for its younger generation; rather, it seems to only have jeopardized it for the older generation. Conservative or liberal, the policies formed in national and state capitals seem, at best, to be bandages rather than sutures. Few anticipated a speedy recovery, but many are willing to risk experimental treatment in the hopes of a miracle cure. The nation should survive this latest illness, and the treatment it has chosen; however, it is unlikely that the youth, the youngest voters, will find the balm their parents and grandparents found in an economic boom. Industry, and even much of technology, has gone elsewhere. The borders of nations are closing rather than opening. Peace is threatened as much from the turmoil within as it is from without, and the economy is adversely affected by all the uncertainty. The generations who have suffered the ills and recoveries of the past may be too fatigued to calm the fears and fevers of today’s youth. There simply may be no balm.

History oftentimes seems to be about groups of people working against or for an issue. After destructive wars, terrible depressions, or horrific epidemics, people tend to work together to bring about recovery, with special concern for the young, who are always the true hope for a better future. At this time when the ills that face the world are less tangible but no less threatening, it is vital, as we look to history for the lessons taught by groups of people in the past, that we remember the work always began and ended with the individual: the individual who created the cure, who did the work, and who didn’t lose hope. Never did they wash their hands and walk away from the crisis or turn their backs on the young; rather they recognized that the young are, in reality, the key to the stability and prosperity so sought after.

Broad-based or narrowly focused, history is not merely a collection of data; rather, it is a story. At times, the story may seem dull, at other times captivating. The study of history can introduce us to the challenges and triumphs of the past. It can help us see patterns in the ‘action and reaction’ cycle of human relations. It can help us learn from the past events which have paved the way for present actions. However, it can only teach us if we are willing to learn. Simply hearing the story is not enough. Regardless of how enthralling, action-packed, or awe-inspiring, history is not simply a story to be heard. It is a story to be understood.

Whether we look at the rise of Hitler, the arms race of the Cold War, or the growth of empire through colonization, history can teach us about how groups of humans react when they feel threatened by other groups of humans. During the inter-war period in Germany, the people felt sorely abused by the rest of Europe. They sought a change and a savior from the economic oppression they felt was unjust. During the Cold War, citizens on both sides sought powerful military might as a means of protection from a threat often ideological more than physical. They didn’t simply want a powerful government, they wanted an all-powerful government that could protect them from phantoms as well as from armies. In both of these historical stories, if we take the time to study them rather than simply hear them, we can learn that people are willing to give up basic human and civil rights in order to feel protected from outside threats. Additionally, if we go beyond the simple narrative often taught in history primers, we can see cases where people were easily persuaded to put aside their moral compass in order to achieve group affiliation and protection. While the story of Hitler and his atrocious reign of power might more easily provide examples of how people can become swayed by nationalism and nativism, the story of the Cold War also provides examples. Foreign relations, the relations between nations rather than individuals, oftentimes reflect the very nature of human relations. Just as human and civil rights were often trampled upon in both the United States and the Soviet Union by their own respective citizenry, national sovereignty and the right to self-determination were often trampled upon by the superpowers as they spread their economic, political, and military influence. The notion that ‘might makes right’ was not constrained.

The notion of ‘might makes right’ is clearly depicted in the colonization period leading up to the twentieth century. Peoples who seemed to be less civilized in comparison to the social and political norms of Europe were to be suppressed and subjugated, or eradicated if they would not accept their place in the more ‘civilized’ society. Moral qualms were assuaged by dehumanizing those who did not fit the norm and who did not hold the power. This was not the first time the process of dehumanizing the ‘other’ for social or political gain occurred in history, but it did normalize it as culturally acceptable. Even as slavery lost support, colonial conquest and rule, including the westward expansion of the United States, reinforced the idea that certain peoples were more valuable than others. The mighty western nations viewed their culture to be better than the rest, and believed that forced assimilation was right and justified.

To the victor goes the spoils and also the chance to write the story, but history is more than just one person or nation’s account. It is a compilation of stories from many different perspectives. Like the heroic sagas of old, history can inspire and teach lessons to the listeners, but the study of history can do more. It can dispel notions that any one group of people is more perfect or more sinful than the others. It highlights the shared humanity of man: a humanity that is full of valor and full of vice.

Political campaign season tends to encourage comparisons. Recently a journalist noted that Dwight D. Eisenhower had never held a public office prior to holding the highest public office of the United States. Eisenhower was a military man who had never voted for president, yet found himself asked by members from both political parties to run for president. In the end, and after much encouragement, Eisenhower chose to run for president with the Republican Party. His successful campaign, fueled by the slogan, “I like Ike,” was supported by a public hoping he would work to fix a broken national government.[1]

Eisenhower was a straight-talking man who had honed his style and mannerisms during a lifetime of military service. Accepted into West Point in 1911, he began his service to his nation and committed himself to a life of duty and honor.[2] His experience as a leader grew during the three decades leading up to World War II and during his time as the Supreme Commander of the Allied Expeditionary Force. Military leadership at the level reached by Eisenhower required political skill and the ability to use diplomatic finesse. His experience as an able politician was refined when he was chosen to be the first Governor of the American Zone of Occupied Germany, and as he maneuvered through the political tensions that accompanied the position of Supreme Commander of Europe during the turbulent early period of the Cold War. While Eisenhower may not have engaged in domestic politics prior to running for the office of U.S. President, he was not unfamiliar with the skills and demeanor required of a U.S. president. The nation didn’t just “like Ike” but rather they loved him for what he represented and for the manner in which he conducted himself.

It could be said that Woodrow Wilson’s ideas are like a work of art. While the artist lived, the world was slow to embrace the art, but after the artist’s death, the world recognized the greatness of the work. Like with a work of art, interpretation would be highly subjective, creating great potential for debate and disagreement.

In October 1916, Edward M. “Colonel” House, an American diplomat, stated, “We are part of the world…nothing that concerns the whole world can be indifferent to us.” During the same month, President Wilson stated that the United States would need to “serve the world.”[1] In order to provide this service, Wilson believed that a change in how international relations was conducted would be needed. It was vital that the old system of alliances be replaced by a new system of international cooperation.

Wilson was correct in the need for a new world order, and despite a growing isolationist movement in the United States, there would be no turning back from greater international political involvement. At the end of the Second World War, the United States played a dominant role in the international political body that was created to replace the failed League of Nations. While the United Nations would both be valued and criticized, it would, through accident or plan, become a way for nations to work together in war-torn regions of the world. Conflict and hostility might not have been eradicated through international cooperation, but service to the world’s population through peacekeeping efforts did, in some measure, fulfill the progressive ideas of the early twentieth-century. Certainly, it became harder for any powerful nation to remain indifferent to the concerns of the world.

[1] George C. Herring, From Colony to Superpower: U.S. Foreign Relations Since 1776 (New York: Oxford University Press, 2008), 407.

As a young nation, the United States found itself in a conundrum. The desire to avoid the entanglements of European politics clashed with the desire for economic prosperity. Some early leaders, including Thomas Jefferson, believed that the plentiful natural resources of the Americas would remain in high demand by Europeans and would ensure that a predominantly agrarian society would continue to prosper for decades, even centuries to come. Others were more doubtful and recognized that trade would mandate political interaction. While idealists would cleave to the notion that the demand for U.S. raw materials would force the nations of Europe to treat the new nation with respect and dignity, others rightfully worried that it would take strength to bring about international respect.

The United States would spend much of its first one hundred and fifty years debating how to be taken seriously as a world power while at the same time remaining distant from the conflicts of Europe. However, isolation was never the viable option that many envisioned it to be. By the end of the Second World War, the United States fully understood that international respect came both from military strength and from economic influence. Political finesse was also vital for peaceful coexistence, but it was too often overlooked or dismissed in the eyes of the general public. Even though the United States had produced a few outstanding diplomats during its youth and adolescence, too often the role of diplomacy was overshadowed by the feeling that military and economic strength could get the job done without diplomatic pageantry. Like a few of the early founding fathers, many in the twentieth century believed that the peoples of the world would wish to purchase U.S. products and thereby highly value peaceful relations with the United States. On the other hand, there were many who derived lessons from the decades when a strong navy equaled security at home, and encouraged prosperity through protected shipping routes and foreign markets. In the years following the end of the Second World War, U.S. economic and military might certainly seemed to be the key to prosperity, and not just to the prosperity of the United States, but prosperity for Europe as well. Unfortunately, what many failed to foresee was a day in which the rebuilding of Europe would be completed. Furthermore, many failed to anticipate a day when Europe might wish to free itself from the protection and economic influence of the United States.

Prosperous international relations, whether they are economic, military, or political, are dependent on diplomacy. At different times, the idea of isolation has appealed to policy makers and the public alike. At other times, policy makers and the public support aggressive relations and even war with the other nations of the world. In either case, diplomacy is underrated by those who hold to the notion that prosperity is something that can be controlled by one nation at the expense of others. History shows that such beliefs are founded on a limited understanding of the vital role of diplomacy during periods of strife and in times of prosperity.

Further Reading:

Bemis, Samuel Flagg. John Quincy Adams and the Foundations of American Foreign Policy. New York: Alfred A. Knopf, 1956.

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Lind, Michael. The American Way of Strategy: U.S. Foreign Policy and the American Way of Life. New York: Oxford University Press, 2006.

Chaos breeds fear much like an insidious virus; everyone becomes fearful that they will be next to catch it. Segregation is then seen as a positive means of prevention, a measure taken before eradication can commence. Calls for calm and cooperation become drowned out by vitriolic shouts for action. It seems that when chaos threatens human cooperation, tact and finesse are the first casualties. Within the world of international cooperation, chaos creates a force against which diplomacy struggles to survive. By the end of World War II, chaos had taken a terrible toll on humanity. Devastating war, multiple pandemics, and a severe economic depression all contributed to a general fatigue which left many seeking strong leadership rather than diplomatic dialogue. The rise of authoritarian leadership should not have surprised many, nor should there have been surprise that some desired isolation. Like in the case of the insidious virus, many felt that segregation from the problem was the logical solution. Others placed their faith in military strength and vitriolic rhetoric. World War II demonstrated that neither segregation nor authoritarian leadership would stop chaos. A terrible truth became evident: the world was too interconnected to ever truly support isolationist policies or prevent the chaos which can derive from authoritarian regimes. However, even as the interconnectedness of the world became an undisputed fact and the vital role of international diplomacy became apparent to those who had once questioned its value, the chaos of a post-WWII world threatened the very cooperation that had brought the war to an end.

World War II had ceased but the suffering caused by war had not. Additionally, the process of decolonization was creating renewed competition for areas of the world which had previously been controlled by foreign powers. A post-colonial world was ripe for chaos, particularly political chaos. The great powers of the day did not wish to see the return of any form of chaos, particularly chaos located in their own backyard. While the Cold War has been characterized as a war between ideologies, it can also be viewed as a war to eradicate regional chaos. The United States and the Soviet Union both developed international policies which were authoritarian in nature. The nations of the world felt distinct pressure to choose a side. Traditional diplomacy suffered even as the United Nations worked to promote peace through diplomatic means. At the end of the day, pressure in the form of military posturing and economic support or sanction often dictated international relations more than traditional diplomacy. For nearly fifty years, the United States and Soviet Union managed to keep the chaos from spreading within their own borders. Like with a virus, small outbreaks were to be expected, but the big pandemic was avoided. If chaos were a virus, then the Cold War cure was death to the host if segregation was ineffective. Diplomacy might seem a slow and imperfect treatment for the conflicts that threaten to unleash chaos, but is there truly wisdom in containing chaos through the threat or creation of greater chaos? Some will argue yes while others shudder no, but both should agree that when chaos threatens, diplomacy struggles.

Further Reading:

Herring, George C. From Colony to Superpower: U.S. Foreign Relations Since 1776. New York: Oxford University Press, 2008.

Lind, Michael. The American Way of Strategy: U.S. Foreign Policy and the American Way of Life. New York: Oxford University Press, 2006.

Weigley, Russell F. The American Way of War: A History of United States Military Strategy and Policy. New York: Macmillan Publishing Company, 1973.

Zubok, Vladislav M. A Failed Empire: The Soviet Union in the Cold War from Stalin to Gorbachev. Chapel Hill, NC: The University of North Carolina Press, 2009.

Society is locked in a battle of interpretations when it comes to the Cold War. Was it a war against the sinister spread of communism that threatened the moral fiber and the political existence of the United States, or was it a battle between two economic powers determined to gain world hegemony? Even among historians, the debate rages. Regardless of the underlying goals that fueled the Cold War, one thing remains clear – it was a war both the United States and the Soviet Union were committed to winning. Part of the strategy employed by both powers was the use of education as a means of instilling a common ideology. While the United States would point fingers at the Soviet Union and accuse it of indoctrination rather than education, a real effort to promote an American Way of Life was embarked upon at home. It was also exported in much the same manner as the Soviet exportation of communism.

Unlike with communism, the United States did not have a concise definition that it could promote, but during the decades of the Cold War, an ideology emerged even though it was never encapsulated in one definitive form. Movies and television idolized an American Way of Life that often romanticized an ideal version of the United States and its history. Books were written promoting a celebrated notion of Americanism; some warning about the pervasive threats against the United States, and others attempting to define what was un-American and what wasn’t. The American image was molded and promoted at home and abroad.

World War I had highlighted a need for a more educated populace, but the post-World War II era took education into a new realm with the U.S. educational system undergoing a transformation during the Cold War. The study of science and technology increased, and universities often found endless government funding for research and development, particularly in areas that were argued as essential for national defense. While higher education benefited from an influx of funds, it was not just the research labs which saw change. In public elementary and secondary schools nationwide, the youth learned civics lessons even as they learned to Duck and Cover. However, what may have been the most dramatic change came in the form of racial integration. For a nation proclaiming a dedication to equality and promoting democracy worldwide, segregation, especially the segregation of school children, was a political nightmare. The Supreme Court and the State Department worried that segregation jeopardized national interests and foreign policy. A nation determined to promote and export an American Way of Life needed to eradicate segregation from its narrative, and Brown v. Board of Education was key to changing that narrative. The United States hoped to put to rest international criticism against a way of life which had supported segregation. A national policy of desegregation, accompanied by film images of the forced desegregation of elementary schools, went far in achieving that goal. In an ideological battle between superpowers, perception was a vital component of strategy. A change in national policy, particularly with regard to education, helped improve the perception that the principle of equality was fundamental to an American Way of Life.

Further Reading

Dudziak, Mary L. “Brown as a Cold War Case.” The Journal of American History 91, no. 1 (2004): 32–42.

One hundred years ago, malnutrition was a problem that worried a nation facing war. Industrialization and urban growth had moved large populations into congested cities and away from rural communities. Both World War I and World War II would see an increase in the urbanization of the United States. The progressive reformers of the early twentieth century recognized that urbanization was leading to an unhealthy population and pushed for reform. They also pushed for vocational education, particularly in the area of what would become known as Home Economics.

One of the great misconceptions of the modern age is that the skills of the preindustrial age were easily passed from generation to generation, and that it is only modern society that struggles with the problems associated with the loss of these skills. Unlike the dissemination of information, knowledge is gained through practice. Skilled crafts and vocations require practice and often a good deal of instruction by a skilled guide. Remove proper training, and the skills are not learned and society struggles. In particular, modern society struggles with issues of malnutrition and, more recently, obesity, both of which can be directly linked to a lack of basic knowledge of nutritional food consumption. It could also be argued that the conveniences of modern food production contribute to the problems, especially when the issue of ‘prepared’ foods is under discussion. Despite the flood of DIY programs and videos demonstrating cooking and gardening techniques, home production and preparation of food is not as common as needed for a healthy society.

New technology in the early 1900s brought advancements in home food production and storage, but the skills needed to safely process food had to be learned. During WWI, home canning and food storage were demonstrated and encouraged by divisions of local government and subsidized by the U.S. Department of Agriculture.[1] The Smith-Lever Act and the Smith-Hughes Act provided funding for increased training in food production and domestic skills.

According to historian Ruth Schwartz Cowan, the “decade between the end of World War I and the beginning of the depression witnessed the most drastic changes in patterns of household work.”[2] Industrialization was changing the way work was managed, not just in the factories, but also in the homes. Industrialization increased the availability of commodities, many of which made household work less time-consuming and arduous. Convenience is usually an appreciated commodity, especially by those tasked with managing a household and feeling the pressures of working outside the home. However, the skills that had been learned before convenient options became available were not always passed down to the next generation. Much like the youth of today, youth of past generations seldom liked learning to do things the old-fashioned way, especially not when new technology and innovation were changing the world. In order to offset the trend and ensure a healthier society, young women in private and public schools were taught the skills that many today assume would have been handed down from mother to daughter. Books titled Clothing and Health, Shelter and Clothing, Foods and Household Management, and Household Arts for Home and School were produced and marketed to U.S. high schools. In the words of one author, “The authors feel that household arts in high schools should not be confined to problems in cooking and sewing. They are only a part of the study of home making.” In the 1915 edition of Shelter and Clothing, an entire chapter is dedicated to “the water supply and disposal of waste,” and included diagrams of the modern flushable toilet. Technology had changed the lives of many, but progressive minds of the age could see how new technology had to be integrated into society through education rather than simply leaving society to work through the changes without assistance. World War I, the Great Depression, and World War II jolted policy makers into action.
By the mid-1950s, Home Economics, as a high school subject, was accepted as an integral part of keeping the nation healthy and ready for future war. Even as warfare became more mechanized, the nation still held on to a belief that a healthy society was a strong society, and many school systems encouraged both male and female participation in Home Economics during the early 1980s. Unfortunately, the Technological Revolution of the 1990s and 2000s shifted the mindset of many, and like the industrial revolutions of the past, this latest revolution has favored convenience over skill. While information is just a click away, the knowledge that comes from skilled instruction is often harder to obtain, placing the nation at risk once more.

Endnotes

[1] Emily Newell Blair and United States Council of National Defense, The Woman’s Committee: United States Council of National Defense, An Interpretative Report. April 21, 1917, to February 27, 1919, e-book (U.S. Government Printing Office, 1920).

In January 1789, the newly elected President George Washington wrote to his dear friend, Marquis de Lafayette, the following words.

While you are quarreling among yourselves in Europe – while one King is running mad – and others acting as if they were already so, by cutting the throats of the subjects of their neighbours; I think you need not doubt, My Dear Marquis, we shall continue in tranquility here – And that population will be progressive so long as there shall continue to be so many easy means for obtaining a subsistence, and so ample a field for the exertion of talents and industry.

Washington, like so many of his countrymen, saw the American abundance of land and resources as a way to ensure the avoidance of foreign chaos, specifically the chaos that derives from overcrowding and the ills such chaos inspires. He wrote, “I see a path, as clear and as direct as a ray of light…Nothing but harmony, honesty, industry, and frugality are necessary to make us a great and happy people.”[1]

Men like Washington felt strongly that certain key moral principles would flourish in a land as abundantly blessed as America. As a leader of men for most of his adult life, he would not have been blind to the tendencies of human nature, but clearly he believed that those men dedicated to “industry and frugality” would prevail over those who sought slothful pursuits. The United States was predominantly agrarian during those early years. Commerce, especially the trade of raw materials for finished goods, may have dominated the seaside areas of the new nation, but industrialization had not yet lured workers from the fields and into cities. Subsistence farming was still both the predominant occupation and an occupation that did not tolerate slothful pursuits. Washington was able to envision generations of “tranquility” rather than the chaos that derived from congested cities and limited resources. However, he was not naive to the realities of human nature; he simply could not foresee how quickly the world would change once industrialization took hold.

In 1783, at the army camp located in Newburgh, New York, rumors of revolt were quelled when General George Washington addressed his men. The rhetoric, which had grown from frustration with Congress over back pay, was effectively countered when Washington spoke, “…let me entreat you, Gentlemen, on your part, not to take any measures, which, viewed in the calm light of reason, will lessen the dignity, and sully the glory you have hitherto maintained…”[1] Scholars have argued over whether the crisis in Newburgh was one of rhetoric only, or if an actual conspiracy existed which threatened the stability and future of the United States.[2] Regardless, the Newburgh Affair highlights how political rhetoric can lead to crisis, and how calm leadership rather than dramatic action can be the solution.

Conspiracy theorists and politically motivated historians have inferred that orchestrated nationalist machinations were the cause of the rumors and implied threats that swirled around Newburgh in the fall and winter of 1782-83. Others argue that frustration at the lack of pay, and the worry of a post-conflict future, organically inspired the rhetoric Washington felt needed to be addressed on March 15, 1783. Pamphlets, newspapers, public meetings, and personal correspondence were the main vehicles for the spreading of news and the airing of grievances prior to the technological age. The years leading up to the outbreak of war proved that these were effective tools in rousing public opinion in order to force change. It stood to reason then that these same tools would be used when Congress ground to a standstill on the issue of military pay and veteran benefits.

Even in the days before technology transformed the ways in which the world communicated, rumors once started were difficult to suppress. Inflamed rhetoric was even harder to manage, for often it was printed and preserved for posterity. Fortunately for the young republic, General Washington was a man who had learned that brash language and rash actions were counterproductive to stability and prosperity. While he understood the frustration[3] of his men, he also understood that a liberty so newly achieved could not withstand civil discord.[4] A nation built from the fire of revolution would have to learn how to handle and even embrace civil discord; however, Washington was wise in objecting to discord created by “insidious design” and spread by rumor and extreme rhetoric.