
PRINTED FROM the OXFORD RESEARCH ENCYCLOPEDIA, AMERICAN HISTORY (americanhistory.oxfordre.com). (c) Oxford University Press USA, 2016. All Rights Reserved. Personal use only; commercial use is strictly prohibited. Please see applicable Privacy Policy and Legal Notice (for details see Privacy Policy).

date: 19 March 2018

The American Experience during World War II

Summary and Keywords

On the eve of World War II many Americans were reluctant to see the United States embark on overseas involvements. Yet the Japanese attack on the U.S. Pacific fleet at Pearl Harbor on December 7, 1941, seemingly united the nation in determination to achieve total victory in Asia and Europe. Underutilized industrial plants expanded to full capacity producing war materials for the United States and its allies. Unemployment was sucked up by the armed services and war work. Many Americans’ standard of living improved, and the United States became the wealthiest nation in world history.

Over time, this proud record became magnified into the “Good War” myth that has distorted America’s very real achievement. As the era of total victories receded and the United States went from leading creditor to debtor nation, the 1940s appeared as a golden age when everything worked better, people were united, and the United States saved the world for democracy (an exaggeration that ignored the huge contributions of America’s allies, including the British Empire, the Soviet Union, and China). In fact, during World War II the United States experienced marked class, sex and gender, and racial tensions. Groups such as gays made some social progress, but the poor, especially many African Americans, were left behind. After being welcomed into the work force, women were pressured to go home when veterans returned looking for jobs in late 1945–1946, losing many of the gains they had made during the conflict. Wartime prosperity stunted the development of a welfare state; universal medical care and social security were cast as unnecessary. Combat had been a horrific experience, leaving many casualties with major physical or emotional wounds that took years to heal. Like all major global events, World War II was complex and nuanced, and it requires careful interpretation.

Between the world wars, the United States followed policies that were not always consistent or helpful to international stability. Facing Europe, America attempted some cooperation to promote peace but was generally isolationist, following the mood of the people. Americans tended to favor letting Europeans solve their own problems, politically handicapping U.S. ability to cooperate with other powers in keeping the peace. As late as 1940, 11.5 million white Americans were first-generation immigrants (and a further 23 million were second generation), many of whom had fled European conditions and rejected involvement with the Old World. By contrast, in the Pacific, the United States became increasingly confrontational in its relations with Japan. Both approaches to some degree compromised American bids for neutrality.

Millions around the world, including in the United States, had hoped that the Great War, the “war to end all wars,” would inaugurate a lasting era of international peace. It became clear almost immediately that this would not be the case. Four major empires had fallen: the Ottoman, Austro-Hungarian, and, most importantly, German and Russian Empires, destabilizing the international situation. From German defeat would rise Nazism, and continuing revolution in Russia produced Bolshevism, a militant anti-Western communism. As capitalism teetered in the 1920s and 1930s, authoritarian regimes sprang up in nations like Italy, whose charismatic leader, Benito Mussolini, promised national recovery through conquest. Expansion of European militaries diminished unemployment after the world financial crash of 1929. Japan, resenting Western imperialism and condescension, moved toward achieving major-power status through aggressive expansionism, beginning its conquest of Chinese Manchuria in 1931. The U.S. Immigration Act of 1924, virtually banning Asian immigration to America, made negotiation with Japan harder.

While the United States helped bring about the Washington naval treaties of 1921–1922, limiting the size of great-power navies, seen as an important step in preventing aggressive war, the U.S. Senate in another move refused to ratify League of Nations membership. This fear of formal foreign entanglements dogged American policy into the 1930s. When in 1936 right-wing fascist forces under General Francisco Franco began a civil war in Spain that would result in the overthrow of the elected republican government, the United States and United Kingdom failed officially to support the republican forces. The democracies failed again to resist fascist aggression when Hitler, after remilitarizing the Rhineland, absorbed Austria into the Reich in March 1938. He then demanded the Sudetenland province of Czechoslovakia, which had a German population of 3.25 million. Soviet Russia refused to cooperate militarily with France and Britain, and, as these two western powers were still rearming, they gave in to Hitler’s demands. The United States stood aloof, President Franklin D. Roosevelt merely sending Hitler a telegram asking him to settle the Sudeten question amicably, hoping to avoid European involvement.

Nevertheless, FDR and informed American observers understood the United States could not stand aside as Axis conquests continued. After war broke out in September 1939, American trade relations increasingly favored the Allies over the Axis. After France fell in summer 1940 and Britain stood alone, FDR moved to provide increased military equipment, telling Americans through his fireside chats that the United Kingdom must not be allowed to fall. He was helped by able commentators like Edward R. Murrow, who broadcast from London about determined British resistance during the German aerial blitz.

In late autumn 1940 the Lend-Lease policy was announced. Simply put, Britain was running out of financial reserves to carry on the fighting. The United States, said FDR, would lend its democratic cousin the tools of war, just as one would help a neighbor put out a fire. The United Kingdom would repay the loan with favorable trade relations and other concessions after the war. Earlier, in September 1940, the United States had given Britain fifty aging destroyers in return for leases on British naval bases in the Western Hemisphere. In his State of the Union Address on January 6, 1941, FDR articulated four freedoms that all peoples should enjoy: freedoms of speech and worship, and freedom from fear and want. The president thus built support for giving further help to those fighting the Axis, and throughout the hostilities the freedoms were an inspiration around the world, later being incorporated into the Universal Declaration of Human Rights. In 1941, the United States extended protection of British merchant convoys into the mid-Atlantic, inevitably exchanging fire with German vessels. The first incident occurred on September 4, 1941, when the U.S. destroyer Greer exchanged fire with a German submarine.

Yet intervention remained relatively unpopular. When America’s first peacetime draft was introduced in summer 1940, many potential inductees swore to “go over the hill.” Pearl Harbor brought the United States into the war, and many Americans wanted a “Pacific first” policy, some even arguing against contributing military forces to the fight in Europe. Appeasement of the dictators had failed in Europe, but deterrence proved equally unsuccessful in Asia. Japanese aggression in China had continued. In spring 1940, the United States dispatched the bulk of its Pacific fleet to Hawaii as a warning to Japan, and in July 1940 it began embargoing exports of strategic materials. After Japan occupied southern Indochina in July 1941, the United States, along with Britain and Holland, froze Japanese assets, cutting off Japan’s access to vital military raw materials, including oil. The policy backfired, convincing the Japanese pro-war faction that the nation must seize what it needed. FDR sent B-17 bombers to Pacific bases, warning the imperial government that the planes could destroy the delicate wooden buildings of Japanese cities. Meant to deter the Japanese military, the move instead convinced them that the United States was planning a devastating offensive against the home islands. Japan therefore launched a preemptive strike, bringing America into the war.

The American War Machine

Japanese aggression galvanized American patriotic feeling, and after Germany declared war on the United States on December 11, 1941, Americans united in seeking unconditional victory over the Axis powers. America’s resolve and energy surprised its enemies, who had characterized the United States as a clumsy, materialistic giant without the warrior ethic.

In coordination with the government, American industry turned from a peacetime footing to war production. Manufacturing plants, underutilized during the Depression, reached full capacity as federal spending primed the pump with massive war contracts. Not only steel and oil but advertising companies and movie studios were enlisted to fight a war of the mind. The unemployment pool was drained as the military grew to over sixteen million and factories begged for labor. So-called stay-at-home moms were urged to join the work force, particularly as they were blamed for raising spoiled boys who failed army physicals. American industrial production outpaced that of all other belligerents, turning out approximately 300,000 planes, 77,000 ships, 372,000 heavy guns, 20 million small arms, 6 million tons of bombs, and 102,000 armored vehicles. The United States helped equip its allies as well as its own military in the European and Pacific theaters.

This record inevitably led to some exaggeration of the achievement in the public mind: America’s tanks and planes were boasted to be the best; all workers’ shoulders were to the wheel; ostensibly there was no industrial unrest and no division among the public. In fact, particularly at the start, much equipment was inferior. The Japanese Zero fighter was technologically superior to Allied models flown in 1941, such as the Brewster Buffalo. Many U.S. torpedoes were defective. The Grant tank’s main gun had only a 180-degree traverse, so it could cover only half the battlefield. Although the M-1 was a solid shoulder arm, the clumsy bazooka antitank weapon was inferior to the German Panzerfaust, as was the Browning machine gun to the lighter, faster Spandau. Most costly of all, the ubiquitous Sherman tank was too lightly armored, and its 75 mm gun was inferior to the German 88; gas-powered models burned easily. These weaknesses in hardware, coupled with an overabundance of ammunition, led field commanders to call in heavy air and artillery support, pulverizing everything ahead through a policy of overwhelming firepower that saved friendly lives but caused immense civilian losses and damage to the environment.

Although labor and management formally cooperated, industrial strikes doubled in 1942–1943 and again in 1944–1945. In 1944 alone, nearly nine million work days were lost. Some white workers refused to work alongside blacks, staging wildcat strikes. And the popular picture of harmony was marred by wage and promotion discrimination against ethnic minorities and women. Segregation in housing, restaurants, and virtually all walks of life continued. Threatened with civil rights marches on Washington, FDR established the Fair Employment Practices Committee on June 25, 1941, to investigate hiring abuses. But it had a small staff and no authority to enforce its findings through fines or other penalties. Business leaders used the conflict for their own ends, promoting the war as a consumer battle for the right to own a better car or refrigerator.

Ultimately, the effectiveness of the war machine rested with those in uniform, recruited primarily through the selective service system. Historians generally feel that the system, building on the model used during World War I, was probably the fairest in U.S. history. The Civil War draft was marred by the substitute provision whereby rich men could hire others to take their places in the ranks. The Vietnam-era draft allowed generous deferments for those pursuing higher education, thus discriminating against the underprivileged who did not have this option. Nevertheless, the World War II system had its weaknesses.

Blue-collar workers and all minority groups were underrepresented on draft boards, so that their legitimate reasons for requesting deferment might not get heard; this class bias caused resentment. The induction process, attempting to assess millions of candidates, could be clumsy and shallow, particularly in the beginning. For example, boys who were not physically fit were rejected, even though most of their deficiencies could have been remedied by the regular exercise regimen of basic training. In trying to deal with the overwhelming numbers of inductees, recruiters relied heavily on mental fitness assessments using recently introduced behavioral-science screening procedures. The interview might rely on only two or three standardized questions to establish a man’s suitability. The most notorious instance was the military’s obsession with homosexuality, stemming from a phobia rooted deep in society. Despite gay men having a higher than average incentive to fight Nazism, as the Nazis targeted homosexuals for liquidation, the military believed gays made bad soldiers, a judgment without empirical backing. If a recruit answered yes when asked whether girls made him nervous, he might fail. The interviews were often so vacuous that a common joke was that if you wanted to flunk induction, you should wear perfume and affect a mincing step.

Working-class whites filled a disproportionate share of rifle squads. In basic training, intellectuals and despised minority groups like Jews might be harassed by Old Army working-class drill sergeants. But on assignment, educated men could hope for jobs in intelligence or staff, or receive officer training. (Although blacks were often in danger as stevedores or drivers, they were barred from most combat units, lest bearing arms lead to demands for full citizenship rights after the war. Nevertheless, 125,000 served overseas, some in combat units such as the famed Tuskegee Airmen, commanded by Benjamin O. Davis Jr., and the 761st Tank Battalion.) Thus, rifle squads, exposed to some of the most dangerous fighting, were made up primarily of underprivileged whites, whose predicament was worsened by the rigid army classification system: once assigned to combat units, there was rarely release short of wounds or death.

The Major American Military Campaigns

World War II is popularly remembered as the era of blitzkrieg, or lightning warfare. We imagine tank generals like George S. Patton ripping across open landscapes. Actually, blitzkrieg largely characterized the opening campaigns, when Axis forces made swift conquests. Rolling back Axis gains proved slow and grueling. Initially, instead of spreading their armor and air support thinly along the front, German generals concentrated them in attack formations that punched holes through opposing lines, driving deep into hostile territory and leaving the slow-moving infantry to contain enemy armies marooned behind. Attacked on September 1, 1939, Poland capitulated in less than a month. By June 1940, Denmark, Norway, the Low Countries, and France had fallen, leaving Britain virtually alone. In April 1941, Yugoslavia and Greece collapsed. In the east, surging Japanese forces, subsisting on rice, a little meat, and wild plants, made immense gains; Singapore, the British bastion, fell to a land attack through terrain thought impassable. Conquests included the Philippines, Thailand, Malaya, Guam, and Hong Kong.

However, after spectacular successes, Axis offensives were blunted. In the summer and autumn of 1940, Hitler lost the Battle of Britain in the air. Turning east, he attacked Russia on June 22, 1941, but, despite huge losses, the Soviets survived and counterattacked, bleeding the German forces. By February 1942, Germany had lost a million men. In the Pacific, the Japanese were overextended and drained by the Chinese war. Never intending to conquer all of Asia or assault the continental United States, they envisaged a negotiated peace acknowledging their conquests. But the American fury aroused by Pearl Harbor thwarted that hope. At the Casablanca Conference, January 12–24, 1943, the United States and Britain (the Soviets declined to attend) adopted an unconditional surrender policy in both theaters.

Also at Casablanca, fearing German power and under pressure from the Soviets to open a second European front, FDR and Churchill reaffirmed the Europe-first strategy. American generals wanted an early cross-Channel attack, but the British argued that first the Battle of the Atlantic and the air war over Fortress Europe must be won. The Allies had agreed to destroy Axis armies in North Africa, guaranteeing oil lines through the Mediterranean and giving U.S. forces needed experience; operations began with the Torch landings in November 1942 and ended in May 1943. In July and August 1943, Allied forces took Sicily, and in September they invaded Italy, which quickly surrendered. Germany rushed in troops from France and skillfully resisted in mountainous country, in grinding fighting that echoed World War I. Nevertheless, this shifting of forces weakened Germany’s Atlantic defenses.

Meanwhile, the U.S. Eighth and Fifteenth Air Forces joined the Royal Air Force in bombing occupied Europe. The RAF conducted only night raids to minimize losses, but U.S. flyers struck in daylight. This strategy forced the Luftwaffe to fight for the skies; U.S. air crews were sacrificed so that by D-Day the Allies had total air control. Allied bombing also slowed the rate of German industrial production. By mid-1943, the Allies were also winning the Atlantic sea war. Helped by the breaking of German operational codes and by radar and sonar tracking technology, the U-boats were decimated; 785 of 1,175 were sunk. Although the Germans sank 23.3 million tons of Allied shipping, 42.5 million replacement tons were built. Preparations for a French invasion could advance. At the Tehran Conference, November 28–December 1, 1943, the United States and Britain promised Stalin they would invade France in May 1944 (bad weather would force postponement until June 6).

False information produced by Allied intelligence convinced Hitler that the invasion would come directly across the English Channel at Calais. Until too late, he held back German armor that could have contested the real invasion, most of it destroyed behind the front by aerial strikes. On June 6, 1944, the largest seaborne invasion in history began; in the first days, 4,000 landing craft took 176,000 soldiers to five Normandy beaches. By June’s end, one million men were ashore. However, the Germans fought stubbornly behind thick hedgerows and in ruined towns, again recalling trench warfare. Breakthroughs finally came at the end of July, followed in mid-August by U.S. and Free French landings in southern France that flanked the enemy. Mobility returned to the battlefield, and Paris was liberated on August 25.

However, the pace of advance slowed as Allied fuel supplies dwindled and German resistance stiffened. Market Garden, a British-led operation in late September to outflank Germany’s defenses by seizing Rhine bridges in Holland, failed, and by winter the armies had halted in front of Germany’s West Wall fortifications. Capitalizing on bad weather, Hitler counterattacked through the Ardennes Forest on December 16, 1944, hoping to divide the Allies and defeat them piecemeal. In this Battle of the Bulge, American lines bent without breaking. Patton, in a remarkable tactical feat, pulled his Third Army out of line and swung it north to take the Germans in flank. With weather clearing, air attacks destroyed enemy armor and Hitler’s hopes.

With the end of the war in Europe in sight, the Big Three met again at Yalta, February 4–11, 1945, with a follow-up at Potsdam, July 17–August 2. (Between the two conferences, Nazism virtually ended when Hitler committed suicide on April 30, 1945.) It was agreed that Germany would be divided into four occupation zones, France being added to the list of occupying powers, and that democratic elections would be held in all the liberated nations. Unfortunately, the Soviets would renege on this deal, as would the Western allies to a lesser degree, as in Greece, where a right-wing puppet government was installed. FDR had dreamed of the great powers each having a sphere of influence that would be benevolently nurtured, but the dream relied too much on his personal relations with the other leaders and proved impossible to achieve.

In the Pacific, after initial Japanese successes, the tide turned quickly. In two sea battles, the Coral Sea, May 4–8, and Midway, June 4–7, 1942, Japanese naval power was crippled. Japan had too few submarines to threaten Allied surface dominance, while American submarines disrupted Japan’s own lines of communication, leaving island garrisons without support. While the British fought to regain Malaya and other colonial possessions, the United States devised a two-pronged strategy to attack Japan through the Pacific. General Douglas MacArthur, commanding U.S. and British Commonwealth soldiers, would strike from Australia through the southern Pacific to the Philippines. Admiral Chester Nimitz, directing naval and marine forces, would attack across the central Pacific from Hawaii. Some have criticized the dual strategy as wasteful, but, given America’s abundant resources, bringing crippling pressure from two directions appeared reasonable.

Savage fighting, such as on Guadalcanal, led to the concept of island hopping. Allied control of sea lanes meant many islands could be bypassed, the Japanese on them left to rot. By October 1944, MacArthur was in the Philippines, and in the decisive naval Battle of Leyte Gulf, Japanese sea power was shattered. In the same period, Nimitz’s forces took Tinian, Guam, and Saipan, only 1,340 miles from the home islands. By February 1945, marines were slugging it out on Iwo Jima, just 750 miles from Tokyo. From advanced air bases, bombers rained incendiaries on Japan’s wooden cities, inflicting perhaps half a million casualties. Allied naval forces prevented the enemy from receiving war materials from abroad.

At this point, a strategic decision had to be made: whether to starve Japan into surrender, invade the home islands, or use the two extant atomic bombs to end hostilities. Allied war weariness precluded the first option, and estimates of high casualties (tough fighting, often involving kamikaze or banzai suicide attacks, suggested fanatical resistance) made invasion an unattractive choice. President Harry Truman (FDR had died on April 12) probably had little political choice but to use the bombs. What is more problematic is that two populous cities were targeted. It was thought that demonstrating the bombs on vacant land or on military installations would not have the same shock value. And, in the atmosphere of devastation wrought by total war, in which millions had died, the killing of thousands more civilians did not seem the enormity it might later appear, particularly as the long-lasting effects of radiation were not fully understood. Those who claim the Japanese would have surrendered if the emperor’s position had been guaranteed have yet to prove their assertion. And it was difficult for Allied leaders to put aside their demand for unconditional surrender.

The Axis powers’ refusal, in both east and west, to surrender once defeat was a foregone conclusion led inevitably to the massive ruin of total war. The last military act of the world war was also the chilling opening of a Cold War whose threat of global annihilation still hangs over humanity.

The American Experience of Combat

We often glamorize combat, imagining sleek fighting machines streaking across the landscape in a sanitized video game. Actually, most combat was tough and terrifying.

For GIs, stress began when troop ships took them overseas. The converted holds of merchant ships were dark and claustrophobic, stinking of cigarette smoke, excrement, and vomit. Rumors of impending enemy attacks fed anxiety. Eventually, landing craft deposited men into lethal crossfire on hostile beaches. Shellfire smashed boats, leaving a litter of equipment and body parts in the surf. Officers urged forward movement; go to ground and enemy fire would find you.

Moving inland, soldiers entered an eerily empty landscape. Often, the only indicator of enemy presence was a sniper bullet. Planes attacked and were gone; mortar rounds fell seemingly from nowhere, leaving shaken survivors and casualties. The wounded were an embarrassment, exposing man as animal, made of meat and gristle. The dead rotted and stank.

A direct hit on a tank might leave only red smears. Repeated concussions caused traumatic brain injuries that could be lasting and might help to induce post-traumatic stress disorder (not a formal diagnosis until 1980, but clearly apparent in the era’s mental wounds). After forcing themselves to get up and push forward into hostile fire day after day, men found their legs no longer worked, and they cracked. As mental wounds were incompletely understood, emotional casualties were often accused of malingering and cowardice.

Killing led to extreme physical and mental distress. Army field studies suggested many riflemen did not fire their weapons in action, either through fear or because of ingrained moral inhibitions. Mass killing was done by blanket fire from heavy weapons platoons, planes, and artillery. Army studies exploded the myth that if soldiers survived initial combat exposure, they could cope from then on. Efficiency did rise initially, but peaked around sixty days, then declined, until, after ninety days of continuous combat, exhaustion resulted. Not surprisingly, as the war dragged on, soldiers deserted in large numbers. In all, about forty thousand GIs deserted during the conflict.

The army policy of permanent classifications encouraged this trend: tired fighting men were not relieved. Theoretically, officers filled out regular efficiency reports on their subordinates, allowing for intervention. But these reports were normally short-answer standardized forms that led to a “tick the box and send it on” mentality. Real problems did not get addressed. Ironically, the Wehrmacht, typically stereotyped as martinets, allowed officers more flexibility in relieving men on the edge. The Wehrmacht also demanded initiative in reading the immediate situation on the ground, whereas the American military relied heavily on top-down management techniques, junior officers being expected to follow orders to the letter. For example, submariners early in the war had to radio for fire permission before launching torpedoes, meaning the target could be lost.

The combat soldier was filthy, diseased, and insect-infested. Army rations soured the stomach. Men soiled themselves in action. Dust in summer, mud in winter, extreme temperatures, and rain and snow were natural enemies. The jungle was a festering opponent; the northern European winter inflicted frostbite. Fighting on coral atolls, men suffered hideous wounds from flying rock shards; in forests, shellfire showered them with sharp wood splinters. Much of the fighting was static, troops inching forward through a blasted landscape that dictated painfully slow forward movement. Men pinned down in action often could not change their clothes or bathe for weeks. Stuck in fox holes, men defecated and urinated where they crouched. Taking off rotted socks tore putrid flesh from feet.

Air and naval personnel could normally bathe and eat decently. But they faced special terrors. Momentary hesitations cost fighter pilots their lives. Flyers burned in the air. Bombers flew at altitudes where it was always winter; if condensation from breath froze in an oxygen mask, one might asphyxiate. Because U.S. bombers raided in daylight, vapor trails exposed their positions to enemy attackers. The bombers flew in tight formations with carefully designed overlapping fields of fire so that planes could protect each other. But amid fighter attacks and air turbulence, planes deviated from flight patterns, alignment was lost, and aircraft fell to friendly fire. The racket of multiple machine guns firing proved almost unbearable for crewmen.

As bombers neared their destinations, they locked on to their targets and could not deviate to avoid the shrapnel, called flak, thrown in their path. A direct hit caused huge damage; gunners were reduced to mush splashed on Plexiglas domes. The badly wounded were strapped to parachutes and jettisoned. Crew members bailing out might be dragged by the slipstream into the props of planes behind, or, if bombers fell quickly, they were trapped inside the fuselage. Unlike infantry, airmen normally served a finite tour. In Europe, for example, a tour was twenty-five combat missions, and many crews did not survive beyond fifteen. Flyers had to live with knowledge of the massive destruction they wreaked, but it was at arm’s length; ground soldiers knew war’s true savagery up close, such as flame throwers that turned men into flaring pillars, burning for hours.

Sailors suffered continual stress, fearing attack from the air and by surface and undersea raiders. Their metal ships both protected and maimed; one man was cut into symmetrical chunks when he was blown through a staircase. Fire spelled terrible death, as did drowning below decks in torpedoed vessels when watertight doors slammed shut. Survivors in the water were sucked down by sinking vessels’ undertow, to be blown back to the surface when engines exploded, entrails floating beside them. Oil slicks seared eyes and lungs and, when aflame, broiled victims. Men in the water could be machine-gunned or eaten by sharks. Those fortunate enough to get into life rafts might be rescued or might die of exposure and dehydration. On convoy escort duty to Russia in the north Atlantic, seamen dreaded the frigid temperatures; fingers froze to guns, and life expectancy in the water was mere minutes. Late in the Pacific war, sailors faced kamikaze attacks. Most missed their targets, but the experience was nerve-racking.

America in Transition

World War II changed America radically but not always in expected or straightforward ways. Rosie the Riveter battling for workplace equality is an incomplete picture. By 1944, nineteen million women had paid jobs. But two-thirds of women did not work outside the home, and polls showed that even though some women resented being forced out of the workplace in 1945–1946, a majority felt going back home was not a hardship. Elite, traditionally male professions such as law, medicine, and senior business and political office remained white men’s preserves. The emergent role of women was as consumers and housewives. The war produced a teen culture, as adolescents took jobs, such as valeting and babysitting, vacated by older siblings in the military or factories. Their discretionary income created a separate consumer culture and, some thought, encouraged juvenile delinquency; social pressure on women to stay home was offered as an antidote. It was not the women of the war generation so much as their daughters who rejected the role of suburban housewife.

The status of minorities remained relatively unchanged. Jim Crow laws predominated, especially in the South. Even in 1952, 87 percent of blacks there had never voted. The best careers remained closed to African Americans. Nevertheless, the war produced a migration of over one million blacks from the rural South and Midwest, seeking better conditions in cities such as Cincinnati, Detroit, Oakland, and Richmond, Virginia. Although migrants faced entrenched racism, coming together produced a determination to fight back in incidents that were precursors to the civil rights battles of the 1960s. Racial clashes included the Harlem Riot of 1943, in which the shooting of a young black soldier by a white police officer led to confrontations in which six blacks were killed and seven hundred injured. Hispanics, welcomed as migrant workers in wartime, still endured conditions of near-slavery. Although Mexican Americans wore the uniform in proportionate numbers, they were widely viewed as unpatriotic. In June 1943, the so-called Zoot Suit Riots exploded in Los Angeles. For almost a week, off-duty white military personnel, mainly sailors, roamed city streets, assaulting Hispanics, who were erroneously believed to be shirkers and whose unconventional dress offended conventional values.

Japanese Americans, most of whom lived on the West Coast, were distrusted by mainstream white culture and actively feared after Pearl Harbor. In an atmosphere of hysteria, Executive Order 9066 was issued on February 19, 1942. It allowed most of the forty-seven thousand Issei (Japanese-born immigrants) and eighty thousand Nisei (their American-born children) to be incarcerated in camps, robbed of their civil and property rights. Homosexuals, by contrast, made limited cultural gains. They found kindred spirits in the military and war industries. Gay bars sprang up in cities, encouraging an alternate culture. Military psychological testing, although remaining hostile to gays, shifted the debate from moral and religious to medical and sociological terms, an advance of sorts.

The greatest domestic impact of the war was to encourage bigness. Corporations thrived on government contracts, accelerating a trend started in the Civil War. Just two months after Pearl Harbor, two hundred thousand small businesses had gone under, unable to compete. Cities mushroomed, and rural America shrank as agribusiness took the land. More people moved during the war than in the westward migrations of the 18th and 19th centuries, encouraging further mobility in American life. The massive shift in population led to overcrowding in the cities where the new jobs lay. Willow Run, outside Detroit, where a new Ford plant was sited, grew from fifteen thousand to forty-seven thousand people almost overnight. Some landlords tried to help new arrivals, and stores were asked to stay open late at least once a week to help working wives. But others exploited newcomers’ needs, charging exorbitant fees for accommodation and services. Day-care services were largely inadequate for mothers of young children. By October 1942, 1.2 million families were doubled up in one-family units.

Big business helped dictate public policy in Washington, forging the government-industrial-military complex. Observers expressed disgust at the influx of business lobbyists seeking special privileges for corporations. Powerful CEOs attacked the nascent social welfare system begun under the New Deal. Faced with wartime caps on salaries, companies offered lavish benefits packages, including pensions and medical insurance. These undercut the developing push for universal social security and healthcare benefits, now deemed communistic, so that many Americans remained outside the social safety net.

Ironically, America at the same time embarked on an enormous piece of social engineering through the 1944 Servicemen’s Readjustment Act, known as the GI Bill. It gave veterans generous unemployment and housing benefits, free schooling, and job training. The bill successfully widened the middle class and prevented a new recession. At the same time, some veterans who most needed help, lower-class blacks and whites who were uneducated, vulnerable to mental health problems, and unable to lead stable lives, could not access its benefits. Some female veterans were denied benefits. Educators worried about a decline in the intellectual quality of universities as veterans demanded applied job training instead of traditional liberal arts courses. Grade inflation rose, as many professors were reluctant to fail students with war records.

This concern with the life of the mind fed a broader fear that veterans were overly conformist, many having come of age in the military and therefore lacking adult experience beyond taking orders. The demobilization of millions of disciplined but quiescent young men made some critics worry about American individuality and initiative. Veterans tended to respect hierarchy and unquestioning patriotism and to resent naysayers and the newly liberated adolescents. Some could not relate to their children except as little soldiers to be bossed about. Coupled with the dominance of hierarchically managed corporations, this urge to fit in created a potent recipe for conformity in the marketplace.

William H. Whyte, an editor at Fortune magazine, railed against what he called the organization man: a team player, comfortable in committee meetings but short on innovative ideas. America, he predicted, would lose its competitive edge in an atmosphere of groupthink. The sociologist David Riesman concurred, arguing that America was becoming a society of the “other-directed,” not following an inner voice but bowing to peer pressure, measured in opinion polls and reflected in popular fashions. They were a “lonely crowd.” Politicians, too, increasingly bowed to majority opinion.

The pressure to conform to hardline, mainstream American values was reinforced by the public’s wartime understanding of the world. Wartime propaganda projected the Soviet people and the Chinese nationalists as earthy democrats, much like Iowans. When, at the end of the war, it became clear that Josef Stalin would not allow democratic governance in the countries that fell under Soviet influence, and when the nationalists lost mainland China to communist forces, Americans reeled in shock. Tentative wartime alliances, relying too heavily on personal relations among FDR, Churchill, and Stalin, quickly eroded into a Cold War. Millions of American veterans returned from abroad, shocked by the destruction brought about by radical social divisions and determined that such divisions would not develop at home. They supported a strong defense and a silencing of opposition that evolved into anticommunist witch hunts. Thus, while America emerged from the war powerful and prosperous, there was also an undercurrent of anxiety that would lead to endless military and quasi-military interventions around the globe. Sometimes the Central Intelligence Agency, successor to the wartime Office of Strategic Services, undertook questionable covert operations that sprang from wartime precedents, such as the assassination of foreign leaders and the overthrow of governments labeled unfriendly to U.S. interests.

The United States justified the type of actions it condemned in others through the theory of American “exceptionalism”: the inherent goodness of America’s intentions and its self-evident global mission to preserve democracy made its wrongdoing necessary and therefore innocent. The sense of American special providence was present from the country’s Puritan beginnings, but World War II vastly enhanced it. The United States had indeed led the world in saving democracy, but the struggle was too often depicted as one between pure good and total evil. Given the immense atrocities perpetrated by the Axis powers, this oversimplification was understandable. Yet it remained a distortion, leading Americans to assume license to do as they wished in the world.

The American Myth of the “Good War”

The magnitude of U.S. victory in World War II produced a growing legend written in superlatives about the “Good War.” The myth began during the conflict, when U.S. propaganda portrayed the war as one between pure good and incarnate evil, and peaked in the 1990s. The story went that Americans alone saved the world for democracy; their war machine was the greatest ever; the people were all united; the typical GI was a happy warrior. Some veterans challenged this picture, saying the United States was guilty of savagery and blunders, and combat soldiers hated the glamorization of their predicament. But criticism was increasingly ignored.

After two smaller, indecisive wars in Korea and Vietnam, the 1940s appeared in retrospect as a golden age. America’s loss of global political prestige and economic power from the 1960s onward, as it went from leading creditor to debtor nation, reinforced the nostalgia. A prominent purveyor of Good War mythology was the popular historian Stephen E. Ambrose. His books increasingly simplified reality, emphasizing D-Day as the pivotal moment of the war, slighting other crucial events such as the German defeats in the Battle of Britain and at Stalingrad, and necessarily underplaying the contributions of America’s allies, including the British Empire, China, the Soviet Union, and the many freedom fighters against the Nazis. American fighting men, in Ambrose’s picture, were a transcendent band of brothers. TV journalist Tom Brokaw elevated the vision into ancestor worship when in 1998 he called this the greatest generation in human history.

The myth declined after 9/11. Although George W. Bush’s administration called 9/11 a second Pearl Harbor and launched conventional attacks on Afghanistan and Iraq, seeking a renewed Good War, it misread history. While World War II was fought against conventional militaries, 9/11 was perpetrated by outlaws originating mainly in Saudi Arabia, a major U.S. ally. A more credible strategy than refighting the Good War might have been to hunt down al-Qaeda in an international police action; certainly, Vietnam offered a more appropriate precedent than World War II. Administration predictions that coalition troops would be greeted as liberators, as in Axis-occupied countries in 1944, proved misguided: the United States was now the occupier. And the analogy ignored that even in 1944 the nation was widely resented for the brute-force character of its advance.

Despite copious Good War analogies, the 21st-century military does not mirror the citizen armies of the 1940s. The U.S. military is entirely professional and increasingly supplemented by highly paid mercenaries. The privileged do not serve proportionately. Foreign deployments do not touch the lives of most citizens as they did in World War II. And rather than levying a supertax to pay for the war, the government cut taxes as America began the war on terror, in direct contrast to the higher taxes imposed in World War II. The public debt has soared, exploding another Good War myth born of the unique circumstances of the 1940s, that wars are always beneficial for the economy.

A destructive aspect of mythologizing World War II is the so-called Munich analogy. Building on criticism of France and Britain’s failure to confront Hitler firmly at Munich in 1938, Americans for decades have argued that “appeasement” never works: you must always confront hostile nations with force. This attitude downgrades the value of diplomacy. That Munich teaches such a straightforward moral is debatable. If there is a clear lesson from 1938, it might be about the need for allies in a diplomatic crisis: the refusal of the democracies and the Soviets to collaborate sealed Czechoslovakia’s fate.

Discussion of the Literature

Early literature followed the major campaigns and political decisions, along with biographies of the major military and political players. Some writing was partisan, but there is now consensus on the broad narrative of U.S. military campaigns. Conspiracy theories accusing FDR of manipulating Japanese hostility as a “back door to war” have lost credibility. The “Europe first” policy no longer excites controversy, particularly as the United States still fielded the dominant forces in the Pacific. The two-pronged assault on Japan is endorsed by most historians. Perhaps only two strategic issues still provoke argument. First, the ploy of attacking Italy to draw German forces away from other fronts worked, but many Allied soldiers there felt they were sacrificed, being denied adequate military resources.

Second, the increasing destructiveness of the Allied air war remains contentious. Focusing on Dresden, war veteran Kurt Vonnegut Jr. graphically exposed the devastation on the ground in his 1969 novel Slaughterhouse-Five. Some critics argue that targeting civilians generally backfired, as production and morale could rise under attack. But defenders of area bombing contend that Axis output was slowed and the will to carry on weakened. Regarding the atomic bombs, historians remain divided over whether Truman had any realistic political choice but to use them.

While the overall efficiency of the U.S. military machine is agreed upon, specifics remain open to debate. Some contemporaries and later historians have criticized a seeming organizational rigidity that led to unwieldiness in action; statistically, the more tactically flexible Germans outfought the Americans in Normandy, inflicting proportionally greater casualties. Certain aspects of military training have also been questioned. Although the figure has been challenged, field studies suggested that as few as 25 percent of riflemen fired their weapons in combat. This, combined with the inferiority of U.S. armor, led to the philosophy of “overwhelming firepower” to save friendly lives. An abundance of projectiles was delivered by aircraft, artillery, and heavy weapons units, clearing the ground ahead of the infantry but also killing civilians and ruining infrastructure.

“War psychosis,” or rising rage at the enemy as conflict deepens, intensified the brute-force quality of fighting on all sides as hostilities persisted, eroding norms of behavior. Pioneering work has been done on the abuse of civilians by American forces, but more investigation is required, particularly regarding sexual assault. Research is made difficult because the military threw a blanket of silence over the incidence of rape. Treatment of minorities in uniform, including women and homosexuals, also needs further examination. The occurrence and consequences of homosexuality in the higher military ranks also remain unexplored.

Regarding the home front, a consensus has emerged that, despite a significant minority of women wishing to remain in the workforce and resenting barriers to access and promotion, a majority were prepared to be homemakers and consumers, rather than producers, once the war ended. Many male veterans shared the dream of suburban living in one-family homes, unburdened by the multigenerational households of the Depression. The middle class was broadened, but some critics feared this influx of the “newly arrived” promoted the spread of crude material values and an intolerance of intellectual diversity.

More recent scholarship suggests that it was in the 1960s and 1970s that the daughters of 1940s women rebelled against the suburban housewife model in a renewed feminist movement.

The sons of World War II veterans, many of whom rejected their fathers’ hawkish patriotism, opposed the Vietnam War; recent scholarship suggests a serious generational divide, with sons resenting what they saw as emotional bullying. Ethnic minorities sought equality in the 1960s, swelling the ranks of the civil rights movement, but historians note that their resistance to discrimination actually began during World War II, with sit-ins and threatened marches on Washington. Finally, gays in the population at large found that the war enhanced their opportunities for forming pocket communities of like-minded people. Scholarship on homosexuality, along with the rise of teen culture in the 1940s, is still at a relatively early stage of development.

The immensity of American victory in World War II led inevitably to the mythologizing of the conflict as the “Good War.” To fit the upbeat picture, the torture and death of Holocaust victims had to be refigured as an ultimately inspiring and uplifting tale of human endurance and transcendence. The mythology was challenged immediately by some veterans and social commentators. But authors increasingly embellished the conflict as the Good War, lauding World War II Americans as the generation that almost single-handedly saved the world for democracy and proclaiming D-Day the most important event of the war, even of the 20th century. The myth peaked around the start of the 21st century and declined after 9/11 when the United States entered a very different period of conflict, the endless low-grade fighting of the war on terror. Remaining veterans and younger scholars increasingly challenge the romantic picture of World War II.

Primary Sources

Indispensable document collections include the National Archives’ World War II records and the Library of Congress Military History Centers. Also see the Smithsonian Archives and Libraries, along with the United States Holocaust Memorial Museum. Important oral histories include Studs Terkel’s The Good War and James Jones’s WW II. The sociologist Samuel A. Stouffer headed a Pentagon research team that surveyed over five hundred thousand soldiers to determine their attitudes on wide-ranging subjects.1 Memoirs of military service constitute an excellent primary source. Examples include, for Europe, William R. Cubbins, The War of the Cottontails; and Audie Murphy, To Hell and Back. On the Pacific, good instances are William Manchester, Goodbye Darkness; and E. B. Sledge, With the Old Breed.2

Books by distinguished reporters include Frank Gervasi, The Violent Decade; Eric Sevareid, Not So Wild a Dream; John Steinbeck, Once There Was a War; and George Weller, Weller’s War.3 J. Glenn Gray, The Warriors; and Paul Fussell, Wartime, represent a distinct genre of provocative retrospection. Novels by veterans captured war’s nuance. Famous examples include James Jones, The Thin Red Line; and Norman Mailer, The Naked and the Dead.4 Sociological analysis by contemporaries highlighted cultural changes brought by war. Two seminal works are David Riesman, The Lonely Crowd; and William H. Whyte, The Organization Man.5

Links to Digital Materials

A thirty-seven-page compendium of suggested further reading to accompany Michael C. C. Adams, The Best War Ever, is posted on the JHUP website.

Military veteran Nate Pelletier has examined why World War II soldiers seemingly suffered lower rates of post-traumatic stress disorder than later veterans. He points out that only about 15 percent of World War II soldiers experienced combat; by Vietnam, the figure had risen to 50 percent; and by the 21st century, above 70 percent of soldiers experienced the 360-degree, 24/7 nature of contemporary combat.

Further Reading

Adams, Michael C. C. The Best War Ever: America and World War II. 2d ed. Baltimore: Johns Hopkins University Press, 2015.

Notes:

(1.) Studs Terkel, The Good War: An Oral History of World War Two (New York: Ballantine, 1985); James Jones, WW II (New York: Ballantine, 1976); and Samuel A. Stouffer et al., Studies in Social Psychology in World War II: The American Soldier, 4 vols. (Princeton, NJ: Princeton University Press, 1949).

(2.) William R. Cubbins, The War of the Cottontails: A Bomber Pilot with the Fifteenth Air Force (Chapel Hill, NC: Algonquin, 1989); Audie Murphy, To Hell and Back (Blue Ridge Summit, PA: TAB Military Classics, 1988); William Manchester, Goodbye Darkness: A Memoir of the Pacific War (New York: Little, Brown, 1980); and E. B. Sledge, With the Old Breed: At Peleliu and Okinawa (New York: Presidio, 1981).

(3.) Frank Gervasi, The Violent Decade: A Foreign Correspondent in Europe, Asia, and the Middle East, 1939–1945 (New York: Norton, 1989); Eric Sevareid, Not So Wild a Dream (New York: Atheneum, 1975); John Steinbeck, Once There Was a War (London: Mondain, 1990); and George Weller, Weller’s War: A Legendary Foreign Correspondent’s Saga of World War II on Five Continents (New York: Crown, 2009).

(4.) J. Glenn Gray, The Warriors: Reflections on Men in Battle (Lincoln: University of Nebraska Press, 1970); Paul Fussell, Wartime: Understanding and Behavior in the Second World War (New York: Oxford University Press, 1989); James Jones, The Thin Red Line (New York: Scribner’s, 1962); and Norman Mailer, The Naked and the Dead (New York: Henry Holt, 1948).

(5.) David Riesman, The Lonely Crowd: A Study of Changing American Character (New Haven, CT: Yale University Press, 1961); and William H. Whyte, The Organization Man (New York: Simon & Schuster, 1956).
