Joshua Gleich

Over the past seventy years, the American film industry has transformed from mass-producing movies to producing a limited number of massive blockbuster movies on a global scale. Hollywood film studios have moved from independent companies to divisions of media conglomerates. Theatrical attendance for American audiences has plummeted since the mid-1940s; nonetheless, American films have never been more profitable. In 1945, American films could only be viewed in theaters; now they are available in myriad forms of home viewing. Throughout, Hollywood has continued to dominate global cinema, although film and now video production reaches Americans in many other forms, from home videos to educational films.

Amid declining attendance, the Supreme Court in 1948 forced the major studios to sell off their theaters. Hollywood studios instead focused their power on distribution, limiting the supply of films and focusing on expensive productions to sell on an individual basis to theaters. Growing production costs and changing audiences caused wild fluctuations in profits, leading to an industry-wide recession in the late 1960s. The studios emerged under new corporate ownership and honed their blockbuster strategy, releasing “high concept” films widely on the heels of television marketing campaigns. New technologies such as cable and VCRs offered new windows for Hollywood movies beyond theatrical release, reducing the risks of blockbuster production. Deregulation through the 1980s and 1990s allowed for the “Big Six” media conglomerates to join film, theaters, networks, publishing, and other related media outlets under one corporate umbrella. This has expanded the scale and stability of Hollywood revenue while reducing the number and diversity of Hollywood films, as conglomerates focus on film franchises that can thrive on various digital media. Technological change has also lowered the cost of non-Hollywood films and thus encouraged a range of alternative forms of filmmaking, distribution, and exhibition.

Jeffrey Helgeson

Early 20th century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality.

Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture.

The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability.

Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.

David Torstensson

On January 5, 2014—the fiftieth anniversary of President Lyndon Johnson’s launch of the War on Poverty—the New York Times asked a panel of opinion leaders a simple question: “Does the U.S. Need Another War on Poverty?” While the answers varied, all the invited debaters accepted the martial premise of the question—that a war on poverty had been fought and that eliminating poverty was, without a doubt, a “fight,” or a “battle.”

Yet the debate over the manner—martial or not—by which the federal government and public policy has dealt with the issue of poverty in the United States is still very much an open-ended one.

The evolution and development of the postwar American welfare state is a story not only of a number of “wars,” or individual political initiatives, against poverty, but also about the growth of institutions within and outside government that seek to address, alleviate, and eliminate poverty and its concomitant social ills. It is a complex and at times messy story, interwoven with the wider historical trajectory of this period: civil rights, the rise and fall of a “Cold War consensus,” the emergence of a counterculture, the Vietnam War, the credibility gap, the rise of conservatism, the end of “welfare,” and the emergence of compassionate conservatism. Mirroring the broader organization of the American political system, with a relatively weak center of power and delegated authority and decision-making in fifty states, the welfare model has developed and grown over decades. Policies viewed in one era as unmitigated failures have instead over time evolved and become part of the fabric of the welfare state.

Ana Elizabeth Rosas

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. Please check back later for the full article.

On August 4, 1942, the Mexican and U.S. governments launched the binational guest worker program most commonly known as the Bracero Program. An estimated five million Mexican men between the ages of 19 and 45 separated from their families for three-to-nine-month contract cycles at a time, in anticipation of earning the prevailing U.S. wage this program had promised them. They labored in U.S. agriculture, railroad construction, and forestry, with hardly any employment protections or rights in place to support themselves and the families they had left behind in Mexico. The inhumane configuration and implementation of the program prevented most of these men and their families from attaining the earnings it had promised. Instead, the labor exploitation and alienation that characterized this guest worker program paved the way for, at best, fragile family relationships. The program lasted twenty-two years and grew in scope despite its negative consequences; Mexican men and their families could not afford to remain unemployed in Mexico, nor could they pass up U.S. employment opportunities of any sort. The Mexican and U.S. governments’ persistently negligent management of the Bracero Program, coupled with their conveniently selective acknowledgment of the severity of the plight of Mexican women and men, consistently forced Mexican men and their families to shoulder the full extent of the program’s exploitative conditions and terms.

Matt Garcia

In September 1962, the National Farm Workers Association (NFWA) held its first convention in Fresno, California, initiating a multiracial movement that would result in the creation of the United Farm Workers (UFW) and the first contracts for farm workers in the state of California. Led by Cesar Chavez, the union contributed a number of innovations to the art of social protest, including the most successful consumer boycott in the history of the United States. Chavez welcomed contributions from numerous ethnic and racial groups, men and women, young and old. For a time, the UFW was the realization of Martin Luther King Jr.’s beloved community—people from different backgrounds coming together to create a socially just world. During the 1970s, Chavez struggled to maintain the momentum created by the boycott as the state of California became more involved in adjudicating labor disputes under the California Agricultural Labor Relations Act (ALRA). Although Chavez and the UFW ultimately failed to establish a permanent, national union, their successes and strategies continue to influence movements for farm worker justice today.

Peter Cole

The history of dockworkers in America is as fascinating and important as it is unfamiliar. Those who worked along the shore loading and unloading ships played an invaluable role in an industry central to both the U.S. and global economies as well as the making of the nation. For centuries, their work remained largely the same, involving brute manual labor in gangs; starting in the 1960s, however, their work was entirely remade due to technological transformation. Dockworkers possess a long history of militancy, resulting in dramatic improvements in their economic and workplace conditions. Today, nearly all are unionists, but dockworkers in ports along the Atlantic and Gulf coasts belong to the International Longshoremen’s Association (ILA), while the International Longshore and Warehouse Union (ILWU) represents them in Pacific Coast ports as well as in Hawaii and Alaska (along with British Columbia and Panama). In the mid-1930s, the ILA and ILWU became bitter rivals and remain so. This feud, which has cooled slightly since its outset, can be explained by differences in leadership, ideology, and tactics, with the ILA more craft-based, “patriotic,” and mainstream and the ILWU quite left wing, especially during its first few decades, and committed to fighting for racial equality. The existence of two unions complicates this story; in most countries, dockworkers belong to a single union. Similarly, America’s massive economy and physical size mean that there are literally dozens of ports (again, unlike many other countries), making generalizations harder. Unfortunately, popular culture depictions of dockworkers inculcate unfair and incorrect notions that all dockworkers are involved with organized crime.
Nevertheless, due to decades of militancy, strikes, and unionism, dockworkers in 21st-century America are—while far fewer in number—very well paid and still do important work, literally making world trade possible in an era when 90 percent of goods move by ship for at least part of their journey to market.

Vanessa May

Domestic work was, until 1940, the largest category of women’s paid labor. Despite the number of women who performed domestic labor for pay, the wages and working conditions were often poor. Workers labored long hours for low pay and were largely left out of state labor regulations. The association of domestic work with women’s traditional household labor, defined as a “labor of love” rather than as real work, and its centrality to southern slavery, have contributed to its low status. As a result, domestic work has long been structured by class, racial, and gendered hierarchies. Nevertheless, domestic workers have time and again done their best to resist these conditions. Although traditional collective bargaining techniques did not always translate to the domestic labor market, workers found various collective and individual methods to insist on higher wages and demand occupational respect, ranging from quitting to “pan-toting” to forming unions.

Chad Pearson

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. Please check back later for the full article.

Employers began organizing to reduce the power of organized labor in the late 19th and early 20th centuries. Irritated by strikes, boycotts, and the unions’ desire to achieve exclusive bargaining rights, employers demanded the right to establish open shops—workplaces that promoted individualism over collectivism. Rather than recognize closed or union shops, employers demanded the right to hire and fire whomever they wanted, irrespective of union status. They established an open-shop movement, which was led by local, national, and trade-based employers. Some formed more inclusive “citizens’ associations,” which included clergymen, lawyers, judges, academics, and employers. Throughout the first three decades of the 20th century, this movement succeeded in busting unions, breaking strikes, and blacklisting labor activists. It united large numbers of employers and was largely successful. The movement faced its biggest challenges in the 1930s, when a liberal political climate legitimized unions and collective bargaining. But employers never stopped organizing and fighting, and they continued to undermine the labor movement in the following decades by invoking the phrase “right to work,” insisting that individual laborers must enjoy freedom from the so-called union bosses and compulsory unionism. Numerous states, responding to pressure from organized employers, began passing “right to work” laws, which made union organizing more difficult as workers were not obligated to join unions or pay their “fair share” of dues. The multi-decade employer-led anti-union movement succeeded in fighting organized labor at the point of production, in politics, and in public relations.

Landon R. Y. Storrs

The second Red Scare refers to the fear of communism that permeated American politics, culture, and society from the late 1940s through the 1950s, during the opening phases of the Cold War with the Soviet Union. This episode of political repression lasted longer and was more pervasive than the Red Scare that followed the Bolshevik Revolution and World War I. Popularly known as “McCarthyism” after Senator Joseph McCarthy (R-Wisconsin), who made himself famous in 1950 by claiming that large numbers of Communists had infiltrated the U.S. State Department, the second Red Scare predated and outlasted McCarthy, and its machinery far exceeded the reach of a single maverick politician. Nonetheless, “McCarthyism” became the label for the tactic of undermining political opponents by making unsubstantiated attacks on their loyalty to the United States.

The initial infrastructure for waging war on domestic communism was built during the first Red Scare, with the creation of an antiradicalism division within the Federal Bureau of Investigation (FBI) and the emergence of a network of private “patriotic” organizations. With capitalism’s crisis during the Great Depression, the Communist Party grew in numbers and influence, and President Franklin D. Roosevelt’s New Deal program expanded the federal government’s role in providing economic security. The anticommunist network expanded as well, most notably with the 1938 formation of the Special House Committee to Investigate Un-American Activities, which in 1945 became the permanent House Un-American Activities Committee (HUAC). Other key congressional investigation committees were the Senate Internal Security Subcommittee and McCarthy’s Permanent Subcommittee on Investigations. Members of these committees and their staff cooperated with the FBI to identify and pursue alleged subversives. The federal employee loyalty program, formalized in 1947 by President Harry Truman in response to right-wing allegations that his administration harbored Communist spies, soon was imitated by local and state governments as well as private employers. As the Soviets’ development of nuclear capability, a series of espionage cases, and the Korean War enhanced the credibility of anticommunists, the Red Scare metastasized from the arena of government employment into labor unions, higher education, the professions, the media, and party politics at all levels. The second Red Scare did not involve pogroms or gulags, but the fear of unemployment was a powerful tool for stifling criticism of the status quo, whether in economic policy or social relations. Ostensibly seeking to protect democracy by eliminating communism from American life, anticommunist crusaders ironically undermined democracy by suppressing the expression of dissent. 
Debates over the second Red Scare remain lively because they resonate with ongoing struggles to reconcile Americans’ desires for security and liberty.

Ramón A. Gutiérrez

This is an advance summary of a forthcoming article in the Oxford Research Encyclopedia of American History. Please check back later for the full article.

Mexican immigration to the United States is a topic of particular interest at this moment for a number of political reasons. First, and probably foremost, Mexicans are currently the single largest group of foreign-born residents in the country. In 2013, the United States counted 41.3 million individuals of foreign birth; 28 percent, or 11.6 million, were Mexican. If census data are aggregated more broadly, adding together the foreign-born and persons of Mexican ancestry who are citizens, the number totaled 31.8 million in 2010, or roughly 10 percent of the country’s total population of 308.7 million. What has nativists and those eager to restrict immigration particularly concerned is that the Mexican-origin population has been growing rapidly, by 54 percent between the 2000 and 2010 censuses, from 20.6 million to 31.8 million persons. This pace of growth has slowed, but not enough to calm the racial and xenophobic anxieties of a citizenry fearful of foreigners and terrorists.

Mexican immigration to the United States officially began in 1846 and has continued into the present without any significant period of interruption, making it quite distinct. The immigration histories of national groups that originated in Asia, Africa, and Europe are much more varied in trajectory and timing. They usually began with massive movements, driven by famine, political strife, or burgeoning economic opportunities in the United States, and then slowed, tapered off, or ended abruptly, as was the case with Chinese immigration, which surged after 1850 and was then cut off by exclusion laws. This fact helps explain why Mexico has been the single largest source of immigrants in the United States for the longest period of time.

The geographic proximity between the two countries, compounded by profound economic disparities, has continuously attracted Mexican immigrants, facilitated by a border that is rather porous and that was poorly patrolled for much of the 20th century. The United States and Mexico are divided by a border that begins at the Pacific Ocean, at the twin cities of San Diego, California and Tijuana, Baja California. The border moves eastward until it reaches the Rio Grande at El Paso, Texas and Ciudad Juárez, Chihuahua. From there the border follows the river’s flow in a southeastern direction, until the river empties into the Gulf of Mexico, where Brownsville, Texas and Matamoros, Tamaulipas sit. This expanse of over 1,945 miles is poorly marked. In many places, only old concrete markers, sagging, dry-rotted fence posts with rusted barbed wire, and a river that has continually changed its course mark the separation between these two sovereign national spaces.

Since 1924, when the U.S. Border Patrol was created mainly to prohibit the unauthorized entry of Chinese immigrants, not Mexicans, American attempts to regulate entries and exits effectively have been concentrated only along known, highly trafficked routes that lead north. The inability of the United States to patrol the entire length of its border with Mexico has meant that any Mexican eager to work or live in the United States has rarely found the border an insurmountable obstacle; when it has proved temporarily so, migrants have simply hired expensive professional smugglers (known as coyotes) to maximize safe passage into the United States without border inspection or official authorization. In 2014, there were approximately 11.3 million such unauthorized immigrants in the United States; 49 percent, or 5.6 million, were Mexican.

Over the long course of history Mexican immigration is best characterized as the movement of unskilled workers toiling in agriculture, railroad construction, and mineral extraction; for the last two decades, they have worked in construction and service industries as well. This labor migration has evolved through five distinct phases, each marked by its own logic, demands, and governance.

PRINTED FROM the OXFORD RESEARCH ENCYCLOPEDIA, AMERICAN HISTORY (americanhistory.oxfordre.com). (c) Oxford University Press USA, 2016. All Rights Reserved. Personal use only; commercial use is strictly prohibited. Please see applicable Privacy Policy and Legal Notice (for details see Privacy Policy).