50 ways the workforce has changed in 50 years

Few things remain the same after the passage of half a century, and the American labor force is no exception. Between 1969 and the present day, nearly every aspect of the country's workforce has changed. The demographic makeup of who goes to work is radically different from what it was 50 years ago, as is the type of work individuals do, how they do it, how they're paid, and even how they save for retirement. Entire fields like computer programming and alternative energy were all but unimaginable half a century ago. From the rise of computers and smartphones to the demise of manufacturing, the American workforce continues to adapt, evolve, and grow, grow, grow. With each passing year, American labor looks a little bit different than it did the year before. Here's a look at the most dramatic changes that have shaken, altered, and transformed America's workforce from the space age to the digital age.

The single biggest change to the U.S. labor force over the last half a century might just be the sheer number of workers that joined it. What the Bureau of Labor Statistics classifies as the "civilian, noninstitutional" population ages 16 and up has skyrocketed from a little more than 137 million in 1970 to about 253 million today, due mostly to general population growth.

Fewer than 70% of 25- to 34-year-olds were active in the workforce in 1970. Today, that number is nearly 88%, with millennials taking on a wide array of professions across diverse industries. Demand for younger workers has increased too, as more jobs require specialized training in computers and coding or fluency in social media. And while globalization reduces demand for manufacturing jobs, the aging of the baby boomers means more demand for prime-aged workers in the health care industry.

Another big reason for the increase in prime-aged workers is a massive rush of women into the workforce, particularly younger women who for the first time in history began to delay marriage and childbirth or opt out altogether. In 1970, just 45% of women in that age group were actively working. Today, that percentage has jumped to 83%.

Today, a difference of only about 10 percentage points separates men's and women's workforce participation. In 1970, the gap was much wider at 36.3 percentage points. The Equal Pay Act, passed back in 1963, requires equal pay and working conditions for men and women doing the same work, and wage gaps have been closing as well—although that gap isn't expected to close completely for almost another half century.

The Bureau of Labor Statistics has developed an "economic dependency ratio," which measures the number of people in the workforce against the number of people, including children, who don't work and therefore must be provided for. In 1970, it was 140.4, meaning that for every 100 people working, about 140 were not. Today, that number has dropped to 97.4, reflecting a steep drop in the number of children the average family has raised over that time period.
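
The ratio itself is simple arithmetic, as this quick sketch shows (the population figures below are illustrative only, not BLS data):

```python
def economic_dependency_ratio(nonworkers: float, workers: float) -> float:
    """Number of nonworking people (including children) per 100 workers."""
    return nonworkers * 100 / workers

# A population with 70 million nonworkers and 50 million workers has a
# ratio of 140: for every 100 people at work, 140 are being supported.
print(economic_dependency_ratio(70, 50))  # 140.0
```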

Although U.S. manufacturing output has actually increased, the number of manufacturing jobs has declined dramatically in recent decades. These jobs peaked at 19.4 million in 1979. By 2010, that number was down to 11.5 million despite decades of steep population growth. Manufacturing payrolls recovered a bit in the 2010s, but manufacturing employment is still near its lowest level since before America entered World War II.

Instead of slowing down, one industry in the manufacturing sector has enjoyed a meteoric rise in recent decades. Computer and electronics manufacturing grew by 2,607% between 1987 and 2017, demonstrating how advances in technology consistently mold industry and force workers to evolve with the changing times.

Manufacturing jobs have declined by about one-third since 1990, leading to a fairly predictable shift in in-demand job skills. As manufacturing jobs have dried up and automation has skyrocketed with AI and programmable machinery, demand for manual and physical skills like welding and carpentry has shrunk.

While manual and physical skills are no longer as marketable, a premium has been put on workers with social, personal, writing, and analytical skills. That's in part because the number of service-oriented jobs requiring extensive knowledge, experience, education, and training has more than doubled.

Roughly 50 million workers in 1980 held jobs that required an above-average level of job preparation. By 2015, those whose occupations required above-average education, training, or experience had grown to 83 million—an increase of nearly 70%. A wider range of industries means more specialized training and more options for America's diverse workforce population.

Employment in educational services has skyrocketed since 1990. Over the same period in which one in three manufacturing jobs disappeared, educational services jobs grew by 105%—more than any other industry.

The health care and social assistance fields are not far behind educational services in terms of occupational growth. The number of people employed in that industry grew by 99% between 1990 and 2015. Industries with the most explosive job growth are also more likely to be creating jobs that require additional training or higher levels of skills.

Rounding out the top three fields that added the most jobs between 1990 and 2015 is professional and business services, which grew by 81% during that time. Educational services, health care and social assistance, and professional and business services combined to account for 20 million of the 32 million new jobs added during that quarter century of change.

The last three decades have witnessed a direct correlation between higher wages and jobs that require advanced social and analytical skills. Women are now more likely than men to be employed in those occupations.

Since women are now slightly more likely to work in higher-paying industries that focus on in-demand analytical and social skills, and are far less likely to work in lower-paying industries that rely on less-coveted physical skills, the gender-based wage gap has shrunk. In 1980, women earned an average of 60 cents for every dollar men earned. By 2015, that number had grown to 80 cents, cutting the wage gap in half. Progress isn't perfect, however: That gap isn't expected to close completely until 2059.
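
That cents-on-the-dollar figure is just a ratio of medians, so the halving of the gap can be checked directly; the dollar amounts below are hypothetical, chosen only to reproduce the article's ratios:

```python
def cents_per_dollar(women_median: float, men_median: float) -> float:
    """Women's median earnings expressed as cents per dollar earned by men."""
    return women_median * 100 / men_median

# Hypothetical weekly medians that reproduce the 1980 and 2015 ratios.
gap_1980 = 100 - cents_per_dollar(300, 500)  # 60 cents on the dollar: 40-cent gap
gap_2015 = 100 - cents_per_dollar(720, 900)  # 80 cents on the dollar: 20-cent gap
assert gap_2015 == gap_1980 / 2              # the gap was cut in half
```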

High-level positions in fields favoring analytical and social skills over physical skills now regularly go to people with college degrees. While college grads represent just 36% of the total workforce, they now claim more than half of the higher-paying jobs that focus on in-demand analytical and social skills. Those with a high school education or less, or only some college, hold all but 14% of the lower-paying occupations that center on physical skills.

In 1980, a little less than 60% of 16- to 24-year-olds were working. After several ensuing dips and peaks, the percentage of young people in the workforce has dropped significantly. By 2015, less than half of that demographic was working, a decline attributable largely to higher college enrollment rates and to many young adults going directly from college to master's programs.

More seniors today are choosing to stay in the workforce longer. Forty years ago, just 12% of adults 65 and older were working. By 2015 it was about one in five, suggesting better health for many seniors—and a more difficult time saving for retirement.

One of the biggest trends of the last half a century is not what has happened, but what hasn't. When adjusted for inflation, wages have remained virtually the same since 1980—but those numbers represent the overall average. Parsing things more finely, the median wage for women has risen by nearly one-third, while the median wage for men has actually fallen.

In the nearly two decades between 1996 and 2014, the percentage of people who remained with the same company for five years or more crossed the 50% mark, rising from 46% in 1996 to 51% in 2014. That dynamic is attributed in large part to the increasing percentage of older Americans in the workforce.

In the more than three decades between 1980 and the rollout of Obamacare in 2013, participation in employer-sponsored health insurance dropped significantly. In 1980, 77% of workers were covered by their employers; by 2013, that number had fallen to 69%. While employers with 50 or more full-time employees are now required to provide health insurance, many companies have found workarounds by relying on contractors, freelancers, and part-time employees.

Just as with health insurance, the American workforce has also experienced a steep decline in retirement benefits, though that trend is a much more recent phenomenon. Almost exactly half the workforce had an employer-sponsored retirement plan in 1980. By 2001, participation had climbed to 57%, but that number dropped all the way down to 45% by 2015.

From Uber to online freelance work, the last decade has witnessed the rise of what the Pew Research Center calls "alternative working arrangements." In 1995, just 10% of the American workforce earned a living by providing their products or services piecemeal on a contract, non-employee basis. By 2015, that number had jumped to 15.8%.

The last few years have witnessed a meteoric rise of contract labor—an astonishing one in three people earn money freelancing either full-time or on a supplementary basis. The move toward contract work, which is one of the most significant shifts in modern American labor history, is partly because of the educational gap, partly because of changing attitudes toward work in general, partly because employers can save money that way, and partly because new technologies have made remote freelancing plausible, accessible, and cheap.

For generations, traditional pensions guaranteed retirees money to live on once their earning years had passed. Pensions, which came with predictable fixed payments, are now nearing extinction. Just 13% of private-sector workers have a traditional pension today compared to 38% in 1979.

The declining pension is due in part to something else that emerged in 1978 and changed the course of labor history. That year, an obscure provision in a tax code update, Section 401(k), paved the way for tax-deferred, employer-sponsored retirement vehicles that enabled workers to directly invest in the stock market as individuals. But unlike pensions, these retirement plans exposed the average employee to the whims of the global markets.

By 1990, 19 million Americans had $384 billion invested in 401(k)s, a number that would grow to a whopping $4.8 trillion today. Along the way, several changes were made to allow people to invest more and make it easier for companies to auto-enroll their employees, rendering the pension all but a memory and establishing the risky and unpredictable 401(k) as the unofficial retirement vehicle for the American labor force.

By 2017, more than 17% of American jobs were held by foreign-born workers—about 27.4 million people. Nearly half, about 48%, are Hispanic, and another quarter are Asian.

The rise of the immigrant worker—who earns a median weekly salary of $730 compared to $885 for native-born workers—created a peculiar and relatively new category of work sometimes called "jobs Americans don't want." These occupations are usually low-skill, low-pay, and often dangerous jobs such as taxi drivers, maids, slaughterhouse workers, janitors, porters, and laborers.

Although the first temp agency emerged in the 1940s, the trend towards temporary employees began in earnest about 50 years ago in the late 1960s and early 1970s. Staffing agencies then began a model that essentially allowed companies to rent employees from them instead of hiring workers in-house, a sort of run-up to the gig economy. Today, there are about 20,000 temp agencies in the U.S. with 39,000 offices that place more than three million workers per week.

If you spent the last 50 years working in American offices, you witnessed the single-greatest transformative event in the history of white-collar work: the computer revolution, which started with the humble word processor. The device's emergence in the 1970s allowed for in-text edits and much faster, much more accurate document production. Originally pitched as a feminist tool that could empower overworked secretaries, the word processor immediately made the typewriter feel primitive, but it, too, would soon become a relic.

Apple unveiled the Apple II computer in 1977 and four years later, IBM rolled out the PC. By the early- to mid-1980s, virtually every employee in nearly every office in America was familiar with computers. Countless man-hours were saved by having computers shoulder traditional office grunt work. Soon, entire filing cabinets would be stored on floppy disks, PowerPoint would change the conference room presentation, and the office IT pro would become the guardian angel of frustrated computer users everywhere.

The role of secretary—or, as it was later known, office support or administrative assistant—was long dominated by women who could parlay the position into a middle-class life, even if they never went to college. The position has fallen victim to a combination of corporate downsizing, technology, a changing workplace, and heightened educational requirements. Women lost 1.6 million administrative support jobs between 2007 and 2013, nearly double the 865,000 they lost in the seven years running up to the recession.

Unions have long been an empowering force for the American worker, who relied on unions to negotiate wages, lobby the government for protections, and guard them against corporate excess. However, a changing job market, as well as so-called right-to-work laws and detrimental Supreme Court decisions, have led to a steep decline in union membership. In 1983, 20% of American workers claimed union membership. That number dropped to 10.7% by 2017.

Title VII of the 1964 Civil Rights Act for the first time offered federal protections against workplace discrimination based on race, color, religion, sex, and national origin. In 1980, the Equal Employment Opportunity Commission determined that sexual harassment is a form of sex discrimination and is therefore covered by Title VII, a position the Supreme Court upheld in 1986 when it ruled that harassment severe enough to create a hostile work environment is discriminatory.

The 1991 Civil Rights Act amended Title VII to add more protections and gave victims of workplace harassment the unprecedented right to have their cases heard by a jury. It also gave victims the right to seek punitive damages.

Just 12.6% of the workforce in 1992 had not earned a high school diploma. While that number might seem low, there were more workers without a diploma at that time than there were workers with advanced degrees. Those demographics flipped over the last quarter century, and now those with less than a high school diploma are the smallest segment of the workforce—just 7.7% of workers—smaller even than the share holding advanced degrees.

Just 17.2% of the workforce had earned a bachelor's degree in 1992, making college grads the third-biggest chunk of the labor pool behind those with some college education or just a high school diploma. By 2016, nearly one worker in four was a college graduate with a bachelor's degree, and while that demographic remains the third-largest group of workers, the gap has slimmed to almost nothing and college grads are poised to move into the No. 1 spot.

Over the last 40 years, Americans have been working slightly longer hours over notably more weeks a year. The average worker in 1980 put in 38.1 hours a week for 43 weeks a year. Today, the average employee works 38.7 hours a week for 46.8 weeks a year.
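
Multiplying the weekly hours by the weeks worked makes the change in total annual hours concrete (a rough sketch using only the averages quoted above):

```python
def annual_hours(hours_per_week: float, weeks_per_year: float) -> float:
    """Approximate total hours worked in a year."""
    return hours_per_week * weeks_per_year

hours_1980 = annual_hours(38.1, 43.0)   # about 1,638 hours per year
hours_today = annual_hours(38.7, 46.8)  # about 1,811 hours per year
# The extra weeks, far more than the slightly longer workweek, drive
# the increase of roughly 170 hours per year.
print(round(hours_today - hours_1980))  # 173
```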

In 2007, Apple changed the world and the workforce yet again when it released the iPhone, the device that kicked the mobile revolution into high gear. A peculiar new blended work/home dynamic soon took over the American workplace from the office to the construction site. Mostly out of convenience, workers began using their own familiar devices to handle work-related tasks, even when they had access to company devices at work, leading industry watchers to dub the trend BYOD—bring your own device.

The concept of being on call was once the realm of doctors and detectives. But the rise of powerful personal mobile devices quickly created what the Wall Street Journal has called the "always-on" work culture. From email to text messages to workplace collaboration apps, the mobile revolution has made it hard for the modern employee to disconnect, which has led to a disturbing erosion of work/life balance.

For most of the last 50 years, hiring managers knew only what the candidates wrote on their resumes before meeting for the first time at a job interview. In the last decade, however, social media has allowed potential employers to vet their candidates in extraordinary depth before the first meeting ever takes place. Roughly 70% of potential employers now use social media to take granular dives into things like a prospect's sports team allegiances, political views, social activities, and sense of humor before the two parties ever meet in person.

Social media may allow employers to become familiar with the people they're considering hiring—but once the candidate is part of the team, nearly half of all employers use those same social networks to snoop on and police their employees' behavior outside the office. One in three employers report using their findings as grounds to discipline or terminate a current worker. From badmouthing the boss and lying about absences to inappropriate posts and revelations of drug use, the private life of the modern American worker has become company business.

In the late-1950s, America still employed more people in goods-producing jobs than service jobs, but the trend away from that dynamic was already obvious. The last 50 years have seen a steady and rapid ascent of service jobs that coincided with a freefall in goods-producing jobs. Over the last half-century, the former has increased by more than 36% while the latter has plummeted by more than 62%.

Workers have looked forward to their paychecks for as long as people have been holding jobs, but the last several decades have witnessed a major transformation in how those checks are delivered. Today, most paychecks are actually just records showing that a payment was made rather than paper checks that require cashing. That's thanks to the rise of the Automated Clearing House (ACH) network in the 1970s, which paved the way for employers to pay their workers faster and more securely through direct deposit.

From horse-and-buggy repairmen to bowling alley pin boys, technology and other advancements inevitably kill industries and relegate certain occupations to the trash heap of history—and the last five decades have left plenty of victims in their wake. Among the industries that are either dead or dying thanks to changes over the last 50 years are photo film processing, video rental, magnetic media manufacturing, directory (White and Yellow Pages) publishing, and sound recording studios.

The share of people who work second jobs is at a historic low. In the last quarter-century, the percentage of people who moonlight on the books has dropped from close to 6.5% to less than 5% today—though Forbes suggests that number may be misleading.

Tech has already replaced many human jobs in fields like transportation, agriculture, and aerospace—and that historical trend is growing, not slowing down. Some experts predict a near total collapse of all careers that aren't directly related to servicing and improving the very technology that is rendering the human worker obsolete. While that prediction is dire and probably overstated, it is certainly true that more and more jobs are becoming tech jobs.

American workers have a long history of hostility toward immigrants, picket-line breakers, or anyone else they see as offering competition for "their" jobs. Today, however, the biggest threat to job security does not come from human rivals. Thanks to amazing advancements in artificial intelligence, connected machines, and automation, as much as one-third of the workforce will likely soon have to be retrained for new work after being replaced by technology that would have been seen as science fiction just 10 or 20 years ago.

50 ways the workforce has changed in 50 years

Few things remain the same after the passage of half a century, and the American labor force is no exception. Between 1969 and the present day, nearly every aspect of the country's workforce has changed. The demographic makeup of who goes to work is radically different than it was 50 years ago, as is the type of work individuals do, how they do it, how they're paid, and even how they save for retirement. Certain industries like computer programming, coding, and alternative energy sectors were all but unimaginable half a century ago. From the rise of computers and smartphones to the demise of manufacturing, the American workforce continues to adapt, evolve, and grow, grow, grow. With each passing year, American labor looks a little bit different than it did the year before. Here's a look at the most dramatic changes that have shaken, altered, and transformed America's workforce from the space age to the digital age. ALSO: Industries with the worst gender pay gaps

The single biggest change to the U.S. labor force over the last half a century might just be the sheer number of workers that joined it. What the Bureau of Labor Statistics classifies as the "civilian, noninstitutional" population ages 16 and up has skyrocketed from a little more than 137 million in 1970 to about 253 million today, due mostly to general population growth.

Fewer than 70% of 25- to 34-year-olds in 1970 were active in the workforce. Today, that number is nearly 88% with millennials taking on a wide array of professions across diverse industries. Demand for younger workers has increased too, as more jobs require specialized training in computers, coding, or fluency in social media. While globalization reduces demand for manufacturing jobs, aging baby boomers means more demand for prime-aged workers in the health care industry.

Another big reason for the increase in prime-aged workers is a massive rush of women to the workforce, particularly younger women who for the first time in history began to delay or opt out of marriage and childbirth altogether. In 1970, just 45% of women were actively working. Today, that percentage has jumped to 83%.

Today, a difference of only about 10 percentage points separates men and women in the workforce. In 1970, the gap was much wider at 36.3%. That same year, the Equal Pay Act was passed to require equal pay and working conditions for men and women. Wage gaps have been closing since then as well—although that gap isn’t expected to close completely for almost another half century.

The Bureau of Labor Statistics has developed an "economic dependency ratio," which measures the number of people in the workforce against the number of people, including children, who don't work and therefore must be provided for. In 1970, it was 140.4, meaning for every 100 people working, 140 were not. Today, that number has dropped to 97.4, which reflects a steep drop in the number of children the average family has raised over that time period.

Although U.S. manufacturing output has actually increased in terms of output, the number of manufacturing jobs has declined dramatically in recent decades. These jobs peaked at 19.4 million in 1979. By 2010, that number was down 11.5 million despite a steep population increase in the ensuing years. Manufacturing payrolls increased a bit in the 2010s, but manufacturing is still the lowest it’s been since before America entered World War II.

Instead of slowing down, one industry in the manufacturing sector has enjoyed a meteoric rise in recent decades. Computer and electronics manufacturing grew by 2,607% between 1987 and 2017, demonstrating how advances in technology consistently mold industry and force workers to evolve with the changing times.

Manufacturing jobs declined by about one-third since 1990, which has led to a fairly predictable change in in-demand job skills. As manufacturing jobs have dried up and automation has skyrocketed with AI and programmable machinery, demand for manual and physical skills like welding and carpentry has shrunk.

While manual and physical skills are no longer as marketable, a premium has been put on workers with social, personal, writing, and analytical skills. That’s in part because the number of service-oriented jobs that requiring extensive knowledge, experience, education, and training has more than doubled.

Roughly 50 million workers in 1980 held jobs that required an above-average level of job preparation. By 2015, those whose occupations required above-average education, training, or experience had grown to 83 million—an increase of nearly 70%. A wider range of industries means more specialized training and more options for America's diverse workforce population.

Employment in the combined fields of educational services has skyrocketed since 1990. During the same time period in which one in three manufacturing jobs disappeared, educational services jobs grew by 105%—more than any other industry.

The health care and social assistance fields are not far behind educational services in terms of occupational growth. The number of people employed in that industry grew by 99% between 1990 and 2015. Industries with the most explosive job growth are also more likely to be creating jobs that require additional training or higher levels of skills.

Rounding out the top three fields that added the most jobs between 1990-2015 are professional and business services, which grew by 81% during that time. The three fields of educational services, health care and social services, and professional and business services combined to account for 20 million of the 32 million new jobs added during that quarter century of change.

The last three decades have witnessed a direct correlation between higher wages and jobs that require advanced social and analytical skills. Women are now more likely than men to be employed in those occupations.

Since women are now slightly more likely to work in higher-paying industries that focus on in-demand analytical and social skills, and are far less likely to work in lower-paying industries that rely on less-coveted physical skills, the gender-based wage gap has shrunk. In 1980, women earned an average of 60 cents for every dollar men earned. By 2015, that number had grown to 80 cents, cutting the wage gap in half. Progress isn't perfect, however: That gap isn't expected to close completely until 2059.

High-level positions in fields favoring analytical and social skills over physical skills now regularly go to people with college degrees. While college grads represent just 36% of the total workforce, they now claim more than half of the higher-paying jobs that focus on in-demand analytical and social skills. Those with some high school education, high school diplomas, or some college education are crammed into all but 14% of the occupations that require lower-paying physical skills.

In 1980, a little less than 60% of 16- to 24-year-olds were working. After several ensuing dips and peaks, the percentage of young people in the workforce has dropped significantly. By 2015, less than half of that demographic was working, much of which can be attributed to higher college enrollment rates and many young adults going directly from college to masters programs.

More seniors today are choosing to stay in the workforce longer. Forty years ago, just 12% of adults 65 and older were working. By 2015 it was about one in five, suggesting better health for many seniors—and a more difficult time saving for retirement.

One of the biggest trends of the last half a century is not what has happened, but what hasn't. When adjusted for inflation, wages have remained virtually the same since 1980—but those numbers represent the overall average. Parsing things more finely, the median wage for women has risen by nearly one-third, while the median wage for men has actually fallen.

In the nearly two decades between 1996 and 2014, the percentage of people who remained with the same company for five years or more has crossed the critical 50% mark, up from 46% in 1996 to 51% in 2014. That dynamic is attributed in large part to the increasing percentage of older Americans in the workforce.

In the more than three decades between 1980 and the rise of Obamacare in 2013, participation in employer-sponsored health insurance dropped significantly. In 1980, 77% of workers were covered by their employers. 33 years later, that number fell to 69%. While employers with 50 or more full-time employees are required to provide health insurance, many companies have found workarounds by utilizing contractors, freelancers, and part-time employees.

Just as with health insurance, the American workforce has also experienced a steep decline in retirement benefits, but that trend has been a much more recent phenomenon. Almost exactly half the workforce had an employer-sponsored 401(k) or pension in 1980. By 2001, it was up to 57%, but that number dropped all the way down to 45% by 2015.

From Uber to online freelance work, the last decade has witnessed the rise of what the Pew Research Center calls "alternative working arrangements." In 1995, just 10% of the American workforce earned a living by providing their products or services piecemeal on a contract, non-employee basis. By 2015, that number had jumped to 15.8%.

The last few years have witnessed a meteoric rise of contract labor—an astonishing one in three people earn money freelancing either full-time or on a supplementary basis. The move toward contract work, which is one of the most significant shifts in modern American labor history, is partly because of the educational gap, partly because of changing attitudes toward work in general, partly because employers can save money that way, and partly because new technologies have made remote freelancing plausible, accessible, and cheap.

For generations, traditional pensions guaranteed retirees money to live on once their earning years had passed. Pensions, which came with predictable fixed payments, are now nearing extinction. Just 13% of private-sector workers have a traditional pension today compared to 38% in 1979.

The declining pension is due in part to something else that emerged in 1978 and changed the course of labor history. That year, an obscure provision in a tax code update, Section 401(k), paved the way for tax-deferred, employer-sponsored retirement vehicles that enabled workers to directly invest in the stock market as individuals. But unlike pensions, these retirement plans exposed the average employee to the whims of the global markets.

By 1990, 19 million Americans had $384 billion invested in 401(k)s, a number that would grow to a whopping $4.8 trillion today. Along the way, several changes were made to allow people to invest more and make it easier for companies to auto-enroll their employees, rendering the pension all but a memory and establishing the risky and unpredictable 401(k) as the unofficial retirement vehicle for the American labor force.

By 2017, more than 17% of American jobs were held by foreign-born workers—about 27.4 million people. Nearly half, about 48%, are Hispanics and another quarter are Asian.

The rise of the immigrant worker—who earns a median weekly salary of $730 compared to $885 for native-born workers—created a peculiar and relatively new category of work sometimes called "jobs Americans don't want." These occupations are usually low-skill, low-pay, and often dangerous jobs such as taxi drivers, maids, slaughterhouse workers, janitors, porters, and laborers.

Although the first temp agency emerged in the 1940s, the trend towards temporary employees began in earnest about 50 years ago in the late 1960s and early 1970s. Staffing agencies then began a model that essentially allowed companies to rent employees from them instead of hiring workers in-house, a sort of run-up to the gig economy. Today, there are about 20,000 temp agencies in the U.S. with 39,000 offices that place more than three million workers per week.

If you spent the last 50 years working in American offices, you witnessed the single greatest transformative event in the history of white-collar work: the computer revolution, which started with the humble word processor. The device's emergence in the 1970s allowed for in-text edits and much faster, much more accurate document production. Originally pitched as a feminist tool that could empower overworked secretaries, the word processor immediately made the typewriter feel primitive—but it, too, would soon become a relic.

Apple unveiled the Apple II computer in 1977 and four years later, IBM rolled out the PC. By the early- to mid-1980s, virtually every employee in nearly every office in America was familiar with computers. Countless man-hours were saved by having computers shoulder traditional office grunt work. Soon, entire filing cabinets would be stored on floppy disks, PowerPoint would change the conference room presentation, and the office IT pro would become the guardian angel of frustrated computer users everywhere.

The role of secretary—or, as it was later known, office support or administrative assistant—was long dominated by women, who could parlay the position into a middle-class life even if they never went to college. The position has fallen victim to a combination of corporate downsizing, technology, a changing workplace, and heightened educational requirements. Women lost 1.6 million administrative support jobs between 2007 and 2013, nearly double the 865,000 they lost in the seven years running up to the recession.

Unions have long been an empowering force for the American worker, who relied on unions to negotiate wages, lobby the government for protections, and guard them against corporate excess. However, a changing job market, as well as so-called right-to-work laws and detrimental Supreme Court decisions, have led to a steep decline in union membership. In 1983, 20% of American workers claimed union membership. That number dropped to 10.7% by 2017.

Title VII of the 1964 Civil Rights Act for the first time offered protections against workplace discrimination based on gender, race, religion, color, and country of origin. In 1980, the Equal Employment Opportunity Commission determined that sexual harassment is a form of discrimination and is therefore covered by Title VII, which the Supreme Court upheld in 1986 when it determined that speech can create a hostile work environment and can, therefore, be discriminatory.

The 1991 Civil Rights Act modified Title VII to add more protections and gave victims of workplace harassment the unprecedented right to become plaintiffs in a jury trial. It also gave victims the right to seek punitive damages.

Just 12.6% of the workforce in 1992 had not achieved a high school diploma. While that number might seem low, there were more high school dropouts at that time than there were workers with advanced degrees. But those demographics flipped over the last quarter century and now, those with less than a high school diploma are the smallest segment of the workforce—just 7.7% of workers—behind advanced degree holders.

Just 17.2% of the workforce had earned a bachelor's degree in 1992, making college grads the third-largest segment of the labor pool, behind those with some college education and those with just a high school diploma. By 2016, nearly one worker in four was a college graduate with a bachelor's degree, and while that demographic remains the third-largest group of workers, the gap has narrowed to almost nothing and college grads are poised to move into the No. 1 spot.

In the last 40 years, Americans have been working slightly longer hours for more weeks a year. The average worker in 1980 put in 38.1 hours a week for 43 weeks a year; today, the average employee works 38.7 hours a week for 46.8 weeks a year.

In 2007, Apple changed the world and the workforce yet again when it released the iPhone. The device launched the modern mobile revolution, and a peculiar new blended work/home dynamic soon took over the American workplace, from the office to the construction site. Mostly out of convenience, workers began using their own familiar devices to handle work-related tasks, even when they had access to company devices while at work, leading industry watchers to dub the trend BYOD—bring your own device.

The concept of being on call was once the realm of doctors and detectives. But the rise of powerful personal mobile devices quickly created what the Wall Street Journal has called the "always-on" work culture. From email to text messages to workplace collaboration apps, the mobile revolution has made it hard for the modern employee to disconnect, which has led to a disturbing erosion of work/life balance.

For most of the last 50 years, hiring managers knew only what the candidates wrote on their resumes before meeting for the first time at a job interview. In the last decade, however, social media has allowed potential employers to vet their candidates in extraordinary depth before the first meeting ever takes place. Roughly 70% of potential employers now use social media to take granular dives into things like a prospect's sports team allegiances, political views, social activities, and sense of humor before the two parties ever meet in person.

Social media may allow employers to become familiar with the people they're considering hiring—but once the candidate is part of the team, nearly half of all employers use those same social networks to snoop on and police their employees' behavior outside the office. One in three employers reports using such findings as cause to discipline or terminate a current worker. From badmouthing the boss and lying about absences to inappropriate posts and revelations of drug use, the private life of the modern American worker has become company business.

In the late 1950s, America still employed more people in goods-producing jobs than in service jobs, but the trend away from that dynamic was already obvious. The last 50 years have seen a steady and rapid ascent of service jobs that coincided with a freefall in goods-producing work: over that half-century, service jobs have increased by more than 36% while goods-producing jobs have plummeted by more than 62%.

Workers have looked forward to their paychecks for as long as people have been holding jobs, but the last several decades have witnessed a major transformation in how those checks are delivered. Today, most checks are actually just proof that a payment was made rather than paper that requires cashing. That's thanks to the rise of the Automated Clearing House (ACH) network in the 1970s, which paved the way for employers to pay their workers faster and more securely through direct deposit.

From horse-and-buggy repairmen to bowling alley pin boys, technology and other advancements inevitably kill industries and relegate certain occupations to the trash heap of history—and the last five decades have left plenty of victims in their wake. Among the industries that are either dead or dying thanks to changes over the last 50 years are photo film processing, video rental, magnetic media manufacturing, directory (White and Yellow Pages) publishing, and sound recording studios.

The number of people who work second jobs is at a historic low. In the last quarter-century, the number of people who moonlight on the books has dropped from close to 6.5% to less than 5% today—though Forbes suggests that number may be misleading.

Tech has already replaced many human jobs in fields like transportation, agriculture, and aerospace—and that historical trend is growing, not slowing down. Some experts predict a near total collapse of all careers that aren't directly related to servicing and improving the very technology that is rendering the human worker obsolete. While that prediction is dire and probably overstated, it is certainly true that more and more jobs are becoming tech jobs.

American workers have a long history of hostility toward immigrants, picket-line breakers, or anyone else they see as offering competition for "their" jobs. Today, however, the biggest threat to job security does not come from human rivals. Thanks to amazing advancements in artificial intelligence, connected machines, and automation, as much as one-third of the workforce will likely soon have to be retrained for new work after being replaced by technology that would have been seen as science fiction just 10 or 20 years ago.

Not a single industry has remained untouched by the changes that have swept the American labor force over the last half-century. From technology and culture to laws and education, the American workforce of today bears little resemblance to that of 1969.