Peggy Drexler, writing at the Daily Beast, seems to think the answer is "true." She argues that "for men born after 1980, theirs is a generation of adjustment." She adds, "They have seen a rebuilding of architectures of support in everything from girls' sports to female-only scholarships to the broad encouragement for females to break down barriers." She then goes on to talk to some millennial men, who, it turns out, have some mixed feelings about women's advances in the workplace.

It's certainly true that gender roles have been changing over the last 35 years. But when you say that the millennials are "a generation of adjustment," the implication is that theirs is especially a generation of adjustment—that they are adjusting in a way that few have adjusted before.

This isn't true. The fact is, the adjustments that millennials are making are part of a long trend towards integrating women into the workplace that began, not with them, but with the generations preceding them. In her 2005 book Marriage, A History, Stephanie Coontz points out that "Every single decade of the twentieth century has seen an increase in the proportion of women in the workforce." That trend accelerated after World War II, when there were lots of low-paid clerical and sales jobs to fill—and kept accelerating through the 1960s.

As a result, the family and its relationship to work changed drastically from the 1950s to the 1970s. Post-war, the two-parent family with one breadwinner became both possible and normative in a way that, Coontz shows, it had never been before, and never was again. In the 1950s, Coontz says, surveys showed that most Americans believed that people who were single by choice were "sick" or "immoral." By 1975, only 25 percent thought that. The major transformation in attitudes towards gender, marriage, the family and, by implication, work happened before the millennials were born. If there is a "generation of adjustment," that generation is not the millennials. It's the folks who grew up between the '50s and the '70s—the baby boomers and some of their kids.

What's wrong with the Daily Beast saying otherwise? What does it matter, really, if people do a little hand-waving about changing gender roles and work and the millennials? After all, gender roles are still in flux. What's the harm?

The harm, I'd argue, is that framing changing gender roles as a phenomenon tied to the millennials in particular obscures why those roles are changing. The massive shift in the relationship between women and work since the '50s has not been caused by college scholarships for women, nor by leaning in, as the Beast article has it. It's been caused, instead, by two major, obvious, and often ignored facts. The first is contraception. And the second is a decisive and lasting drop in the standard of living.

Contraception is today so taken for granted that I think people forget how radically it has transformed not just women's lives, but society as a whole. As Coontz points out, in the 1960s, "For the first time in history any woman with a modicum of educational and economic resources could, if she wanted to, separate sex from childbirth, lifting the specter of unwanted pregnancy that had structured women's lives for thousands of years." And also for the first time in history, women could control their own workforce participation. Instead of having child after child after child, women who didn't want to be celibate (which is the vast majority of women) could plan children around their careers, rather than vice versa. This puts a rather different spin, for example, on the millennial whom Drexler quotes as saying that women are "getting extra support." It's true that in comparison to the rest of recorded history, women are getting more support. But by far the most important form that support takes is not some sort of affirmative action. It's the pill—which has, finally, allowed women to compete in the workforce on an equal footing with men.

The drop in the standard of living since the 1950s isn't as revolutionary as contraception, but it's still pretty important. As Coontz argues, during the post-war period it was possible for a husband (it was virtually always a husband) to make enough money to support a wife and family in a middle-class lifestyle. Married women didn't have to work—and so the vast majority of them didn't. Instead they stayed at home and took care of the kids.

Inflation and then globalization have made the one-earner middle-class family an impossibility—and that, in turn, has meant that women, married and unmarried, have had to enter the workforce in large numbers. The impetus for women to go to work is certainly in part that people like Betty Friedan realized that being a housewife was not very fulfilling. But the impetus was also, and importantly, that virtually no one can afford to stay at home and be a housewife anymore. Coontz points out that while the purchasing power of average Americans doubled between 1947 and 1973, real wages fell by 27 percent between 1973 and the late 1980s, a decline comparable to that of the Great Depression.

All of which makes Drexler's article look hopelessly confused. She's arguing that changing gender roles since the 1980s have put men and women in competition for jobs, stoking some mild male resentment. But what's really put men and women in competition for jobs isn't changing gender roles. It's economic stagnation and a precipitous decline in real wages. And what's needed, therefore, isn't more leaning in by women, or more serious listening to men's complaints about women leaning in. What's needed is some sort of effort to deal with the changing landscape, not of gender, but of class.

Not that gender is irrelevant. On the contrary, much of the unspoken rationale for America's crappy social safety net—with work-based healthcare and no day care and so on—is the continuing image of the 1950s family as an ur-standard. You don't need day care because mom's at home; you don't need government healthcare because all the daddies work. Articles like Drexler's, which erase the past, paradoxically keep those antiquated gender roles around. The "traditional" family is always something we've just left behind, always something we're just adjusting to. The truth, though, is that these changes are of long standing, and the adjustments we need to make have little to do with the ambivalent feelings of male millennials, and a whole lot to do with policy changes that are long, long overdue.
