When I ask my medical students to describe their image of a woman who elects to birth with a midwife rather than with an obstetrician, they generally describe a woman who wears long cotton skirts, braids her hair, eats only organic vegan food, does yoga, and maybe drives a VW microbus. What they don't envision is the omnivorous, pants-wearing science geek standing before them.

Indeed, they become downright confused when I go on to explain that there was really only one reason why my mate -- an academic internist -- and I decided to ditch our obstetrician and move to a midwife: Our midwife could be trusted to be scientific, whereas our obstetrician could not.

Many medical students, like most American patients, confuse science and technology. They think that what it means to be a scientific doctor is to bring to bear the maximum amount of technology on any given patient. And this makes them dangerous. In fact, if you look at scientific studies of birth, you find over and over again that many technological interventions increase risk to the mother and child rather than decreasing it.

But most birthing women don't seem to know this, even if their obstetricians do. Paradoxically, these women seem to want the same thing I wanted: a safe outcome for mother and child. But no one seems to tell them what the data indicate is the best way to get there. The friend who dares to offer half a glass of wine is seen as guilty of reckless endangerment, whereas the obstetrician offering unnecessary and risky procedures is considered heroic.

When I was pregnant, in 2000, and my mate and I consulted the scientific medical literature to find out how to maximize safety for me and our child, here's what we learned from the studies available: I should walk a lot during my pregnancy, and also walk around during my labor; doing so would decrease labor time and pain. During pregnancy, I should get regular check-ups of my weight, urine, blood pressure, and belly growth, but should avoid vaginal exams. I should not bother with a prenatal sonogram if my pregnancy continued to be low-risk, because doing so would be extremely unlikely to improve my or my baby's health, and could well result in further tests that increased risk to us without benefit.

According to the best studies available, when it came time to birth at the end of my low-risk pregnancy, I should not have induction, nor an episiotomy, nor continuous monitoring of the baby's heartbeat during labor, nor pain medications, and definitely not a c-section. I should give birth in the squatting position, and I should have a doula -- a professional labor support person to talk to me throughout the birth. (Studies show that doulas are astonishingly effective at lowering risk, so good that one obstetrician has quipped that if doulas were a drug, it would be illegal not to give one to every pregnant woman.)

In other words, if the regular low-tech tests kept indicating I was having a medically uninteresting pregnancy, and if I wanted to scientifically maximize safety, I should give birth pretty much like my great-grandmothers would have: with the attention of a couple of experienced women mostly waiting it out, while I did the work. (They called it labor for a reason.) The only real notable difference was that my midwife would intermittently use a fetal heart monitor -- just every now and then -- to make sure the baby was doing okay.

My obstetrician and his practice had made clear that they were rather uncomfortable with this kind of "old-fashioned" birth. So we left, and engaged a midwife who was committed to being much more modern. And the birth I had was pretty much as I have described. Yes, it hurt, but my doula and midwife had prepared me mentally for that, assuring me that this kind of special pain did not have to result in fear or harm.

We did end up with one technological intervention: because my son had meconium in his fluid (meaning he'd defecated in the womb), the midwife explained to me that right after birth, the pediatricians would be scooping him up to suck out his trachea (his windpipe). The idea was to prevent pneumonia. They did this, and three months later over breakfast my husband presented me with the results of a randomized controlled trial that had just come out: it showed that babies in this situation who had only their mouths, and not their tracheas, cleaned actually had lower rates of pneumonia compared to those who got the tracheal intervention. Another intervention that turned out not to be worth it.

So why is it that, over a decade later, when the evidence still supports a low-interventionist approach to pregnancy and birth management for low-risk cases, we've made virtually no inroads toward making birth more scientific in the United States?

I put that question to a few scholars who work on this issue. One of them, Libby Bogdan-Lovis of the Center for Ethics and Humanities in the Life Sciences at Michigan State University, happens also to have been my doula. (Lucky me.) Libby noted that a big part of the problem is the way birth is conceived in America -- as "dangerous, risky, and in need of control to ensure a good outcome."

Libby pointed out that institutional strictures contribute to the problem: "Insurance companies generally cover hospital birth, not home birth, they are more inclined to compensate doctors over midwives, they compensate doctors and hospital-based midwives for doing something over doing nothing, and the health care system's risk management approach backs those who can demonstrate that they did everything possible in terms of intervention." All this in spite of the fact that, as Libby notes, "attempts to control birth are fraught with real medicalized risk and commonly lead to cascades of interventions."

Raymond De Vries, a sociologist in the University of Michigan's Center for Bioethics and Social Science in Medicine, has compared birth in the U.S. to that in the Netherlands, where he is a visiting professor at the University of Maastricht. He finds that, in the U.S., "obstetricians are the experts and the experts have come to see birth as dangerous and frightening." De Vries suggests that the organization of maternity care in this country -- "the limited choices that American women have for bringing their baby into the world, what women are not told about dangers of intervening in birth, and the misuse of science to support the new technologies of birth" -- actually constitutes an ethical problem, although we typically do not recognize it as one. Medical ethicists "would rather look to the [comparatively rare] problems of in vitro fertilization and preimplantation genetic diagnosis than to the every day issues of how we organize birth here in the U.S.; they would rather talk about preserving women's 'choices' than to explore how those choices are bent by culture."

So true. Ethicists love to talk about women's birthing choices as if they are informed and autonomous, but I can't count how many women have said to me that they "chose" pain medication during birth even though they were never told the risks of pain medication, never had anyone express confidence in them that they could birth without medication, and were never offered a doula to walk and talk them through the pain. What kind of "choice" is that? As Libby Bogdan-Lovis told me, "Today's average childbearing woman thinks the notion of an unmedicated birth is the equivalent of suggesting that women should eagerly embrace torture."

I think of all the choices I made, the one that shocked my peers most was not getting a prenatal ultrasound. But just a few years before I became pregnant, a major U.S. study -- involving over 15,000 pregnancies -- published in the New England Journal of Medicine showed that routine ultrasounds did not leave babies safer. That work was led by Bernard Ewigman, now chair of family medicine at the University of Chicago and NorthShore University Health System.

I recently called Dr. Ewigman and asked him why so many low-risk pregnancies now involve routine ultrasounds. He suggested that it was partly emotional -- people like to "see" their babies -- and partly due to the unsubstantiated belief that knowing something is necessarily going to lead to better outcomes than not knowing. But, he agreed, routine prenatal sonograms in low-risk pregnancies (that is, pregnancies in which there have been no problems) do not appear to be supported by science, if the outcome you're seeking is reducing illness and death in mothers and children. Routine prenatal sonograms don't seem to be dangerous, but they are also not health-giving.

Dr. Ewigman told me, "The approach you took to your pregnancy was rational and well informed. But most decision-making when it comes to medical issues involving a pregnant woman or baby are not well informed and not based on rational thinking." He added: "We're all very interested in having healthy babies and it is pretty easy to make the kind of cognitive errors that people make, and attribute to technology benefits that don't exist. At the same time, when there are problems in a pregnancy, that very same technology can be life-saving. It is easy to make the [problematic mental] leap that technology is always going to be necessary for a good outcome."

Dr. Ewigman and I talked about how some people derive false certainty from prenatal sonograms, thinking that if the clinicians see nothing unusual, the baby will be born perfectly healthy. I explained to him that that was one reason I didn't bother; I knew from my own research on birth anomalies how often sonograms mislead. He observed that our culture has "a real fascination with technology, and we also have a strong desire to deny death. And the technological aspects of medicine really market well to that kind of culture." Whereas a low-interventionist approach to medical care -- no matter how scientific -- does not.

I'm not against taking into account, when making birthing choices, the kinds of hard-to-measure outcomes that may matter deeply to some pregnant women. I get that there are some women who don't want a baby shower like mine, where most of the gifts consist of yellow and green baby clothes, instead of pink or blue. I get that some want to have those fuzzy pictures of the babies in their wombs. I get that some might want to abort if a sonogram were to show a major anomaly.

And I get that some women want a particular experience of birth -- I mean, I really get that now that I have had a birth that left me feeling more powerful, more humble, more focused, and more devoted to my lover than I ever thought I could feel.

But I wish American women were told the truth about birth -- the truth about their bodies, their abilities, and the dangers of technology. Mostly I wish all pregnant women could hear what Libby Bogdan-Lovis, my doula, told me: "Birthing a baby requires the same relinquishing of control as does sex -- abandoning oneself to the overwhelming sensation and doing so in a protective and supportive environment." If only more women knew how sexy a scientific birth can be.

About the Author

Alice Dreger is a professor of clinical medical humanities and bioethics at Northwestern University's Feinberg School of Medicine. She has written for The New York Times, The Wall Street Journal, and The Washington Post.
