Recent history has witnessed the exciting politicization of meat. Scores of recent books and articles (not to mention lively exchanges on the Atlantic Food Channel) have raised the profile of meat production to a mainstream environmental cause, illuminating the hazards that industrial meat—which is 99 percent of the meat we eat—poses to our soil, air, and water. The ethical dimensions of eating meat have also started to make meaningful inroads into public consciousness. Now more than ever, everyday meat eaters are considering the moral implications of raising billions of animals for food that our bodies can easily do without. ("We don't need to eat it," Dr. Amy Lanou, senior nutrition scientist for the Physicians Committee for Responsible Medicine, says of meat.)

Reactions to these concerns vary. Some have bolted to the fringes—with one extreme cohort going vegan or vegetarian while a more defensive group (mostly people in the business of meat production) has dug in its heels around the troubling justification that eating meat is a guiltless act because humans have always done it (as if history should rationalize contemporary behavior!). The quiet majority has fallen between the extremes, weighing the prevailing arguments and, to one extent or another, approaching the meat counter with a little more humility, if not a dose of carnivorous agnosticism. No matter our stance on meat, chances are good that we're at least listening, thinking, and maybe even plotting a dietary change or two.

But one issue to which concerned consumers have generally turned a tin ear is "in vitro meat." Although the cost is currently prohibitive, the technology is widely available to produce meat from the cultured cells of animals rather than from the animals themselves. Also called "cultured meat" or "synthetic meat," this product, which supporters promise will taste comparable to conventional meat, has enormous potential to confront the environmental and ethical concerns that so many agnostic carnivores find troubling. Speaking for the Humane Society of the United States, Paul Shapiro, senior director of the group's Factory Farming Campaign, explained in an e-mail that "in vitro meat has the potential to prevent an enormous amount of suffering."

Anyone who cares about animals and the environment must acknowledge Shapiro's point. Insofar as cultured meat would obviate the need to raise animals for human consumption, it would arguably be the most pivotal development in 10,000 years of farming. An industry that currently generates substantial amounts of greenhouse gas (6 to 9 percent of U.S. totals, 18 to 51 percent globally), pollutes already endangered water supplies, consumes millions of acres of corn and soy (and of course the pesticides and fertilizers needed to grow them), uses the vast majority of antibiotics made, accounts for massive amounts of deforestation, and destroys riparian zones worldwide could be replaced by an industry with comparatively minimal environmental impact, zero dependence on agricultural chemicals or land, and, most critically, no need to kill a single animal in the quest to meet our insatiable demand for meat. Not one beast. It almost sounds too good to be true.

Unless, of course, you have an interest at stake. Agribusiness is hardly eager to see meat move from the feedlot to the laboratory, and it comes as no surprise that the National Cattlemen's Association has handled the idea of in vitro meat with all the finesse of a cattle prod. Interestingly enough, the meat industry is not alone. In one of the stranger cases of mortal enemies waking up as snug bedfellows, advocates of sustainable agriculture appear to agree with agribusiness that in vitro meat should be kept off the radar screen of our culinary future. Their reasons are revealing. And troubling.

Kate McMahon, who represents Friends of the Earth, complained to CNN: "At a time when hundreds of small-scale, sustainable farming operations are filing for bankruptcy every day, it is unethical to consider purchasing petri dish meat." Unethical! Slow Food USA is skeptical for reasons that defy easy summation, but here's President Josh Viertel's take on "test-tube flesh": "The problems with cruelty to animals are born of that gap [between producer and consumer]. I see [test tube flesh] as a solution that just increases that gap ... This is a technology that's just going to give more to companies and create a larger distance between us."

Frankly, these responses boggle my noodle. Both McMahon and Viertel seem to forget that an integral aspect of animal cruelty is not just how an animal is treated while it's alive but also the inconvenient truth that—no matter how they are raised—the animals we eat ultimately succumb to a violent death, one that they are smart enough to anticipate, sentient enough to suffer through, and, were they given the option, wise enough to avoid. On some (philosophical?) level, the humaneness of the treatment is compromised the moment the death blow lands—this is certainly "one of the problems with cruelty to animals." In fact, lest anyone interpret that gruesome moment as anything but the ultimate harm, consider the reaction of a man as cold-steeled and tough-minded as Anthony Bourdain who, after witnessing the slaughter of a six-month-old hand-fed pig, left this unforgettable response in Kitchen Confidential:

For a guy who'd spent twenty-eight years serving dead animals and sneering at vegetarians, I was having an unseemly amount of trouble getting with the program. I had to suck it up ... It took four strong men, experts at this sort of thing, to restrain the pig, then drag and wrestle him up onto his side ... With the weight of two men pinning him down, and another holding his hind legs, the main man with the knife, gripping him by the head, leaned over and plunged the knife all the way into the beast's thorax, just above the heart. The pig went wild. The screaming penetrated the fillings in my teeth ... With an incredible shower of fresh blood, the pig fought mightily ... They finally managed to wrestle the poor beast back up onto the cart again, the guy with the mustache working the blade back and forth like a toilet plunger ...

Anyone who has seen anything remotely like this intuitively understands the truth of the matter: the pig does not go gently for the very basic reason that the pig does not want to go.

The fact that the harsh reality of animal death for billions of creatures does not immediately overwhelm concerns about the economic viability of several hundred small farms (McMahon's point), much less the emotional distance separating privileged consumers from their local farmers (Viertel), suggests that Friends of the Earth and Slow Food are just as removed from the reality of agriculture as anyone else. Contrary to what it may seem, what we're ultimately witnessing with the sustainable food movement's opposition to in vitro meat is not so much a warped moral calculus—Viertel, I know firsthand, is far too thoughtful and intelligent for such a flaw—but rather something much more prosaic: the protection of sacred turf.

Viertel's and McMahon's comments followed me around—nagged me, actually—for days after I read them. But then I realized something: the politics of meat is the politics of self-interest—no matter what side of the debate one is on—and, as is always the case, everyone's interest is fiercely protected except that of the animals. Just as corn and soy are the bread and butter of Big Ag, the persistence of small, traditionally conceptualized farms practicing time-honored agricultural techniques is the sine qua non of the sustainable food movement. Without these small family farms, and without animals being humanely raised to be slaughtered, the movement's turf would shrink. The knowledge that science and technology have the potential to fundamentally redefine (and improve) the very agricultural tradition that so many organizations are designed to protect is knowledge we can hardly expect interested parties to evaluate in fair terms. My guess is that it probably terrifies them.

And I kind of feel their pain. But still: a pound of flesh sacrificed by a small cadre of sustainable animal farms would be an immeasurable gain for the billions unable to articulate their position on the matter of their own death. Anyway, I know how my fork would vote.
