The newspaper headline heralded a breakthrough: "ANCIENT CHILD-RAISING TECHNIQUES ARE REBORN." The story, on the front page of a section of The Boston Globe, concerned a group of cutting-edge young mothers who are resolved to throw conventional wisdom to the winds and raise their infants in the most enlightened way possible. Just how revolutionary is this new approach, which is called "attachment parenting"? Among other things, it calls for breast-feeding your child. Wearing your infant in a sling. Being responsive to your baby's cries. Avoiding long separations.

Scoff as you may, these women might be on to something. Indeed, some of the suggested behaviors seem to have been taken up by billions of mothers worldwide, with hardly a conscious thought. The Globe quoted one mother from a maternally advanced suburb.

[Ms. Brown] said she just naturally practiced attachment parenting without knowing her decisions had a name. "To find out there were other parents who did those kinds of things and it was called something, I thought it was great," she said.

It should come as no surprise that the instinctive responses of Suburban Mom and Cro-Magnon Mom fall into a roughly similar pattern. The human organism—the corporeal thing itself, its needs and wants, its likes and dislikes, its limitations, its shape—is the most conservative force in human society.

This fact is so obvious that it virtually recedes into oblivion. A few years ago archaeologists succeeded in dating a cache of ancient shoes found in a cave overlooking the Missouri River. The shoes went back to as early as 6,000 B.C., and came in a variety of styles—some looked like Mexican huaraches, some resembled a woman's sling-backs, some were sandals, some were slip-ons. Newspaper accounts of the discovery marveled that this ancient footwear, fashioned eight millennia before Gucci and Rockport, would not look out of place on modern feet. Why, you could wear it now! But of course you could—feet are feet. For similar reasons, a pharaoh's glove and The Gloved One's glove look surprisingly alike; a condottiere's helmet and Darth Vader's helmet derive from the same functional principle. Rings and combs, trousers and condoms—they hold their shape over time. Leafing recently through a book about Roman Britain, I came across a photograph of an ancient bikini, unearthed during an excavation in central London. The triangular patch was exactly where you would expect; the anterior thong was aligned with the familiar gluteal groove.

Over the ages and across countless cultures our beds have looked like beds, our chairs like chairs, our houses like houses. Our active lives are defined by the body's thresholds of heat and cold, pain and pleasure, energy and fatigue. Our eyesight is fixed within a specified range (better than that of bats, inferior to that of eagles), and so is our hearing. The sheer physical demands of hauling the body to work seem to be influenced by some inherent governor: a famous study of commuting, for instance, suggested that although distances have changed with technological advances, people in all eras and cultures have budgeted about the same amount of time for daily travel (on average, about half an hour one way).

The built-in conservatism of the body has its analogue in the brain. Certain images and proportions are naturally pleasing to human sight; one is the golden ratio, known as phi, which shows up everywhere from nautilus spirals to classical architecture. Evolutionary psychology is replete with examples of behavior that appears to be hard-wired. The rudimentary fact that our ancestors were once hunted for food by other species may have left a mark in various ways. We clump together socially—a defensive mechanism. Experiments show that children learn about predators faster than they do about far more prevalent threats—an atavistic phenomenon that one anthropologist calls Jurassic Park syndrome. The notion of ourselves as potential objects of predation surfaces from time to time even among the urbane. Recall Noël Coward's purported remark during the parade at Elizabeth II's coronation. As the massive Queen of Tonga came into view, someone pointed to the thin little man beside her and asked if that was her husband. Coward replied, "No, that's her lunch."
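The golden ratio mentioned above is easy to see numerically. As a minimal sketch (the function name and iteration count are illustrative choices, not from the essay), the ratios of consecutive terms in the Fibonacci sequence converge to phi, which is (1 + sqrt(5)) / 2:

```python
import math

def fibonacci_ratio(n):
    """Return the ratio of consecutive Fibonacci numbers after n steps."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return b / a

# phi, the golden ratio, is (1 + sqrt(5)) / 2, roughly 1.6180339887
phi = (1 + math.sqrt(5)) / 2
print(f"{fibonacci_ratio(30):.7f} vs {phi:.7f}")
```

Phi also satisfies phi squared = phi + 1, which is why the ratio of successive terms in this recurrence settles on it.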

On balance, is the conservatism of the human body a force for good? It certainly serves as a kind of ballast—and as a brake. Entrepreneurs can extol the glories of having 500 cable channels, but luckily we still have only one set of eyes. The Internet has made the totality of human knowledge accessible to everyone, but the individual cerebral cortex presents the same bottleneck it always did. Everyone finds the limitations on human memory severe and annoying, but researchers remind us how useful the act of forgetting can be. Remembering too much, they note, is powerfully at odds with abstraction and creativity.

And yet we chafe under this regime, and continually press against it. Two new books take up the subject in various ways. One of them, Our Own Devices, by Edward Tenner, looks at how everyday technology, from shoelaces to eyeglasses to keyboards, affects the way we use our bodies. It concludes with a chapter on the coming age of augmented humanity, an age of "technological symbiosis," when a wide range of enabling devices will be not just portable but implantable. Tenner points out that if such things as pacemakers and cochlear implants are included, one American in ten has already received some sort of mechanized upgrade.

Many of these things, of course, are simply making up for the usual human deficiencies; they fall into the Precambrian era of our sensibility, compared with what's to come. The philosopher Carl Elliott's Better Than Well looks at a more recent stage: "enhancement technologies" as they apply to everything from aging to body height to motivation to happiness. Pharmaceuticals and the knife between them address a multitude of issues; genetic enhancement will address even more.

It is not clear that we will know when to stop. A newspaper report the other day brought word of the dawning age of face transplants. To be sure, the medical pioneers contend that the purpose is purely therapeutic—for burn victims and the like. But we all know that it's only a matter of time before satisfied recipients start emerging from Swiss clinics after the mystifying "disappearances" of people like Brad Pitt and Jennifer Lopez.

"There is an almost infinite number of possible technical ways to transform humanity," the novelist and apocalyptic visionary Bruce Sterling has observed.

We can start from our firm kinship with the microscopic and work upward through every scale. Genetic ways. Mitochondrial ways. Tissue. Bone. Nerves. Guts. Through blood, lymph, and hormone. Through our senses, through our neurons. We are large, physical, multi-cellular entities; every aspect of our being offers up a scientific, technical, and industrial carnival.

The prospect of revolutionary change—the much anticipated "posthuman future"—is at once exciting and terrifying. And yet I retain considerable faith in the staying power of our pre-posthuman selves. Enhancement arrives with the audacity of Napoleon; the body responds with the inertial resistance of those two great Russian generals, January and February.

Besides, the most effective steps toward enhancement usually turn out to be the small ones. In that spirit, consider a study published this year by psychologists at the University of Texas at Austin. The findings, published in the journal Psychological Science, suggest that people who use a diverse array of pronouns have stronger immune systems, lower levels of stress, and less need to see the doctor than people who say "I," "I," "I" all the time. The study speculated that the willingness to perceive the world from many angles is a healthier outlook than solipsism. Can pronoun therapy be far away? You, me, us, them: this is a form of enhancement we can all embrace.
