It may not be the most banal sentence I ever wrote, but the banality was of a very high order. The sentence was "Children represent what society is going to become." I won't hide behind extenuating circumstances. The fact is that two decades ago these words flowed from my pen and were published under my name. Sadly, they failed to achieve oblivion. They were soon chanced upon by the governor of Maryland, Harry Hughes, who recognized a sentiment that he could stand foursquare behind. And, by golly, he didn't care who knew it! The governor incorporated the sentence into his annual State of the State address, taking pains to mention the author by name and adding the sober gloss "He is right."

This moment of searing embarrassment explains in part why I cringe whenever politicians vie with one another in their concern for the nation's children. The last presidential campaign was notable in this regard, with both candidates arguing that grown-ups bear a sacred responsibility to lift burdens from the young. George W. Bush criticized Al Gore for advocating policies that amounted to "a staggering tax increase on the next generation." Al Gore spoke of the need to "free our children from the burdens of the past being put on their shoulders in the future." It would be refreshing to hear someday a frank avowal by a national leader that the real plan is to let future generations deal with vast amounts of the present mess, the way we always have. Americans might even welcome the suggestion that some explicit portion of the unfinished national agenda—tort reform? metric conversion? mag-lev trains?—be deposited in a vault every year, to be opened a generation later.

Pious rhetoric about children won't disappear anytime soon. And yet, oddly, the comment cited by the governor is not quite as banal today as it was when it was written. Indeed, in certain quarters—notably among some people without children, who resent the attention to child-friendly social policies—that sort of nod to the up-and-coming generation has actually become a source of grievance. The ranks of the childless have not yet coalesced into a full-fledged movement, but there are indications that one may soon emerge.

I bumped up against an indication in the street one day. Encountering an acquaintance unseen for years, I sought to fill her in on various doings, and mentioned that someone known to both of us had recently borne twins. The woman looked at me, her face composed into a mask of withering pity. She said, "Why must people breed?"

The basic story line of the evolution of bourgeois family life in the West, as laid out by Philippe Ariès and other historians, is by now familiar. Children were once an economic necessity, and childhood was accordingly invested with little sentiment. In the Middle Ages, Ariès observed in his book Centuries of Childhood (1960),

[Children] immediately went straight into the great community of men, sharing in the work and play of their companions, old and young alike. The movement of collective life carried along in a single torrent all ages and classes, leaving nobody any time for solitude and privacy.

That all began to change with the rise of the middle class, the spread of learning, the decline of infant mortality, and so on—the usual suspects. "Henceforth," Ariès observed, "it was recognized that the child was not ready for life, and that he had to be subjected to a special treatment, a sort of quarantine, before he was allowed to join the adults."

For centuries this quarantine mainly took the form of school. But now, if activists for the rights of the childless have their way, it may acquire far greater scope—the exclusion of children from designated areas of public life. The ranks of the childless used to be an insignificant force in America. Not any longer. Childless people are today one of the fastest-growing segments of the work force. One woman in five over the age of forty has never given birth to a child, and most of those women never will. By 2005, according to the Census Bureau, the number of households with children will have been overtaken by the number of households consisting of single people or of married couples without children.

Not all these people have a gripe. Many of them do. Their Magna Carta is Elinor Burkett's The Baby Boon: How Family-Friendly America Cheats the Childless (2000). Burkett argues that people without children, like her, suffer from the consequences of a kind of affirmative action directed at parents—the time off for "family leave," the on-site day-care centers, the flextime, the tax deduction for dependent children, the tax breaks for child care and college tuition. In the workplace, Burkett says, it is the childless who pick up the slack when a parent, or "breeder," must run off to coach a Little League game or "to watch Susi dance Swan Lake." At shopping malls, she reports, the childless seethe when confronted with the sign PARKING RESERVED FOR EXPECTANT MOTHERS AND PARENTS WITH INFANTS. Don't even bring up the recent suggestion, by Cornel West and Sylvia Ann Hewlett, that parents be allowed to cast an additional vote for each of their dependent children.

Burkett anticipates the day when the childless, whom activists prefer to call "the childfree," will begin to demonstrate their clout.

The newly empowered childfrees will sow havoc at school board meetings, over both bloated budgets and curricula that don't teach non-reproduction as an "alternative lifestyle." They'll tie "family-friendly" corporations into legal knots with lawsuits alleging that the equal protection clause of the Constitution must be applied to nonreproducers.

Burkett looks forward to child-free sections in restaurants and airplanes and recreational areas, child-free hours in supermarkets and department stores, child-free options in housing. "Few issues send the childless into longer or more ear-splitting tirades," she writes, "than the lack of such 'adult-only' spaces in which they might shop, dine, or swim without being drowned out by wailing infants or rammed into by rambunctious toddlers."

I don't intend to join the growing debate over child-free zones. Understandably, many people deem the idea abhorrent; as the father of three, I confess to finding it all too compelling. But child-free zones are merely the latest instance of a far broader phenomenon—the proliferation of areas designed to be free of some undesirable social characteristic.

There are "smoke-free zones" in public accommodations, "truck-free zones" on residential streets, "campaign-free zones" around polling places, "skateboard-free zones" in parks, "cell-phone-free zones" in restaurants, "alcohol-free zones" in sports arenas. There are "car-free zones," "trailer-free zones," and "pesticide-free zones." Cities and towns have proclaimed themselves "nuclear-free zones," and activists in Aspen, Colorado, once sought to declare their city a "fur-free zone." Moves are afoot to declare schools "commercial-free zones" (to keep advertising away) and to make the human genome an "IP-free zone" (to prevent sequencing data from being claimed as intellectual property).

A portion of Cape Cod has been set aside as a "gull-free zone," and in Maryland officials concerned about an infestation of Cygnus olor have called for the enforcement of "swan-free zones" in the Chesapeake Bay. Mount Everest, littered with garbage left by decades of trekkers, is now a "bottle-free zone." Officials worried about disruptions on campuses and city streets have sought to create "picket-free zones" and "protest-free zones" (confining demonstrators to an Orwellian precinct called a "free-speech zone"). The city council of Cincinnati, still angry with the commissioner of baseball for refusing to lift a lifetime ban imposed on Pete Rose, has proposed turning Cincinnati into a "Bud Selig-free zone." West Hollywood, worried about inhumane traps and nonmedical experiments on animals, is officially a "cruelty-free zone." Attempts have been made to enact "violence-free zones" and "hate-free zones." Peter Yarrow, of the folk group Peter, Paul and Mary, has mounted a campaign to turn classrooms into "ridicule-free zones."

Some of these exclusionary zones make sense; others are absurd; many are unenforceable. Considered together, though, they suggest the degree to which the idea of customized quarantine has taken hold—the idea that we have at our disposal a diverse array of membranes against contaminating encroachments, and need only deploy them according to taste.

What we may soon be needing, if only for purposes of biodiversity, is "zone-free zones," the social equivalent of wilderness preserves. Here, in pristine tracts set aside for the ages, refugees from civilization could savor a remnant of society in a wild and unzoned state. The zone-free-zone experience might, of course, be somewhat unsettling. Ominous tendrils of cigarette smoke from the trailer park might greet one's entrance. Wailing infants and rambunctious toddlers would be continually underfoot. Skateboarders in fur coats, talking on cell phones, could swoop by at any moment, scattering the gulls and swans. Some visitors, inevitably, would ignore the warnings about venturing too close to speeding trucks or angry picketers. Your guide, Bud Selig, might from time to time motion for the group to hush, cupping his ear: "Listen," he would whisper, as the sound of lacerating sarcasm wafted from a distant window. "Sixth-graders."

Summertime expeditions into a zone-free zone could become an essential part of growing up, as enlightened parents from Cambridge to Berkeley sought a broadening experience for their young—an immersion (camp brochures will say) in that "single torrent" of collective life evoked by Ariès. It is vitally important to instill a sense of historical perspective in the young. Children, after all, represent what society is going to become.
