Tired of academia, eager to begin a life without algebra, and distracted by the unrelenting Santa Barbara sunshine, I decided to take a gap year. Who knew it would turn out to be 40?

My short-lived college experience pretty much resembled that nightmare you still have. You know, the one where you walk into a large hall, are handed a blue book, and then realize you forgot to study or attend class? So, in 1968, after completing one year of college, I moved on to a satisfying life of writing jobs and political campaigns, marriage, and raising children.

Not having a college degree was by no means a serious issue, especially in Los Angeles, where folks just want to know how many miles you ran that morning or if you have any good screenplay ideas. But upon moving to New York in 1983—where inquiring and glittering minds immediately want to know, “Where did you go to school?”—the diploma-less me began to feel the sting.

I usually answered “University of California” and left it at that. But it continued to gnaw at me and I began to crave closure. The unfilled hole also left a sour feeling of intellectual insecurity regardless of how many times I was reminded that Jobs, Zuckerberg, Gates, and countless artists didn’t complete their four years. Eventually, I couldn’t deny it anymore. I knew I had missed one of life’s great social experiences, not to mention always feeling under-read, under-tested, and under-challenged.

When I applied to Columbia’s General Studies program (for those who did not complete college the first time, for various reasons), my friends were astonished. Why now, when I wasn’t looking for a new career? My answer was threefold: My husband and I would soon be entering empty nestdom, so it seemed a good time to fill it; I had always done everything late in life (marriage at 34, first child at 39, braces in my 40s); and I regretted not having had a real campus experience. Now that I am educationally embedded, folks generally first say, as did writer-publisher-Harvard grad James Atlas, “I wish I could go back, when I know so much more about life. Books grow with us.” Such sentiments are quickly followed by, “But I could never take a test or write a long paper.”

I am clearly not alone in my quest for academic validation: Well over half a million students enrolled in degree-granting institutions are over the age of 50. “One advantage about returning to college later in life is that the student will likely have a greater sense of purpose and focus and thus be able to capitalize better on what is offered,” says Margaret Gatz, a psychology professor at the University of Southern California. “Another advantage is that the older student brings a lifetime of experiences and knowledge to the new information being presented and thus can have a richer learning experience.”

Gatz points out potential barriers, including competing demands. (Every time I tell my adviser that I can’t imagine how students could be taking four, even five classes at a time, he reminds me they are not also running a household and writing plays. Oh, that.) Another hurdle might be physical stamina. “The older student will be surrounded by college-age youth who have agile memories and who can stay up all night to cram for an exam or finish a paper,” says Gatz. “This just means that the older student must be craftier.”

Fortunately, there is increasing evidence that older students can succeed, and that returning to school may even keep our minds sharp. “There is as much variation in an aging brain as there is in a school child’s brain,” says New York psychiatrist Roger Gould. “If you and your brain are healthy, the only limitations to learning new mental skills and information are your motivation and natural intelligence.”

James Fallon, a neuroscientist at the University of California, Irvine, claims “people are at their maximum cognitive abilities in their 60s. It’s the ideal time to balance their executive functions, which younger students don’t necessarily have yet, with intellectual techniques which are likely still there but haven’t been used for a long time.” Fallon, who is 66, says, “I have never been more creative and productive.”

Fallon’s views are corroborated by the Seattle Longitudinal Study, which tracked the cognitive abilities of thousands of adults over 50 years. The report shows that middle-aged adults performed better on four out of six tests than they did as young adults. Likewise, University of Virginia psychology professor Timothy Salthouse last year completed a comprehensive study entitled Consequences of Age-Related Declines and concluded, “cognitive ability is important in daily life and there is currently little evidence that its importance declines with increasing age. In fact, there are some reasons to expect that the role of cognitive ability in late adulthood is even more important now because individuals of all ages are being asked to take more responsibility for financial and medical decisions.”

Gatz offers us older students some key tips: Attend lectures; study actively (keep testing yourself along the way); focus on one thing at a time; look for opportunities to discuss the material. And, says Gatz, “Emphasize understanding over memorizing: Older learners have an advantage in recalling the gist of what they have read or been told, so relate what you are learning to what you already know.”

That said, some “craftiness” is called for. Before I went back to school, I didn’t even know what mnemonics meant (let alone how to spell it), but I sure do now. I got through Music and Art Appreciation classes by attaching every composer or artist to a letter in the alphabet, or connecting them to some personal memory. I devise a way of organizing my notes and memorizing what the pages look like, so when handed the dreaded blue exam books, I immediately draw pictures of my studied pages. How much do I retain? Probably about 10 percent. But hey, how much Judaism did you retain after your bar mitzvah?

The knowledge I do retain has served me well, and in unexpected ways. There I was in the Pitti Palace Museum in Florence last summer, explaining the genesis of some obscure paintings I recognized from a class. Even though I only got a C plus (believe me, I was grateful) in my Science of Psychology class, much of the information has informed my non-academic writing ever since. And when I took my son to Normandy last year, it had new meaning due to the course I had just completed on World War Two and Memory.

Of course, there are the social issues: realizing you may feel like your fellow students, but you sure don’t look like them anymore. Remember that other nightmare you have, the one where you are the last picked for every team? Welcome to my life. My three partners rolled their eyes when I was assigned to their group for an excursion to Coney Island and the subsequent 20-page report. But they quickly warmed to the idea when they learned I had a car and was adept at editing. My friend Robin, who returned to college in her 40s, admitted that almost as gratifying as graduating magna cum laude was the moment a fellow student told her, “You’re just like one of us.”

The regular students may take a while to warm to us, but the schools themselves are seeing the gains in the returnees: “Such students bring experience and motivation that enriches the whole community,” says Judson Shaver, president of Marymount Manhattan College. “Typically, they set a high bar for their young colleagues. MMC has had many such students, some of whom have gone on to serve as Trustees.”

As for the professors, they tend to have mixed feelings and strict agendas and are sometimes just not that into you. “Older students know what they want, which is good,” says Andre Aciman, author and professor at the Graduate Center at the City University of New York. “But there can be issues. Many grew up in the ‘60s when they learned to talk back and say what they think. There is much more decorum now in classrooms. It can also be a problem when they think they know more than the teacher.”

I am trying my best. I have learned to raise my hand again. I have learned to hold my tongue, even if it means not saying I found a mistake in the syllabus, that my father funded the release of the My Lai massacre report being discussed, or that I dated one of the authors of the Port Huron Statement.

Bottom line? Being back in college is really rigorous, and there’s not a day I don’t consider giving up, convinced that I am not now and will never be a critical thinker. But then—it’s fun to hear my husband brag that he’s the only one in his family of four not currently getting a college degree. That student I.D. is saving me a lot of money at the museums. I can opt out of really boring social events because I have to study for my midterms. And I am getting the chance to do something over, now that I am at the right time and place to savor it.
