The crime was brutal. On November 4, 1989, after a night of heavy drinking, David Scott Detrich and a male coworker picked up a woman walking along the side of the road in Tucson, Arizona. After scoring some cocaine, the trio went back to her place, where, according to court documents, Detrich slit the woman’s throat and stabbed her 40 times. Later, the two men dumped her body in the desert.

A jury convicted Detrich of kidnapping and first-degree murder in 1995, and a judge sentenced him to death.

Detrich is still on death row today as the appeals process drags on, but in 2010, his lawyers achieved a victory of sorts. They claimed that Detrich had received “ineffective assistance of counsel” at his trial, because his original legal team had failed to present evidence of neuropsychological abnormalities and brain damage that might have swayed the court to give him a lesser sentence. A federal appeals court agreed. The ruling said, in effect, that Detrich had been denied his Constitutional right to a fair trial because his lawyers hadn’t called an expert witness to talk about his brain.

That judicial opinion is just one of nearly 1,600 examined in a recent study documenting the expanding use of brain science in the criminal-justice system. The study, by Nita Farahany at Duke University, found that the number of judicial opinions that mention neuroscientific evidence more than doubled between 2005 and 2012.

“There are good reasons to believe that the increase in published opinions involving neurobiology are just the tip of the iceberg,” says Owen Jones, a law professor at Vanderbilt who directs the MacArthur Foundation Research Network on Law and Neuroscience. The vast majority of criminal cases never go to trial, Jones says, and only a small fraction of those that do result in written opinions. The rest are virtually impossible to track.

It’s a trend that makes some neuroscientists uneasy, as they see the potential for their findings to be misused in court. But like it or not, Farahany’s findings suggest, neuroscience is already entrenched in the U.S. legal system.

A handful of cases have made headlines in recent years, as lawyers representing convicted murderers have introduced brain scans and other tests of brain function to try to spare their clients the death penalty. It didn’t always work, but Farahany’s analysis suggests that neuroscientific evidence—which she broadly defines as anything from brain scans to neuropsychological exams to bald assertions about the condition of a person’s brain—is being used in a wider variety of cases, and in the service of more diverse legal strategies, than the headlines would suggest. In fact, 60 percent of the cases in her sample involved non-capital offenses, including robbery, fraud, and drug trafficking.

Cases like Detrich’s are one example. Arguing for ineffective assistance of counsel is pretty much a legal Hail Mary. It requires proving two things: that the defense counsel failed to do their job adequately, and (raising the bar even higher) that this failure caused the trial to be unfairly skewed against the defendant. Courts have ruled previously that a defense attorney who slept through substantial parts of a trial still provided effective counsel. Not so, at least in some cases, for attorneys who failed to introduce neuroscience evidence in their client’s defense.

Farahany found 516 claims in which neuroscience evidence was used to argue for ineffective assistance of counsel (a claim, in the legal sense, is a formal demand or assertion). Only 29 percent of these claims were successful, but that’s still a surprisingly high number, she says: “The basic rap on IAC is it almost always fails.”

Neuroscience evidence was also used in 395 competency claims, which are claims regarding a defendant’s mental wherewithal to stand trial, waive their rights, or enter a plea.

In the 2008 case of Miguel Angel Ruiz, a 17-year-old accused of killing his mother, a jury in California apparently was unmoved by two neuropsychologists, who testified that Ruiz had severe language deficits resulting from a brain injury. The jury rejected the claim that Ruiz was not competent to stand trial, but the trial judge disagreed. “I’m not inclined to set aside a jury’s decision lightly or unadvisedly,” he wrote in his decision to do just that. “I just couldn’t believe the jury could return a finding of competency based on the evidence I heard.”

In the Ruiz case, it seems the neuroscience evidence had an impact. The effect on Detrich’s case, which is still bouncing around the courts, is less clear. Overall, it’s hard to know how influential this type of evidence really is, Farahany says. Each case is unique, which makes it difficult to do a fair comparison of outcomes. Even so, she found that in cases that used neuroscientific evidence, defendants got a favorable outcome—a reduced sentence, a new hearing, or some other kind of break—about 20 to 30 percent of the time. That’s considerably better than the 12 percent reported in a 2015 paper that looked at an entire year’s worth of appeals in criminal cases in the U.S.—nearly 70,000 of them. “Percentage-wise, people seem to do a little better with neuroscience than they do without,” Farahany says.

Courts have always concerned themselves with the mental states of individuals, says Francis Shen, a law professor at the University of Minnesota. What’s changing is the influence of modern neuroscience on how we talk about these things. “The logic is the same, but the stuff a jury or judge will hear is now technical in a way it didn’t used to be,” Shen says.

The big question, of course, is whether this growing influence will end up being good or bad for justice. Brain scans or other biomarkers that could more definitively diagnose mental illness would indeed be a boon for the legal system, says Shen. But while scans today can easily pick up signs of major brain damage, scans that reliably pick up the more subtle signs of schizophrenia, for example, are still a ways off. “Once it becomes clinically relevant, it will become legally relevant,” Shen says.

It gets murkier, though, for biomarkers of legally relevant mental characteristics like impulsiveness. One problem is that most neuroscience research involves making comparisons of brain scans collected from dozens of people, while most court cases are concerned with a specific individual. “Seeing something on an individual's scan does not tell us they have impulse-control problems, it merely raises the likelihood by some unknown amount,” says Martha Farah, a cognitive neuroscientist at the University of Pennsylvania.

This is a tension that seems likely to grow. The scientific and legal communities have different methods and standards for evaluating evidence. Scientists have a culture of objectivity and a statistical mindset when it comes to uncertainty. The evidence they’re dealing with often comes from experiments of their own design.

In the courtroom, on the other hand, the evidence comes from real life. It’s messy and incomplete. And when someone’s liberty—or life—hangs in the balance, there’s no time to wait for more data to come in.
