By Meeri Kim
Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible?
A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron.
“I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’ ”
Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume.
Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance.
“It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.

James Hamblin
People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias.
There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted, or extroverted, based entirely on facial structure. There is strong agreement across studies as to which facial attributes signal what to people, as illustrated in renderings throughout this article. But it's, predictably, not at all so simple.
Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism today, in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."

David DiSalvo @neuronarrative
One of the lively debates spawned from the neuroscience revolution has to do with whether humans possess free will, or merely feel as if we do. If we truly possess free will, then we each consciously control our decisions and actions. If we feel as if we possess free will, then our sense of control is a useful illusion—one that neuroscience will increasingly dispel as it gets better at predicting how brain processes yield decisions.
For those in the free-will-as-illusion camp, the subjective experience of decision ownership is not unimportant, but it is predicated on neural dynamics that are scientifically knowable, traceable, and—in time—predictable. One piece of evidence supporting this position comes from neuroscience research showing that the brain activity underlying a given decision occurs before a person consciously apprehends the decision. In other words, the thought patterns leading to conscious awareness of what we’re going to do are already in motion before we know we’ll do it. Without conscious knowledge of why we’re choosing as we’re choosing, the argument follows, we cannot claim to be exercising “free” will.
Those supporting a purer view of free will argue that whether or not neuroscience can trace the brain activity underlying decisions, making the decision still resides within the domain of an individual’s mind. In this view, parsing unconscious and conscious awareness is less important than the ultimate outcome—a decision, and subsequent action, emerging from a single mind. If free will is drained of its power by scientific determinism, free-will supporters argue, then we’re moving down a dangerous path where people can’t be held accountable for their decisions, since those decisions are triggered by neural activity occurring outside of conscious awareness. Consider how this might play out in a courtroom in which neuroscience evidence is marshalled to defend a murderer on the grounds that he couldn’t know why he acted as he did.