We're Only Human...
A blog by Wray Herbert

"I feel your disease"
April 21, 2010

Make no mistake. Flu season isn't over. I was in my doc's office the other day for something routine, and he's still pushing the H1N1 vaccine. I don't know why the other patients were in the waiting room, but I do know a few of them were sniffling and sneezing. There was a video on the TV about the importance of hand washing during flu season, to prevent the spread of germs. My throat started to feel a little sore.

I'm fine. I apparently escaped without exposure to anything sickening. But my mind was on high alert the entire time I was there. Such waiting room vigilance is not unusual, and indeed has long been recognized as a kind of behavioral immune system. Simply seeing signs of disease triggers thoughts and emotions that motivate us to take extra precautions around any possible contagion.

And it may trigger our bodies as well, according to new research from the University of British Columbia. Psychologist Mark Schaller and his colleagues suspected that psychological defenses might be just part of a broader immune response—one involving the natural killer cells and cytokines and other biochemical defenses that fight off invading germs. They decided to test this idea—by seeing if they could trick healthy bodies into action. Here's the study:

The scientists recruited healthy men and women and had them watch slide shows. All of the volunteers watched a 10-minute slide show about furniture; this was deliberately boring, to act as a control condition. Then, a bit later, half the volunteers watched a fairly disgusting slide show, with images of skin lesions and oozing pox, in addition to garden-variety sneezes and coughs. The other volunteers watched a slide show about guns—not just guns, but people brandishing firearms, and mostly pointing the weapons directly at the viewer.

The guns were important, because guns are very threatening—especially when they're aimed at you—but they're not related at all to disease or infection. The scientists wanted to rule out threat—any threat—as the cause of any immune response they measured. And that's just what the results showed. They drew blood from the volunteers before and after each slide show, and measured the levels of a cytokine called IL-6, a major fighter in the immune war. Those who had viewed the depictions of sickness showed a dramatic jump in IL-6 production—more than 23 percent. These same volunteers had no biological response to looking at furniture and—more important—the volunteers who looked at brandished weapons also showed no significant immune response.

One possible interpretation of these results is that looking at pox and sores is stressful, and that the stress triggered the immune response. But the scientists ruled that out. They measured self-reported stress, and in fact those who had watched the guns were under more—not less—stress.
The psychologists also ruled out personality as an explanation: They measured traits like neuroticism and agreeableness, as well as the volunteers' perceived vulnerability to illness, and none of these traits distinguished the gun viewers from the disease viewers. The only explanation, it appears, is that simply seeing other people's sickness prompted the volunteers' immune systems to act as if they were under attack.

Is this a good thing? Perhaps not as good as it sounds. As the researchers explain this week in the on-line version of Psychological Science (http://pss.sagepub.com/content/early/2010/04/02/0956797610368064.full.pdf+html), a direct link between perception and immune response may have helped our ancient ancestors respond quickly and efficiently to pathogens. It may even have helped us evolve as a social species by permitting early humans to gather in groups. But that doesn't mean it's still a good thing. All sorts of social cues can today mimic actual disease threats, causing the immune system to respond aggressively even when there is no real threat. Too many false starts could compromise immune function over time, with serious consequences for human health and welfare.

Excerpts from "We're Only Human" appear regularly in The Huffington Post and Scientific American Mind. Wray Herbert also writes the "Full Frontal Psychology" blog at True/Slant (http://www.trueslant.com/wrayherbert/).

The ironic power of stereotype
April 20, 2010

Brent Staples is an editorial writer for the New York Times and a University of Chicago-trained psychologist. He is also African-American, and back in the 70s, when he was doing his graduate studies, he discovered that he could threaten white people simply by walking down the streets of his Hyde Park neighborhood. When white couples saw him coming, especially at night, they would lock arms, stop all conversation, and stare straight ahead. Sometimes they would cross to the other side of the street.

The white Chicagoans were obviously being influenced by the stereotype of the dangerous young black man. But the more sinister effects of the stereotype were on Staples himself. At first he played with this new-found power, deliberately using it to "scatter the pigeons." But he also felt guilty about discomfiting innocent strangers, and ultimately he figured out a way to defuse his own potent symbolism. He did this simply by whistling—whistling Vivaldi. Somehow, whistling the sweet refrains of the Venetian composer's Four Seasons was enough to trump the stereotype and put the neighbors at ease.

But Staples wasn't at ease. Whether he was exploiting the stereotype or resenting it or actively countering it, it was on his mind, distracting him from other matters.
Social psychologist Claude Steele borrows from Staples's experience for the title and central metaphor of his new book, Whistling Vivaldi (W.W. Norton; http://books.wwnorton.com/books/Whistling-Vivaldi/), an illuminating tour through many years' work on stereotypes and "stereotype threat." Stereotypes are rampant in society, Steele argues, but his purpose here is not to whine about the unfairness of these caricatured views. Instead, he takes us inside his and others' labs to show precisely how stereotypes commandeer the mind and do their psychological damage.

Steele, who is also African-American, is especially interested in performance—in school, sports, and the workplace—and indeed his work began with his curiosity about the sub-par performance of even the best African-American university students. He has a theory about academic failure, which basically goes like this: Even in the absence of overt racism, stereotypes about unintelligent African-Americans are always "in the air." That is, African-American students are aware of these common caricatures, and this awareness makes them anxious—anxious about reinforcing the group stereotype and contributing to its legitimacy. This anxiety, through a variety of physiological pathways, actually depletes the students' cognitive reserves—leading, ironically, to exactly the poor academic performance that the stereotype predicts.

Steele marshals study after study to demonstrate the power of such stereotype threat. In a typical experiment, for example, he had both white and African-American students take a rigorous test, but beforehand he told only some of the students that it was a test of intelligence; the others believed it was a test of no particular importance. The African-American students who thought their intelligence was being assessed, and compared to white intelligence, did much worse on the exam—worse than the whites and worse than the African-Americans who were under no stereotype threat.

And it's not just African-Americans who suffer under stereotype threat. If women believe they are being compared to men in math, they indeed perform worse on math tests. If white men are told that their natural athletic ability is being measured, they choke in a golf contest against African-American golfers; but if they're told that their golf acumen is being tested, they outperform African-Americans. Indeed, fifteen years of such studies have demonstrated the effects of stereotype threat in Latinos, third-grade schoolgirls, Asian-American students, U.S. soldiers, female business students, older Americans, German and French students, and aspiring psychologists. The list goes on.

Steele's unique contribution is taking us inside the mind of the stereotype victim, and it's not a pretty sight. When we're unnerved by an unsavory caricature, our minds race; we're vigilant; we're arguing internally against the stereotype, denying its relevance, disparaging anyone who would use such a stereotype, pitying ourselves, trying to be stoic. In short, we're doing everything except high-level thinking—the kind that leads to academic excellence. We've channeled our limited cognitive power into dealing with the threatening caricature.

Steele ends Whistling Vivaldi with prescriptions for countering the effects of stereotype threat—creating self-affirming narratives, for example, and mind-sets that emphasize growth and change rather than fixed abilities.
These are proven strategies for creating "identity safety," but they need to begin early in children's lives. Ignoring the perils of stereotypes is just another way of whistling in the dark.

Excerpts from "We're Only Human" appear in The Huffington Post and Scientific American Mind. Wray Herbert's book, On Second Thought: Outsmarting Your Mind's Hard-Wired Habits, will be published by Crown in September.

Dog tired: What our hounds can teach us about self-control
April 12, 2010

We humans have much more self-discipline than other animals. We can and do set goals—losing 25 pounds, going to college—and then go without certain pleasures to achieve those goals. We're far from perfect at this, but there's no question that better self-control sets us apart from more lowly beasts.

Scientists have long argued that delaying gratification requires a sense of "self." Having a sense of personal identity allows us to compare what we are today, at this very moment, with what we want to be—an idealized self. Aspiring to this idealized self is what fosters uniquely human self-control powers.

Well, maybe—or maybe not. New research is now suggesting a much more primitive explanation for our powers of self-discipline—one that brings us down a notch or two in the animal kingdom. Indeed, it appears that, even with our lofty goals, we may rely on the same basic biological mechanism for self-discipline as our four-legged best friends. Here's the science.

Psychological scientist Holly Miller and her colleagues at the University of Kentucky knew from previous research that human self-control relies on the brain's "executive" powers, which coordinate thought and action. It's further known that this kind of cognitive processing is fueled by glucose, and that depletion of the brain's fuel supply compromises self-discipline. But is this a uniquely human system? Or do less evolved animals rely on sugar-powered executive powers as well?

To find out, Miller recruited a group of dogs ranging in age from ten months to more than ten years old. Some were pure breeds, like Australian shepherds and Vizslas, while others were mutts. The dogs were all familiar with a toy called a Tug-a-Jug, which is just a clear cylinder with treats inside; dogs can easily manipulate the Tug-a-Jug to get a tasty payoff. In the experiment, some of the dogs were ordered by their owners to "sit" and "stay" for ten minutes. That's a long time to sit still; it was meant to exhaust the hounds mentally, and thus to deplete their fuel reserves. The other dogs, the controls, merely sat in a cage for ten minutes.

Then all the dogs were given the familiar Tug-a-Jug, except that it had been altered so that it was now impossible to get the treats out.
The hungry dogs could see and hear the treats—but not get at them. The idea was to see if the previous demand for self-discipline made the dogs less, well, dogged in working for the treats. And it did, unmistakably. Compared to the dogs who had simply been caged, those who had stayed still for ten minutes gave up much more quickly—after less than a minute, compared to more than two minutes for the controls. In other words, exerting self-discipline had used up much of their sugar supply—and weakened the executive powers needed for goal-directed effort.

Executive powers? In old Shep? These findings suggest that self-control may not be a crowning psychological achievement of humanity, and indeed may have nothing to do with self-awareness. It may simply be biology—and beastly biology at that. These are humbling results, so the scientists decided to recheck them in a different way. In a second experiment, they recruited another group of dogs, this time including Shetland sheepdogs and border collies. As before, some of the dogs sat and stayed for ten minutes while the others were caged. But this time, half of the obedient dogs got a sugar drink following the exercise, while the others got an artificially sweetened drink. Miller basically wanted to see if she could restore the dogs' executive powers by refueling them.

Which is exactly what happened. As reported in the April issue of the journal Psychological Science (http://pss.sagepub.com/content/21/4/534.full.pdf+html), the dogs who exerted self-control, then got replenished with sugar, were just like the dogs who had not been exhausted to begin with. They persisted just as long with the Tug-a-Jug, even though it was frustrating and demanding to do so. The depleted dogs who were not replenished gave up in short order. In short, they all acted just like humans.

So we're not unique—at least not in this regard. It appears that the hallmark sense of human identity—our selfhood—is not a prerequisite for self-discipline. Whatever it is that makes us go to the gym and save for college is fueled by simple sugar—much like our hound's decision to sit still and stay.

Articles from "We're Only Human" also run regularly in The Huffington Post and in Scientific American Mind. Wray Herbert's book, On Second Thought: Outsmarting Your Mind's Hard-Wired Habits, will be published by Crown in September.

How to read minds like a wizard
April 7, 2010

Fans of the Harry Potter books will be familiar with the art of Legilimency. Legilimency is an advanced form of wizardry, the supernatural ability to coax thoughts and feelings and memories from another's mind. It's a magical skill encompassing mind reading and lie detection—and it's black magic in the wrong hands. Dumbledore, headmaster of Hogwarts, is a master Legilimens, as are the evil Snape and Voldemort. Harry never quite masters the difficult craft.

Many of us Muggles wouldn't mind a touch of telepathy from time to time—though for much more ordinary purposes.
Wouldn't it be helpful to know—to really know—what your colleagues are thinking about that paper you just presented? Or how about that blind date? Did she find you witty? Attractive? Foolish? Humans are actually very bad at mind reading. Indeed, studies have shown that we do no better than chance when intuiting how much people like us.

Well, it may now be possible to do better than that. We may not have supernatural powers, but we do have untapped cognitive powers that might be harnessed to help us more accurately assess what others think of us. Two psychological scientists have been exploring why we misinterpret others' thoughts so often, and they have been using these insights to construct a tool for ordinary, everyday telepathy.

Tal Eyal of Ben-Gurion University in Israel and Nicholas Epley of the University of Chicago started with "construal theory." That's just psychological jargon meaning that we perceive different people and things in our world at different levels of detail. Think of two houses; you're standing in the yard right next to one of them, and the other is on a hill a quarter mile away. The distant house is only a vague outline; it's got two stories, a pitched roof, windows, and a door. By contrast, you see the house next to you in all its detail, right down to the marigolds in the flower boxes and the chipped green paint on the shutters.

And this is how we construe ourselves and others as well—which is why we have so much trouble reading minds. We see ourselves in all our glorious (or inglorious) detail, so we assume that others do as well. But in fact others see us as off in the distance, drawn only in broad strokes. Eyal and Epley figured that if we can somehow manage to take the long view of ourselves—the view that others routinely take—then we might be able to get a more accurate sense of what others think and feel about us.

Here's how they tested this idea in the laboratory. They had each member of a large group of volunteers pose for a photograph, which was displayed on a computer screen. The volunteers were told that someone of the opposite sex would be rating their attractiveness—not unlike a blind date. But some were told that they would be judged later that day, while others were told that the judging wouldn't take place for several months. This was the laboratory equivalent of psychological distance, which the scientists anticipated would determine how people read the minds of their judges.

To find out, those in the beauty contest were asked to write down how they expected the other person to describe the photograph—and how that person would rate their attractiveness. And other volunteers—the judges—did in fact do this, describing the photos and rating the posers' looks. And what did the scientists find? Those who didn't expect to be judged for several months were much more accurate in "mind reading" others' opinions and ratings. That's because imagining themselves as psychologically distant brought them more in sync with the reality of how people see other people.
Those who anticipated having their looks judged that very afternoon guessed that their judges would be much pickier and more critical than they were in fact. They expected (wrongly) to be put under a microscope.

It's important to note that the judges' opinions didn't change. People always see others in general and abstract ways. What changed were the opinions ascribed to the judges—the mind reading. The actual descriptions are telling. For example, those who were close (in time and psychologically) expected to be described in immediate and close detail—pony tail, weary eyes—whereas in fact the judges were quite general in their descriptions—Asian, slender, wears glasses or doesn't. Much like the near and distant houses.

The researchers ran another version of this experiment, but this one focused on general impressions rather than looks. In this study, volunteers talked into a microphone for 2½ minutes, describing themselves in great detail—their education and hobbies and family and dreams. They knew that others would be listening to this recording and forming an impression of them, but again the distancing varied: As before, some thought they would be evaluated later in the day, while others thought that would occur months later.

The results were basically the same as before. As reported on-line in the journal Psychological Science (http://pss.sagepub.com/content/early/2010/03/31/0956797610367754.full.pdf+html), those who had more psychological distance from themselves had a much more realistic sense of how others saw them. They were able to see the "big picture" rather than focusing on trivial flaws and defects that only a microscope can detect. In short, they were better mind readers.

This is not simply putting oneself into someone else's shoes. The scientists emphasize that, and indeed they ran a test to compare construal-based thinking to mere perspective-taking. Perspective-taking didn't match up. That's because being in another's shoes is not a scientific concept; it's not based on any understanding of human cognition. Psychological distancing is. And as these experiments show, it can be a powerful cognitive tool for everyday telepathy. It may not be Legilimency, but it's not bad for mere Muggles.

Excerpts from "We're Only Human" also appear in The Huffington Post and Scientific American Mind. Wray Herbert's book, On Second Thought: Outsmarting Your Mind's Hard-Wired Habits, will be published by Crown in September.

Knockoff psychology: I know I'm faking it
April 5, 2010

Within just a few blocks of my office, street vendors will sell me a Versace t-shirt or a silk tie from Prada, cheap. Or I could get a deal on a Rolex, or a chic pair of Ray-Ban shades. These aren't authentic brand-name products, of course. They're inexpensive replicas. But they make me look and feel good, and I doubt any of my friends can tell the difference.

That's why we buy knockoffs, isn't it? To polish our self-image—and broadcast that polished version of our personality to the world—at half the price?
But does it work? After all, we first have to convince ourselves of our idealized image if we are going to sway anyone else. Can we really become Ray-Ban-wearing, Versace-bedecked sophisticates in our own minds—just by dressing up?

New research suggests that knockoffs may not work as magically as we'd like—and indeed may backfire. Three psychological scientists—Francesca Gino of the University of North Carolina at Chapel Hill, Michael Norton of Harvard Business School, and Dan Ariely of Duke—have been exploring the power and pitfalls of fake adornment in the lab. They wanted to see if counterfeit stuff might have hidden psychological costs, warping our actions and attitudes in undesirable ways.

Here's an example of their work. The scientists recruited a large sample of young women and had them wear pricey Chloe sunglasses. The glasses were the real thing, but half the women thought they were wearing knockoffs. The researchers wanted to see if wearing counterfeit shades—a form of dishonesty—might actually make the women act dishonestly in other ways.

So they had them perform a couple of tasks—tasks that presented opportunities for lying and cheating. In one, for example, the women worked on a complicated set of mathematical puzzles—a task they couldn't possibly complete in the time allowed. When time elapsed, the women were told to score themselves on the honor system—and to take money for each correct score. Unbeknownst to them, the scientists were monitoring both their work and their scoring.

And guess what. The women wearing the fake Chloe shades cheated more—considerably more. Fully 70 percent inflated their performance when they thought nobody was checking on them—and in effect stole cash from the coffer.

To double-check this distressing result, the scientists put the women through a completely different task, one that forced a choice between the right answer and the more profitable answer. And again the Chloe-wearing women pocketed the petty cash. Notably, the women cheated not only when they expressed a preference for the cheap knockoffs, but also when the real and fake designer glasses were randomly handed out. So it appears that the very act of wearing the counterfeit eyewear triggered the lying and cheating.

This is bizarre and disturbing, but it gets worse. The psychologists wondered if inauthentic image-making might not only corrupt personal ethics, but also lead to a generally cynical attitude toward other people. In other words, if wearing counterfeit stuff makes people feel inauthentic and behave unethically, might they see others as phony and unethical, too? To test this, they again handed out genuine and counterfeit Chloe shades, but this time they had the volunteers complete a survey about "someone they knew." Would this person use an express line with too many groceries? Pad an expense report? Take home office supplies? There were also more elaborate scenarios involving business ethics. The idea was that all the answers taken together would characterize each volunteer as having a generally positive view of others—or a generally cynical view.

Cynical, without question. Compared to volunteers who were wearing authentic Chloe glasses, those wearing the knockoffs saw other people as more dishonest, less truthful, and more likely to act unethically in business dealings.

So what's going on here?
Well, the scientists ran a final experiment to answer this question, and here are the ironic results they report on-line this week in the journal Psychological Science: Wearing counterfeit glasses not only fails to bolster our ego and self-image the way we hope; it actually undermines our internal sense of authenticity. "Faking it" makes us feel like phonies and cheaters on the inside, and this alienated, counterfeit "self" leads to cheating and cynicism in the real world.

Counterfeiting is a serious economic and social problem, epidemic in scale. Most people buy these fake brands because they are a lot cheaper, but this research suggests there may be a hidden moral cost yet to be tallied.

Excerpts from "We're Only Human" also appear in The Huffington Post and Scientific American Mind. Wray Herbert's book, On Second Thought: Outsmarting Your Mind's Hard-Wired Habits, will be published by Crown in September.

Do you really need those eyeglasses?
April 2, 2010

Most of us use the numbers 20/20 unthinkingly, basically as a synonym for good vision. We take it on faith that 20/20 is an accurate measure of some biological reality. But how straightforward is visual acuity in fact? After all, those eye charts in your optometrist's office measure not only the sharpness of the image on your eye's retina, but also your brain's interpretation of that information. How much liberty does the interpreting mind take with this biological reality?

New research is beginning to focus on the psychological dimensions of vision—with some surprising results. The studies are from the Harvard University laboratory of Ellen Langer, whose books Mindfulness and Counterclockwise challenge many of our assumptions about our physical limitations—especially the limitations we associate with aging. In the new studies, Langer and her colleagues manipulated various beliefs about vision to see if mind-set can affect something as basic as eyesight.

Langer's experiments are always innovative. In one of the vision studies, for example, she started with the widespread belief that Air Force pilots have excellent vision. That's not an unfounded belief in fact, because 20/20 vision is a prerequisite for fighter pilot training. To exploit this belief, she recruited a group of students from MIT's ROTC program, many of whom aspired to be pilots.
She tested their vision with standard eye charts, and then asked some of the volunteers to "become pilots" by flying a flight simulator. She specifically instructed them to actively imagine themselves as pilots as they used the throttle, compass, and other trappings of an actual cockpit to execute flight maneuvers. They even wore green army fatigues to enhance their role-playing.

No mention was made of vision, neither to the "pilots" nor to the controls, who merely sat in a stationary cockpit. After a short time, Langer surreptitiously measured all the volunteers' vision. She had four aircraft "approach" from the front, each with a serial number on the wing. The volunteers were told to read the serial numbers on the four wings, which, unbeknownst to them, were the equivalent of different lines on an eye chart. Langer was in effect administering the optometrist's standard eye exam, under the guise of flight simulation.

And what did she find? Unmistakably, the "pilots" showed greater improvement in vision. Four of ten volunteers could see better after playing pilot, compared to none of the controls. Langer reran this experiment, in one case telling the controls they could motivate themselves to have better vision and in another actually giving them eye exercises. But the pilots still outperformed them. In other words, simply believing that pilots have good vision was enough to sharpen the volunteer-pilots' eyesight.

This was obviously an elaborate experiment, and the number of volunteers was necessarily small. So Langer decided to explore the question in a completely different way. In a second experiment, she exploited the belief that athletes have good vision—again not an unreasonable assumption, since vision generally enhances coordination. To test this idea, she measured the eyesight of a larger group of volunteers, then had some of them do jumping jacks while others simply skipped around the room. She wanted all of the volunteers to be equally aroused physically, but she figured that, psychologically, jumping jacks would be seen as more athletic than skipping. And indeed, when she retested their eyesight, the results echoed those from the pilot study. Fully a third of the volunteers had better vision after acting athletically; only one of the skippers showed such improvement.

Now keep in mind that the volunteers did not in fact differ at all in athleticism. All that differed was their psychological mind-set, as a result of jumping or skipping. And it appeared that psychology was enough to sharpen their view of the world.

Langer ran a final experiment, this one using the actual optometrist's eye chart—or versions of it. She wanted to test the power of two common beliefs that most of us take with us when we have our eyes examined: one, that it will be easy to read the top lines of the eye chart; and two, that it will be increasingly difficult to read farther down the chart. I think it's fair to say that most adults share those beliefs.

But what if the chart is switched around? That's what Langer did. She created two eye charts that looked in most ways like the standard chart, except for this: In one case, the letters became not smaller but progressively larger moving down the chart. In the other, the chart started not with the huge E, but with a line that would normally be about two-thirds of the way down.
In other words, she administered eye exams that exploited fundamental assumptions about optometrists' eye charts.

And again, psychology trumped biology. As reported on-line this week in the journal Psychological Science (http://pss.sagepub.com/content/early/2010/03/19/0956797610366543.full.pdf+html), the volunteers saw letters that they normally couldn't see when the chart was shifted or reversed. They believed they would be able to read the top of the chart, and so they did—regardless of the actual font size. Taken together, these experimental results suggest that our vision may be compromised, at least in part, by our mindless beliefs.

Wray Herbert also writes regularly for The Huffington Post, where this article first appeared. His book, On Second Thought: Outsmarting Your Mind's Hard-Wired Habits, will be published by Crown in September.

American restlessness, American unhappiness?
March 30, 2010

Imagine you are a high school basketball player, and a pretty good one. You are a senior, and right now you are the starting point guard for the Rochester Eagles. Last year you started for the Lexington Cougars, in a different state, and the year before that you played the same position for yet another squad, the Flyers of Pottsville. Your family moves a lot because of your father's work, but you've managed to win a spot on the local team wherever you land.

So how do you think of yourself at the moment? Do you identify yourself as a proud Rochester Eagle? Or do you think of yourself as simply a talented point guard?

Well, if you're like most people, you think of yourself primarily as a journeyman point guard, not as a member of the Eagles—or of any local team, for that matter. That's because you've learned from experience that group membership doesn't last; teams and communities are fleeting. What endures are your grit, and your leadership skill, and your fast hands. In short, you.

This example comes from the work of University of Virginia psychological scientist Shigehiro Oishi, who has for some years been studying the mental and emotional consequences of residential mobility. America is one of the most mobile societies in the world, which means that lots of people are living different versions of the itinerant hoopster's experience. Surprisingly, psychologists have not paid much attention to this common American experience. But as Oishi's studies are showing, mobility shapes everything from our sense of identity to our friendships—and even our happiness.

It all starts with a basic sense of self. Oishi studied a large sample of American college students, some of whom had moved around a lot before college and others of whom had pretty much stayed put.
When he asked these students to describe themselves—their most important attributes—the itinerants were much more likely to mention personal traits, while the less mobile students were more apt to mention important group affiliations. In fact, the mobile students didn't belong to many groups; they weren't joiners. And this tendency weakened their overall sense of community identity.

Mobility appears to affect the nature of friendship as well, in a variety of ways. In one study, for example, college freshmen who had moved around a lot reported having more friends—as measured by their Facebook friendships—and they also added more new friends after arriving on campus. But it's not just the size of the social networks, Oishi has found. Mobile Americans are more likely to form "duty-free" relationships, without the deep sense of social obligation that characterizes traditional friendships. Duty-free friendships are based more on shared interests and similarities of personality than on group membership.

So who's happier, those who ramble or those who stay close to home? One would guess that more mobile people might be happier, since that's why many people move—to find a new life, perhaps a better job or a safer community. But the results are more mixed than that. As Oishi describes in the journal Perspectives on Psychological Science (http://pps.sagepub.com/content/5/1/5.abstract), adults who move often for work feel they have more interesting lives and are more satisfied with their marriages and family life. But itinerant adults also report more frequent health issues, like stomach aches and shortness of breath, than do less mobile adults. It's possible that when people pull up stakes for a better life, they overestimate the novelty and opportunity of moving, and underestimate the social disruption and its consequences.

The stomach aches and other ailments may be the tip of the iceberg. When Oishi analyzed a decade of data from 7,000 adults, he found that those who moved frequently in childhood were more likely to have died during the course of the study. Perhaps unsurprisingly, introverts suffered more from the negative consequences of mobility, including increased mortality. In short, the American pattern of residential mobility may have a dark side that has yet to be fully revealed.

When the French social critic Alexis de Tocqueville traveled in the U.S. in the 1830s, he was struck by Americans' restlessness, even in the midst of their prosperity. He was also struck by the "cloud" that darkened many American faces. This sadness, he believed, was explained by the fact that Americans are constantly thinking about the good things they might be missing.

Tocqueville didn't have the advantage of modern genetics to help him understand the paradoxical American character. Today we know that nations founded by immigrants—like the United States and Australia—have much higher rates of mobility than older nations, such as China and Germany. Population geneticists now believe that these national differences might be explained by the genetic distribution of personality traits, and indeed a cluster of novelty-seeking genes has been found in populations that have migrated long distances. It's possible that these genes were adaptive when Americans were a migratory people.
Whether or not they remain adaptive is an open question.

Versions of "We're Only Human" appear in the Huffington Post and Scientific American Mind. Wray Herbert's book, On Second Thought: Outsmarting Your Mind's Hard-Wired Habits, will be published by Crown in September.

Fast food, racing thoughts
March 25, 2010

Fast food is unhealthy.

I know, I know. Few of us need convincing of that fact any more. But as unassailable as it is, the brief against fast food has for years focused almost entirely on the food in fast food—the high-fructose corn syrup and artery-busting fats and nutritional bankruptcy of burgers and French fries and soft drinks. But what about the fast in fast food?

New science is now suggesting that fast food may be doubly unhealthy—not only nutritionally damaging but psychologically detrimental as well. Indeed, the Colonel and the Golden Arches and the rest of America's fast-food culture may be unconsciously triggering a general impatience with life that leads to wrongheaded decisions going way beyond food. In short, fast food may lead to fast and frenzied live-for-today lifestyles that may be just as unhealthy as bad cholesterol.

At least that's the theory, which psychologists Chen-Bo Zhong and Sanford DeVoe of the University of Toronto have been exploring using an idea called behavioral priming. This is just a jargony way of saying that cues in our everyday world subliminally spark ideas, which in turn shape our behavior. The Toronto scientists wondered if symbols of our ubiquitous fast-food culture might spark thoughts of time pressures and efficiency—and cause us to act urgently and impatiently.

Here's an example of how they tested this notion in the laboratory. They recruited a large group of volunteers to perform a computer task. The task involved an image at the center of the screen, but other images also flashed very rapidly on the periphery of the screen—so rapidly that the conscious mind could not possibly notice them. Some of the volunteers "saw" familiar fast-food logos—KFC, Taco Bell, McDonald's, and so forth—while others simply saw neutral images.

After this priming, all the volunteers were told to read a short descriptive prose passage. Unbeknownst to them, the researchers were timing them—in order to see if the unconscious thoughts of fast food caused them to read faster. And they did. Even though they were told to take as much time as they liked, those thinking of fast food read much faster than the controls—and faster than they did without any unconscious priming. In other words, the Golden Arches and similar symbols made them feel time pressure where there was none.

Now let's be clear. Sometimes urgency and deadlines are appropriate and needed.
We read quickly when we are taking a timed exam, for example, just as we walk quickly when we need to be somewhere soon. So speed is not in itself bad. But this was like speed-reading Emily Dickinson; it doesn't make any sense. And in fact it's unhealthy: One measure of Type A personality is speed and impatience in leisure activities like eating and walking and reading.

These findings were intriguing, but the psychologists wanted to reexamine the question in a different way. So in a second experiment, they again used fast-food imagery to prime volunteers' unconscious thoughts of time and urgency. But this time the volunteers rated the desirability of common household products, only some of which were time-saving products. For example, the volunteers might choose a four-slice toaster or a single-slice toaster; a two-in-one shampoo or a regular shampoo. And so forth. The idea was to see if those primed with fast-food imagery were more likely to pick an efficient product than were the others. And that's exactly what they found: Memories of Big Macs sparked a generalized impatience, which in turn increased the desire to complete household tasks as quickly as possible.

I don't know about you, but I find this alarming. And it gets worse. In a final experiment, the scientists went far afield, testing whether our fast-food culture might actually determine whether or not we save for the future. As they explain it, saving requires delaying gratification, denying one's needs today for a bigger payoff later on. Failure to save is impatience writ large—over the lifespan. Like the ethos of fast food, lack of financial planning is all about immediate gratification.

And the experiment's findings were unambiguous. As reported on-line last week in the journal Psychological Science (http://pss.sagepub.com/content/early/2010/03/19/0956797610366090.full.pdf+html), the volunteers primed with fast-food logos were much more likely to accept a smaller amount of money now rather than wait for a larger payment in a week. In short, mere exposure to fast-food symbols made people impatient in a way that could threaten their future economic security.

It's hard not to savor the irony in these findings. Fast food was invented to save us time—to get us away from the drudgery of the kitchen so we could enjoy more leisure time. But today, the mere idea of fast food automatically triggers our unconscious sense of haste and urgency and pressure—feelings that shape not only the way we eat, but nearly every aspect of the way we live our lives, including our leisure.

Wray Herbert also writes regularly for The Huffington Post, where this article first appeared.

The power of gratitude
March 18, 2010

Like most parents, I drilled my young kids on the importance of saying "thank you" to others. Nagged them, really. After all, words of gratitude are an important social convention, a way of letting others know you value and appreciate them and their support.
Plus, saying "thank you" is the right thing to do.

What I didn't teach them—because I didn't know it at the time—was how they themselves might benefit from saying "thank you." An emerging body of research is now showing that genuine expressions of gratitude can be tonic not just for the recipient, but for those who are saying "thank you" as well. Indeed, being grateful—and saying so—can change the very way we think about our closest relationships.

One scientist who has been rigorously deconstructing gratitude is Nathaniel Lambert of Florida State University. In a recent study, he and several colleagues decided to explore whether the simple act of expressing thankfulness might be linked to a deeper sense of commitment and responsibility toward someone else. To find out, the psychologists recruited a large group of young men and women and gathered information on their most intimate relationships, including the frequency and manner in which they expressed their gratitude toward their partner. They also questioned them about the strength of their relationship, focusing especially on feelings of responsibility for their partner's happiness and welfare.

They wanted to see if there was any connection between thankfulness and the quality of the partnership. And there was, clearly. Those who were more expressive of their gratitude toward their partner saw their commitment as deeper and the relationship as more mutually supportive. The researchers also measured these perceptions six weeks later, to see if gratitude was linked to an increase in relationship quality over time. And, again, it was.

These findings are intriguing—but limited. They don't say anything about whether expressing thanks actually leads to improved feelings about a relationship. So Lambert and his colleagues decided to run another experiment to sort this out. In this study, they actually manipulated gratitude. They had a group of volunteers spend three weeks deliberately increasing their verbal or written expressions of thanks toward a close friend. They were instructed to "go the extra mile" in really demonstrating their feelings of gratitude. For comparison, other volunteers merely thought grateful thoughts—without expressing them—while still others focused on positive memories of time together. At the end of the three weeks, the researchers compared the volunteers' attitudes toward their relationships.

There was no doubt about cause and effect this time. As reported on-line in the journal Psychological Science, those who more frequently spoke or wrote their words of thanks saw their relationship as more mutual and cooperative as a result. Importantly, merely thinking about being grateful did not improve relationships. So words count.

What's going on here? The scientists believe that saying "thank you" sends a message not only to one's partner but to oneself as well. It changes our self-perceptions. The very act of saying "thank you" reinforces one's desire for a mutually supportive relationship and increases dependency, which triggers trust and in turn deepens a relationship. In this way, saying "thank you" initiates a spiral of kindness and appreciation in relationships.
And what's more, it's not complicated.

For more insights into the quirks of human nature, visit the "Full Frontal Psychology" blog at True/Slant (http://www.trueslant.com/wrayherbert/). Excerpts from "We're Only Human" appear regularly in the magazine Scientific American Mind.

Emotions by the roomful
March 17, 2010

I have a friend who sucks the air out of the room whenever he comes around. He is so blustery and self-absorbed that people don't interact with him; they capitulate. I also have friends who by their mere presence light up the room, raising the spirits of everyone gathered. I know people who cast a pall over the group and drag it down; others who have a calming effect on gatherings.

These are all caricatures, of course. Nobody can sway the emotions of an entire room, energizing or subduing or infuriating every member of the group. After all, each of us has his or her own emotional make-up, which is surely more powerful than the mere presence of another person. A roomful is not a human entity with collective emotions.

Or is it? It may be humbling to know, but new research suggests that there may be some truth to these caricatures. Each of us is autonomous, of course, with temperament and personality, but some people may have a powerful emotional presence that can indeed influence the feeling of an entire room.

That's the idea being explored by two business professors, Noah Eisenkraft of Penn and Hillary Anger Elfenbein of Washington University in St. Louis. The scientists wanted to explore this phenomenon with naturally occurring groups, so they recruited an entire class of first-year MBA students. These 239 students were randomly assigned to work groups, most made up of five students, which were diverse in nationality, gender, and work experience. The group members took all the same classes, worked on group projects, and even socialized frequently outside class. In other words, they spent a lot of time in the same room.

The idea was to track these group members' emotions—and emotional interactions—over an entire semester. So the scientists gave a personality test to start; then, after the groups had worked together for a month, they questioned each member about both positive and negative feelings they experienced toward each of the other group members—boredom, stress, anger, enthusiasm, and so forth. They also observed the networks that formed over the semester, to see if any one group member was becoming the emotional center of the group.

The results were mixed and intriguing. The students' upbeat emotions were largely accounted for by individual emotional make-up—but not entirely. The presence of others also shaped the students' feelings, with the most dominant group members having the most power to lift others' spirits.
But the big surprise came with negative emotions like sadness and anger. As reported on-line this week in the journal Psychological Science, downbeat emotions were shaped more by others than by individual temperament, and these effects were traceable to individuals with the most extraverted and disagreeable personalities. Importantly, the scientists ruled out emotional "contagion" as an explanation for the phenomenon: It's not simply that miserable people were dragging others down with them; something about them was affecting the entire room in the same way—and not in a good way.

We usually call these people "bad apples." But if we're not simply "catching" their bad vibes, what is happening? It's not entirely clear, the scientists say. It could be that people with an emotional "presence" express themselves differently—with more body language, for example—or they may convey dominance or warmth or creepiness in very subtle ways.

Excerpts from "We're Only Human" appear regularly in the magazine Scientific American Mind. Wray Herbert's book, On Second Thought, will be published by Crown in September.

A tool for predicting suicide?
March 15, 2010

Suicide is both disturbing and perplexing to survivors, in part because it is so unpredictable. People who are intent on killing themselves often conceal their thoughts—or outright deny them—so family and friends are left puzzling over warning signs they might have missed.

Even experienced clinical judgment often misses the mark. As a result, suicide experts have long hoped and searched for a clear behavioral marker of suicide risk. Now they may have found one. Harvard University scientists are reporting that a tool widely used for probing unconscious thoughts might be used to spot suicidal intent—even if the suicidal mind is in denial—and offer new hope for timely intervention to keep people alive.

Psychologist Matthew Nock (working with colleagues at both Harvard and nearby Massachusetts General Hospital) decided to adapt a decade-old test called the Implicit Association Test, or IAT, to plumb for warning signs of suicide. Specifically, he wanted to see if people who are suicidal might have stronger implicit associations between themselves and death—associations that might point toward self-destructive intentions. To find out, he tested 157 people seeking treatment in a psychiatric emergency room. The patients were all emotionally distressed, but only some were in the hospital because of attempted suicide. The scientists wanted to see if the IAT could distinguish those who had attempted suicide from those who had not.

The IAT is a reaction-time test.
During their hospital stay, often while sitting in bed, the patients very rapidly classified words on a computer screen, words like: <em>lifeless</em>, <em>thrive</em>, <em>myself</em>, <em>deceased</em>, <em>they</em>, <em>theirs</em>, <em>survive</em>, <em>breathing</em>. And so forth. The idea is to see how rapidly patients connect identity-related words to either life or death words. And the findings were unambiguous. As reported on-line this week in the journal <em>Psychological Science</em>, patients who had attempted suicide prior to admission had much stronger unconscious associations between self and death.<br /><br />But the study didn’t end there. Nock followed all the patients for six months to see how they fared, and he found that the patients with a powerful self-death association in the hospital had a six-fold increase in later suicide attempts. Six-fold is a dramatic difference, and what’s more, the unconscious associations were a much better suicide predictor than depression, previous suicide attempts, or the intuition of the attending clinician.<br /><br />What about the patients’ own predictions? Fourteen of the emergency patients attempted suicide within six months of leaving the hospital. Their self-evaluations <em>were</em> an indicator of their future risk, but an imperfect indicator. The IAT results were a better prognosticator even than the patients’ self-evaluations. This suggests that unconscious thoughts might be a useful detector and predictor of intentions that patients are reluctant to discuss—or intentions of which they themselves are unaware.<br /><br />Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-118098317576798106?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com2tag:blogger.com,1999:blog-27543730.post-69544338736380710872010-03-11T11:49:00.004-05:002010-03-11T14:25:42.988-05:00A willingness to wonderWillingness is a core concept of addiction recovery programs, and a paradoxical one. Twelve-step programs emphasize that individual addicts cannot will themselves into recovery and healthy sobriety, indeed that the ego and self-reliance are often a root cause of their problem. Yet recovering addicts must <em>be </em>willing. That is, they must be open to the possibility that the group and its principles are powerful enough to trump a compulsive disease.<br /><br />It’s a tricky concept for many, and must be taken on faith. But now there may be a little bit of science to back it up, too. Psychologist Ibrahim Senay of the University of Illinois at Urbana-Champaign figured out an intriguing way to create a laboratory version of both willfulness and willingness—and to explore possible connections to intention, motivation, and goal-directed actions. In short, some key traits needed for long-term abstinence.<br /><br />He did this by exploring self-talk. Self-talk is just what it sounds like—that voice in your head that articulates what you’re thinking, spelling out your options and intentions and hopes and fears and so forth. It’s the ongoing conversation with oneself. 
Senay thought that the form and texture of self-talk—right down to the sentence structure—might be important in shaping plans and actions. What’s more, self-talk might be a tool for exerting the will—or being willing.<br /><br />Here’s how he tested this notion. He had a group of volunteers work on a series of anagrams—changing the word <em>sauce</em> to <em>cause</em>, for example, or <em>when</em> to <em>hewn</em>. But before starting this task, half the volunteers were told to contemplate whether they would work on anagrams, while the others thought about the fact that they would be doing anagrams. It’s a subtle difference, but the former were basically putting their mind into wondering mode, while the latter were asserting themselves. It’s the difference between “I will do this” and “Will I do this?”<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/question-792037.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 112px; FLOAT: left; HEIGHT: 118px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/question-792035.jpg" /></a>The wondering minds completed significantly more anagrams. In other words, they were much more goal-directed than were those who declared their intentions to themselves.<br /><br />This is counterintuitive. Why would asserting one’s intentions to do something undermine that goal? Senay wanted to double-check these surprising results, which he did in this way: He recruited volunteers on the pretense that they were needed for a handwriting study. Some wrote the words <em>I will</em> over and over, while others wrote <em>Will I</em>. The idea was that self-posed questions about the future are fundamentally different from self-declarations. Questions should inspire thoughts about autonomy and motivation to pursue a goal—and in the end make the questioners more successful.<br /><br />To test this, Senay again had the volunteers work on an anagram task. And again, the willful volunteers performed more poorly than the questioners. He ran another version of this experiment, but he changed the goal to exercise rather than anagrams, and got the same result: Those primed with the words <em>Will I</em> had greater intentions to exercise regularly than did those primed with <em>I will</em>. What’s more, when the volunteers were asked why they had decided to exercise more, the questioners said things like “Because I want to take more responsibility for my own health.” Those primed with <em>I will</em> offered explanations like “Because I would feel guilty or ashamed of myself if I did not.”<br /><br />This last finding is crucial. It indicates that those with questioning minds were more intrinsically motivated to change. Those asserting their will lacked this internal motivation, which explains their weak commitment to future change. Put in terms of addiction recovery, those who were asserting their willpower were closing their minds, narrowing their view of the future. Those who were questioning and wondering were open-minded—and therefore willing to see new possibilities for the future.<br /><br />Selections from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. For more on overcoming addiction, visit <a href="http://www.psychologicalscience.org/onlyhuman/2010/02/psychology-of-recovery.cfm">"The Science of Recovery." 
</a>Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-6954433873638071087?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com0tag:blogger.com,1999:blog-27543730.post-38042179854212774762010-03-09T10:32:00.006-05:002010-03-09T12:08:05.436-05:00Vieux, en bonne santé . . . et bilingueIn French, that means <em>old, healthy . . . and bilingual</em>. I could just as well have used Google Translate to put that phrase into Finnish or Spanish or Chinese. The fact is, I don’t speak any of those languages fluently—any language except English really. Which puts me in good company: When Senator Barack Obama was campaigning for the presidency back in 2008, he told a crowd in Dayton, Ohio: “I don’t speak a foreign language. It’s embarrassing.”<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/tourist-770998.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 150px; FLOAT: left; HEIGHT: 113px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/tourist-770997.jpg" /></a>It is embarrassing. But worse than that, it may be unhealthy. New research suggests that bilingualism may convey previously unrecognized cognitive benefits—benefits that appear early and last a lifetime. These benefits may go well beyond language itself. Indeed, speaking two languages may shape the mind and brain in fundamental ways, creating mental reserves that help stave off the ravages of dementia.<br /><br />That’s the surprising possibility emerging from an ongoing research project at York University in Ontario. Cognitive psychologist Ellen Bialystok has for years been testing and comparing people who speak one or two languages, including children, adults and the elderly. Her overall conclusion is that bilingualism enhances the brain’s “executive control.” That’s a catchall term that encompasses the ability to pay attention, to ignore distractions, to hold information in short-term memory, to do more than one task at a time. It’s mental discipline, and it typically emerges in childhood and declines in old age.<br /><br />Bialystok has tested this in many different ways. Here’s one example: She had 4- and 5-year-old kids do a card sorting task. The cards show circles or triangles, some red and some blue, and the kids are told to sort the deck by color. Later they are told to switch—and sort the same cards by shape. Young children usually have great difficulty making this mental switch, but when Bialystok ran the experiment, bilingual kids were much better with the rule change. This indicates heightened executive control.<br /><br />This advantage appears to persist into adulthood. Bialystok (working with various colleagues) compared bilinguals and monolinguals on various lab tests that require mental discipline. The Stroop test is one such test. That’s the one where you have the word R-E-D printed in blue, and you have to rapidly name the ink color rather than read the word. It’s hard—and again the bilinguals consistently did better than subjects who only spoke one language. Or looked at another way, monolinguals had a cognitive deficit—and this deficit appears to increase as adults get older.<br /><br />Right into old age. 
Bialystok wanted to explore whether enhanced executive control actually has a protective effect in mental aging—specifically, whether bilingualism contributes to the “cognitive reserve” that comes from stimulating social, mental and physical activity. She studied a large group of men and women with dementia, and compared the onset of their first symptoms. The age of onset for dementia was a full four years later in bilinguals than in patients who had lived their lives speaking just one language. That’s a whopping difference. Delaying dementia four years is more than any known drug can do, and could represent a huge savings in health care costs.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/french-756000.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 200px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/french-755998.jpg" /></a>Is there any downside to bilingualism? Yes. As reported on-line in the journal <em>Current Directions in Psychological Science</em>, Bialystok’s studies also found that bilinguals have less linguistic proficiency in either of their two languages than do those who only speak that language. They have somewhat smaller vocabularies, for example, and aren’t as rapid at retrieving word meanings. But compared to the dramatic cognitive advantages of learning a second language, that seems a small price to pay. Plus you can travel to Paris without the embarrassment of constantly thumbing through your dog-eared <em>French for Dummies</em>.<br /><br />Wray Herbert’s “We’re Only Human” column appears regularly in the magazine <em>Scientific American Mind</em>. His book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-3804217985421277476?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com2tag:blogger.com,1999:blog-27543730.post-17499840282843587482010-03-05T15:54:00.006-05:002010-03-05T16:37:19.490-05:00Casting light on cheating and greedLouis Brandeis was already one of America’s most famous lawyers when Woodrow Wilson appointed him to the Supreme Court in 1916. He was a tireless and prescient critic of big investment banks—including bankers’ excessive bonuses—an argument he spelled out in his influential book of essays, <em>Other People’s Money and How the Bankers Use It</em>. His solution for the problem of concentrated financial power was unfettered public scrutiny, a belief he summarized in his famous statement: “Sunlight is said to be the best of disinfectants; electric light the most efficient policeman.”<br /><br />Justice Brandeis was an intuitive psychologist. When he said that the “broad light of day” would purify men’s actions, he was anticipating a field of research that is just now beginning to illuminate the intricate interplay of the mind, the body, and morality. 
Light, it appears, does much more than distinguish day from night; it takes away our illusion of anonymity and, in doing so, literally keeps us honest.<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/dark-735552.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 143px; FLOAT: left; HEIGHT: 200px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/dark-735531.jpg" /></a>This seems obvious on one level. Streetlights were most likely invented to deter crime, and big power outages are almost inevitably followed by looting. But darkness in that sense is actual cover for criminals, like a mask. The new research suggests that even non-criminals may be influenced by the metaphorical meaning of light and darkness, becoming more dishonest and self-centered as light diminishes.<br /><br />Here’s the science. Three psychologists—Chen-Bo Zhong and Vanessa Bohns of the University of Toronto and Francesca Gino of the University of North Carolina—wanted to explore the idea that metaphorical darkness leads to illusory anonymity, and in turn to moral transgression. In one experiment, they had a group of volunteers perform a complicated mathematical task—so complicated that it was impossible to complete in the time allotted. When they ran out of time, the volunteers were told to pay themselves only for the work they were able to finish. This was all done anonymously, although secretly the scientists were monitoring the volunteers’ actions.<br /><br />Half the volunteers did this sham exercise in a brightly lit room, with twelve overhead light bulbs, while the others did it in a room dimly lit by just four bulbs. The idea was to see if those in the darker room were more likely to cheat than those working in bright light. And they were, indisputably. They not only lied about their performance on the difficult task, they also paid themselves more cash for work they had failed to do. In short, they lied, cheated and stole money.<br /><br />It’s important to note that, while one room was darker than the other, neither room was actually dark. That is, the lack of illumination was not enabling the cheating; and indeed, the task was (ostensibly) anonymous anyway, so there was nothing really to hide. It’s not like they were tip-toeing out of the room with cash. Yet the dim lighting gave volunteers the psychological license to behave unethically.<br /><br />These findings were bizarre enough that the scientists wanted to double-check them. So in a second experiment, instead of dimming the room, they had only some of the volunteers wear sunglasses to dim their view. Then all the volunteers participated in a laboratory exercise called the dictator’s game. The dictator’s game is a test of fairness and greed; one volunteer (the initiator) has a given pot of cash, and is allowed to give away all, some or none of it to another person, who has no say in the matter and simply receives whatever is offered. In this experiment, all the volunteers were initiators; the scientists simply wanted to see how generous or stingy they were, depending on whether they were wearing sunglasses or not.<br /><br />Shades corrupt. As reported on-line in the journal <em>Psychological Science</em>, those with a slightly darkened view of the world gave away considerably less money—less than what’s fair and less than the volunteers not wearing shades. Darkness gave them the sensation that they were more concealed, and that in turn made them greedier people.<br /><br />Think about this for a minute. 
The researchers were not manipulating light and darkness so that some actually had more cover. The volunteers wearing the shades were the ones perceiving a darker world, and that perception was enough to license their transgressions. What’s going on here? Well, the researchers believe that dimming the lights or wearing sunglasses is a kind of egocentric mental “anchor”: because these volunteers see the world as somewhat darkened, they assume that others have an obscured view of them as well. They act not as if they have sunglasses on, but as if there has been a widespread power outage that has darkened everyone's world.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/wallstreet-735580.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 150px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/wallstreet-735574.jpg" /></a><br />Kids are notoriously egocentric in this way. They’ll close their eyes when they play hide-and-seek, thinking that they can’t be seen if they themselves can’t see. Apparently, adults don’t outgrow this egocentrism entirely. But what’s cute in a childhood game of hide-and-seek isn’t nearly so cute in grownup games with other people’s money.<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-1749984028284358748?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com2tag:blogger.com,1999:blog-27543730.post-37135264908820650452010-03-03T14:51:00.004-05:002010-03-03T15:02:45.599-05:00An angry voter is an ignorant voterImagine this scenario: You lost your job at the lumber yard early in 2009. Nobody is building new homes these days, and this slowdown has trickled down to suppliers all over the country. What’s worse, you’re dipping into savings just to make your own mortgage payments—on a house that has lost a big chunk of its value. In short, your American dream is in shambles.<br /><br />It’s a dreary but all too familiar scenario. Now imagine further how you feel about this. Is worry your primary emotion? Are you anxious about your wife’s health, and the possibility of an expensive hospitalization? Are you fearful about depleting your kids’ college funds? Where will you all live if you lose the house?<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/foreclose-765609.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 133px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/foreclose-765569.jpg" /></a>Or are you mostly angry? After all, this situation is totally unfair, given how hard you have worked all these years. Who’s to blame? Those fat cat bankers are still drawing their obscene bonuses, while working guys like you are barely eking out a living. Someone’s got to pay for this mess.<br /><br />Both fear and anger are understandable under these dire circumstances. But what are you going to do? 
Well, there’s an election coming up later this year. Here’s your chance to at least take some action, to raise your citizen’s voice and be heard. How will you exercise this civic responsibility when you go to the polls in November?<br /><br />We like to think that our democracy is rational, that as voters we educate ourselves on the issues and choose the candidate who best represents our views. Emotions, while natural, would seem to undermine this civic ideal, leading to cynicism and confused thinking and wrongheaded choices. But is it so simple? New research suggests that emotions can indeed skew voting behavior—but in surprising and nuanced ways.<br /><br />University of Massachusetts scientists Michael Parker and Linda Isbell rigged an election to explore the interplay of specific emotions and voting. Not a real election, of course, but a hypothetical Democratic primary election for the Massachusetts state senate. They created two candidates, John Clarkson and Tom Richards, each with detailed positions on a dozen important public issues. The candidates’ positions were spelled out on their Web sites, along with general information on each aspiring senator.<br /><br />The researchers recruited a large number of volunteers, all Massachusetts residents, to act as voters in this election. They were directed to the Web sites, and told to peruse as much information as they liked, in any manner they wanted—and to consider whatever they needed to make an informed voting decision. Clarkson and Richards actually agreed on most of the issues, though they stated their views differently. The general information was vague, but made clear that each candidate was well qualified.<br /><br />But here’s the rub: Before the voters started researching the issues and candidates, some were primed for fear and others for anger—much like the scenarios above. The idea was to see if these two basic human emotions shaped civic behavior in different ways. That is, did angry citizens size up candidates one way, and anxious voters a different way? And did these thinking styles translate into different behavior at the polls?<br /><br />The answer is a resounding yea. As reported on-line in the journal <em>Psychological Science</em>, the worried voters were much more deliberate and organized in their thinking than were the angry voters, spending significantly more time exploring the candidates’ Web sites. What’s more, the anxious citizens actually voted for the candidates whose positions they agreed with; in other words, democracy worked the way it’s supposed to work. This may seem obvious, but it wasn’t so for the angry citizens, for whom there was no apparent connection among issues and positions and ballot-box choices.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/tea2-765631.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 134px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/tea2-765618.jpg" /></a><br />So what was influencing the angry voters, if not the issues of the day and the candidates’ promises? Apparently it was the vague general information that guided their choices. In the real world, that means things like basic name recognition, party loyalty, and simplistic political labels. The angry voters didn’t take the time to really concentrate on the issues and positions, and instead let these skimpy generalities guide them. 
It appears their anger was switching their brain from deliberate mode to automatic mode—to gut feelings more than rational analysis. The worried citizens had too much at stake to trust their gut.<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind.</em> Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-3713526490882065045?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com1tag:blogger.com,1999:blog-27543730.post-70152518770252352132010-02-24T15:19:00.006-05:002010-02-26T09:49:33.867-05:00The Mind of a MisanthropeI become misanthropic every February. I avoid social gatherings, and really just want to hole up at home. I always assumed it was the dark evenings and slippery sidewalks and general misery of venturing outside. But truth be told, I don’t want guests visiting me either. Not until the crocuses come through.<br /><br />Or not until cold and flu season is over, more accurately. New research suggests that my anti-social ways may have little to do with friendliness or lack of it. Indeed, my attitudes and actions may be self-protective, part of an ancient, hard-wired psychological immune system, shaped over eons to help humans steer clear of germs.<br /><br />Think of it from an evolutionary point of view. Group living conveyed many survival benefits for early humans, but it also carried risks—most notably the spread of harmful disease. The body’s immune system is very good at fighting off germs, but it’s a costly system to operate. In the parlance of immunology, people are vectors, and another way to avoid sickness is simply to avoid disease carriers in the first place. In this sense, extraversion is costly and introversion is adaptive—especially during flu season.<br /><br />That’s the theory at least, which psychologist Chad Mortensen of Arizona State University has been investigating in his lab. He and his colleagues wanted to see if exposure to germs—or at least the idea of germs and illness—would change people’s basic perceptions about themselves as social beings. To test this, they showed a group of volunteers a slide show about germs and contagious disease, while control subjects watched a slide show about architecture. Afterward, all the volunteers completed a personality inventory, which includes measures of extraversion, agreeableness and openness to experience. Finally, the researchers assessed each volunteer’s feelings of vulnerability to disease— basically, how much they fret about getting sick.<br /><br />They anticipated that the volunteers with disease on their minds would see themselves as more reclusive. And that’s just what they found. The infection-minded volunteers saw themselves as less gregarious than did controls, and the hypochondriacs in the group also saw themselves as less open-minded about people and less cooperative. In other words, the more intense the volunteers’ worry about infection, the less they desired the company of others.<br /><br />That’s striking in itself. 
But attitudes and self-perceptions are an effective defense only if they change people’s actual behavior. So in a second experiment, the scientists came up with an ingenious way to measure actual avoidance. As before, they primed only some of the volunteers with worries about infection and illness. Then they exposed all the volunteers to pictures of faces, while measuring their arm movements. Very subtle pushing away is an indicator of social avoidance, as when we push away something undesirable; flexing similarly indicates acceptance. As expected and reported on-line in the journal <em>Psychological Science</em>, those primed to fret about germs were more avoidant; and the chronic hypochondriacs were the most avoidant by far.<br /><br />So that’s a pretty nifty defense mechanism. Or at least it was at one time. But these evolved tendencies are often blunt instruments, and this hard-wired bias against germs may go awry in the modern world. For example, sensitivity to disease threats can be indiscriminate, causing people to judge and avoid not only sick people but <a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/solitude-765527.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 173px; FLOAT: left; HEIGHT: 200px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/solitude-765525.jpg" /></a>also obese people and people with disabilities. And because people who are unfamiliar pose an especially potent threat of unknown diseases, the psychological immune system might also foster xenophobia, anti-gay attitudes, and right-wing authoritarianism. That’s a big price to pay, just to dodge a sore throat and sniffles.<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology” blog </a>at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-7015251877025235213?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com0tag:blogger.com,1999:blog-27543730.post-76796061946726835942010-02-18T15:21:00.004-05:002010-02-18T15:37:41.004-05:00Focusing on the Cinematic MindOur household is a rolling Alfred Hitchcock festival. We almost always have at least one of the celebrated director’s films on DVD, and over the years we have watched most of our favorites—<em>Suspicion, North by Northwest, The 39 Steps</em>—time and time again. It’s a tribute to the master’s skills and sensibility that his films have such enduring appeal, because many films from the same time period have a distinctly “old” feel to them. It’s not just the primitive cameras and films. There is something about the rhythm and texture of early cinema that has a very different “feel” from that of modern films. But it’s hard to put one’s finger on just what that something is. 
<a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/39.3-703400.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 142px; FLOAT: left; HEIGHT: 200px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/39.3-703387.jpg" /></a><br /><br />New research may help explain this elusive quality. Cognitive psychologist and film buff James Cutting of Cornell University decided to use the sophisticated tools of modern perception research to deconstruct 70 years of film, shot by shot. He measured the duration of every single shot in every scene of 150 of the most popular films released from 1935 to 2005. The films represented five major genres—action, adventure, animation, comedy and drama. Using a complex mathematical formula, Cutting translated these sequences of shot lengths into “waves” for each film.<br /><br />What Cutting was looking for were patterns of attention. Specifically, he was looking for a pattern called the 1/f fluctuation. The 1/f fluctuation is a concept from chaos theory, and it means a pattern of attention that occurs naturally in the human mind. Indeed, it’s a rhythm that appears throughout nature, in music, in engineering, economics, and elsewhere. In short, it’s a constant in the universe, though it’s often undetectable in the apparent chaos.<br /><br />Cutting found that modern films—those made after 1980—were much more likely than earlier films to approach this universal constant. That is, the sequences of shots selected by director, cinematographer and film editor have gradually merged over the years with the natural pattern of human attention. This explains the more natural feel of newer films—and the “old” feel of earlier ones. Modern movies may be more engrossing—we get “lost” in them more readily—because the universe’s natural rhythm is driving the mind.<br /><br />What does this mean? Cutting doesn’t believe that filmmakers have deliberately crafted their movies to match this pattern in nature. Instead, he believes the relatively young art form has gone through a kind of natural selection, as the edited rhythms of shot sequences were either successful or unsuccessful in producing more coherent and gripping films. The most engaging—and successful—films were subsequently imitated by other filmmakers, so that over time the industry as a whole evolved toward an imitation of this natural cognitive pattern.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/39-703411.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 125px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/39-703409.jpg" /></a><br />Over all, action movies are the genre that most closely approximates the 1/f pattern, followed by adventure, animation, comedy and drama. But as Cutting reports on-line in the journal <em>Psychological Science</em>, individual films from every genre have almost perfect 1/f rhythms. <em>The Perfect Storm</em>, released in 2000, is one of them, as is <em>Rebel Without a Cause</em>, though it was made in 1955. So too is <em>The 39 Steps</em>, Hitchcock’s masterpiece from way back in 1935.<br /><br />For more insights into the quirks of the human mind, visit the<a href="http://www.trueslant.com/wrayherbert/"> “Full Frontal Psychology” </a>blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. 
<br /><br />For more insights into the quirks of the human mind, visit the<a href="http://www.trueslant.com/wrayherbert/"> “Full Frontal Psychology” </a>blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-7679606194672683594?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com1tag:blogger.com,1999:blog-27543730.post-65327625117754893752010-02-17T15:51:00.004-05:002010-02-18T15:26:39.955-05:00A Salvo in the Calorie WarThe calorie war is heating up. It’s actually been simmering for some time, sparked by an alarming obesity rate among young Americans and related spikes in diabetes and other health problems. Nobody really disputes this sorry trend anymore, but there is a lot of disagreement over what to do about it. Public health advocates are clamoring for everything from warning labels on junk food to aggressive television marketing campaigns, even for outright prohibitions. Just last week, the Obama administration entered the fray, calling for a total ban on candy and soda in the nation’s schools. <a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/junkfood-735183.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 133px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/junkfood-735157.jpg" /></a><br /><br />Some see the past tobacco war as the proper model for this public health campaign. Indeed, one idea that has gotten traction recently is another “sin tax”—this one a fat and sugar tax—to dissuade people from eating junk food. Yale University psychologist and diet expert Kelly Brownell, writing in the prestigious <em>New England Journal of Medicine</em> last spring, called for a penny-per-ounce tax on soda sweetened with sugar or corn syrup. Only such a tax, he believes—and not lectures about nutrition and exercise—will make people eat more sensibly, and what’s more, the revenue could be used to promote healthier foods and habits.<br /><br />Not everyone agrees. Pricing strategies may well be a key to changing behavior, but others favor subsidies over punitive taxes, as a way to encourage people to eat fruits and vegetables and whole grains. The problem is that both these market approaches—taxes and subsidies—are founded on the belief that people make rational economic decisions: Make it cheaper and people will eat more of it, more expensive and people will eat less. But decades of behavioral economics research argues that consumers are not always so rational. And the two strategies have never been tested head to head, to see which one most effectively alters calorie consumption.<br /><br />Until now. Leonard Epstein, a clinical psychologist at the University at Buffalo, decided to explore the persuasiveness of sin taxes and subsidies in the laboratory, and he did so in an innovative way. He and his colleagues turned their lab into a simulated grocery store, “stocked” with images of everything from bananas and whole wheat bread to Dr. Pepper and nachos. A group of volunteers—all mothers—were given laboratory “money” to shop for a week’s groceries for the family. Each food item was priced the same as groceries at a real grocery nearby, and each food came with basic nutritional information.<br /><br />The mother-volunteers went shopping several times in the simulated grocery. 
First they shopped with the regular prices, but afterward the researchers imposed either taxes or subsidies on the foods. That is, they either raised the prices of unhealthy foods by 12.5%, and then by 25%; or they discounted the price of healthy foods comparably. Then they watched what the mothers purchased.<br /><br />It’s important to know how the scientists defined healthy and unhealthy foods. They used an index called calorie-for-nutrition value, or CFN, which simply means the number of calories one must eat to get the same nutritional payoff. So for example, nonfat cottage cheese has a very low CFN, because it is packed with nutrition but not with calories; chocolate chip cookies have a much higher CFN. The most sinful food in the store was commercial iced tea, with a whopping CFN equivalent to ten times that of chocolate chip cookies. The researchers also measured the energy density—basically calories—in every food.<br /><br />Then they crunched all the data together, and the findings were striking. To put it bluntly, taxes worked and subsidies did not. Specifically, taxing unhealthy foods reduced overall calorie intake, while cutting the proportion of fat and carbs and upping the proportion of protein in a typical week’s groceries. By contrast, subsidizing the prices of healthy food increased overall calorie consumption without changing the nutritional value at all. Why? As reported on-line last week in the journal <em>Psychological Science</em>, it appears that mothers took the money they saved on subsidized fruits and vegetables and treated the family to some chips and soda pop. Taxes had basically the opposite effect, shifting spending from junk to healthier choices.<a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/junkfood-735231.bmp"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 136px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/junkfood-735209.bmp" /></a><br /><br />The scientists conclude that subsidizing broccoli and yogurt—as appealing as that idea might be to some—is unlikely to bring about the massive weight loss the nation now requires.
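To see how the CFN index and the two pricing levers fit together, here is a small worked sketch. The foods, CFN values and base prices are invented for illustration (only the iced-tea-to-cookie ratio and the 12.5% and 25% adjustment levels come from the study as described above).

```python
# A toy version of the simulated grocery (invented CFN values and prices).
# CFN = calories you must consume for a fixed unit of nutrition, so a low
# CFN marks a healthy food and a high CFN an unhealthy one.
store = {
    # item                      (CFN, base price in $)
    "nonfat cottage cheese":    (80, 2.50),
    "bananas":                  (90, 0.60),
    "chocolate chip cookies":   (700, 3.00),
    "sweetened iced tea":       (7000, 1.80),  # ten times the cookies' CFN
}

CFN_CUTOFF = 500  # hypothetical line between healthy and unhealthy

def reprice(rate, tax=True):
    """Tax unhealthy foods, or subsidize healthy ones, by the given rate."""
    adjusted = {}
    for item, (cfn, price) in store.items():
        if tax and cfn > CFN_CUTOFF:
            price *= 1 + rate    # sin tax at 12.5% or 25%
        elif not tax and cfn <= CFN_CUTOFF:
            price *= 1 - rate    # comparable discount on healthy foods
        adjusted[item] = round(price, 2)
    return adjusted

print(reprice(0.125))            # shelf prices under the milder sin tax
print(reprice(0.25, tax=False))  # shelf prices under the steeper subsidy
```

In the experiment, of course, only the tax arm actually improved the nutritional profile of the carts; the subsidy’s savings leaked right back into junk food.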
<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology” blog </a>at <em>True/Slant</em>. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-6532762511775489375?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com0tag:blogger.com,1999:blog-27543730.post-65138573947651940542010-02-16T12:22:00.026-05:002010-04-12T15:01:09.405-04:00The Science of Recovery<a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/serenity-733630.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 166px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/serenity-733604.jpg" /></a><br /><div>Over the past few years, I have written many short essays on new findings in psychological science. Most of these have appeared in this blog, "We're Only Human," but many others have been published in <em>Newsweek.com</em> and, more recently, in the "Full Frontal Psychology" blog at <em>True/Slant</em>. At this point, I have written enough that I am beginning to identify clusters of essays all focusing on a particular topic, and I thought it might be useful to organize these topical essays in a way that's more accessible to readers.<br /><br />One such topic is the science of recovery. There have been volumes written on the science of alcoholism and other addictions, but surprisingly little on the behavioral and brain science underlying recovery from addiction and relapse prevention. Many recovering alcoholics and addicts believe it is unimportant to understand the why and how of the sober mind, indeed that science cannot fathom the spiritual aspects of 12-step programs. No argument there, but many others may be curious about what science has to say about this program and its principles. For those readers, I have compiled an annotated listing of essays on this subject. Some of these essays address specific steps and principles of recovery--like powerlessness and pride and moral inventory; others deal with what might be called the folk wisdom of recovery. It's a work in progress, and will continue to grow as new science emerges. I also invite reader comments and suggestions of related reading, with the goal being the most thorough resource available on the psychology of sobriety.<br /><br /><br /><a href="http://trueslant.com/wrayherbert/2010/01/21/the-future-is-lookin-sweet/">"The future is lookin' sweet"</a> The HALT principle, specifically the H<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2010/01/science-of-prayer.cfm">"The Science of Prayer"</a> The destructiveness of resentment, and a strategy for defusing it<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2009/12/perils-of-willpower.cfm">"The Perils of Willpower"</a> The counter-intuitive idea that willpower is a character flaw<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2009/07/i-am-lovable-person-not.cfm">"I am a lovable person." 
"Not"</a> On the harmful message of the self-esteem movement<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2009/03/hey-youre-wearing-me-out.cfm">"Hey, you're wearing me out!"</a> The power and peril of the group<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2009/02/try-little-powerlessness.cfm">"Try a Little Powerlessness"</a> The first step to recovery<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2009/01/paradox-of-temptation.cfm">"The Paradox of Temptation"</a> Relapse prevention and "forbidden fruit"<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2008/10/recipe-for-motivation.cfm">"A Recipe for Motivation"</a> The KISS principle: Keep it simple, stupid<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2007/07/sudoku-in-saloon-i-was-recently.cfm">"Sudoku in the Saloon"</a> Alcohol and aggression<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2008/06/neurons-of-recovery.cfm">"Neurons of Recovery"</a> Honesty, authenticity, moral inventory<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2007/06/two-face-of-pride.cfm">"The Two Faces of Pride"</a> Healthy pride, and perilous pride<br /><br /><a href="http://www.newsweek.com/id/94590">"Destined to Cheat?"</a> Attitudes, beliefs and cheating<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2007/02/pumping-emotional-iron.cfm">"Pumping Emotional Iron"</a> Overtaxing the mind's powers<br /><br /><a href="http://newsweek.com/id/41192">"Who Says Quitters Never Win?"</a> When to throw in the towel on moderation<br /><br /><a href="http://www.newsweek.com/id/35214">"Oops, I did it again"</a> Arrogance and mistakes<br /><br /><a href="http://www.trueslant.com/wrayherbert/2009/08/12/why-does-self-reliance-make-you-sick/">"Why Does Self-Reliance Make You Sick?"</a> The (fatal) risks of social isolation<br /><br /><a href="http://www.newsweek.com/id/37367">"The Empathy Gap"</a> Why we're so bad at predicting cravings<br /><br /><a href="http://www.newsweek.com/id/197006">"Talking the Talk"</a> The value and danger of public declarations<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/2010/03/willingness-to-wonder.cfm">"A Willingness to Wonder"</a> Willpower vs willingness</div><div></div><div></div><div></div><div></div><div><a href="http://www.psychologicalscience.org/onlyhuman/2010/03/power-of-gratitude.cfm">"The Power of Gratitude"</a> The #1 AA topic</div><div></div><div></div><div></div><div></div><div><a href="http://www.psychologicalscience.org/onlyhuman/2010/03/emotions-by-roomful.cfm">"Emotions by the Roomful" </a>The power of the room</div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div><a href="http://trueslant.com/wrayherbert/2010/04/05/im-sorry-ill-change-i-promise/">"I'm sorry. I'll change. I promise." </a>The 9th Step: Trust violation, amends, and foregiveness</div><div></div><div><a href="http://www.psychologicalscience.org/onlyhuman/2010/04/knockoff-psychology-i-know-im-faking-it.cfm">Knockoff psychology: I know I'm faking it </a>Authenticity and phoniness</div><div></div><div></div><div><br /><br /> </div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div></div><div>So let's get the discussion going. This blog posting is free for the taking, as is any of the essays in "We're Only Human." 
Journalists, bloggers, website editors--indeed anyone with an interest in this topic--are encouraged to link to this post or to reproduce it, either electronically or in print. Please link back to The Science of Recovery so we can grow this resource and develop a network of interested readers.</div><div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-6513857394765194054?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com1tag:blogger.com,1999:blog-27543730.post-54561133902485537532010-02-02T14:34:00.003-05:002010-02-02T15:57:11.663-05:00The "Super Uncles" of SamoaMale homosexuality doesn’t make complete sense from an evolutionary point of view. It appears that the trait is heritable, but since homosexual men are much less likely to produce offspring than heterosexual men, shouldn’t the genes for this trait have been extinguished long ago? What value could this sexual orientation have, that it has persisted for eons even without any discernible reproductive advantage?<br /><br />One possible explanation is what evolutionary psychologists call the “kin selection hypothesis.” What that means is that homosexuality may convey an <em>indirect</em> benefit by enhancing the survival prospects of close relatives. Specifically, the theory holds that homosexual men might enhance their own prospects by being “helpers in the nest.” By acting altruistically toward nieces and nephews, homosexual men—bachelor uncles in effect—would perpetuate the family genes, including their own.<br /><br />Two evolutionary psychologists have been testing this idea for the past several years on the Pacific island of Samoa. Paul Vasey and Doug VanderLaan of the University of Lethbridge in Canada chose Samoa because male homosexuals there—called <em>fa’afafine</em>—are widely recognized and accepted as a distinct gender category, neither man nor woman. The <em>fa’afafine</em> tend to be effeminate, and to be exclusively homosexual. This clear demarcation makes it easier to identify a sample for study.<br /><br />The researchers have shown in earlier work that the <em>fa’afafine</em> behave much more altruistically toward their nieces and nephews than do either Samoan women or heterosexual men. They babysit a lot, tutor the kids in art and music, and help out financially—paying for medical care and education and so forth. That’s interesting in itself, but it’s unclear just why they behave this way. What’s going on cognitively that supports such avuncular acts? In their most recent study, the scientists set out to unravel the psychology of the <em>fa’afafine</em>, to see if their altruism is targeted specifically at kin rather than kids in general.<br /><br />They recruited a large sample of <em>fa’afafine</em>, and comparable samples of women and heterosexual men. They gave them all a series of questionnaires, measuring their willingness to help their nieces and nephews in various ways—caretaking, gifts, teaching—and also their willingness to do these things for other, unrelated kids. The findings, reported on-line this week in the journal <em>Psychological Science</em>, lend strong support to the kin selection idea. Compared to Samoan women and heterosexual men, the <em>fa’afafine</em> showed a much weaker link between their avuncular behavior and their altruism toward kids generally. 
This cognitive disconnect, the scientists argue, allows the <em>fa’afafine</em> to allocate their resources more efficiently and precisely to their kin—and thus enhance their own evolutionary prospects.<br /><br />But these aren’t your garden-variety uncles. From an evolutionary perspective, you can’t make up for not having any offspring just by giving a toy to your nephew, or tossing a football with your niece once in a while. Indeed, to compensate for being childless, each <em>fa’afafine </em>would have to somehow support the survival of two additional nieces or nephews who would otherwise not have existed. The arithmetic here is kin selection at work: an uncle shares only about a quarter of his genes with a niece or nephew, versus half with a child of his own, so it takes two extra surviving nieces or nephews to match each forgone child. In short, the <em>fa’afafine </em>must be “super uncles” to earn their evolutionary keep.<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/samoa2-729419.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 200px; FLOAT: left; HEIGHT: 200px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/samoa2-729417.jpg" /></a><br />Do these findings have any meaning outside of Samoa? Yes and no. Samoan culture is very different from most Western cultures: it is very localized, and centered on tight-knit extended families, whereas Western societies tend to be highly individualistic and homophobic. Families are also much more geographically dispersed in Western cultures, diminishing the role that bachelor uncles can play in the extended family, even if they choose to. But in this sense, the researchers say, Samoa’s communitarian culture may be more—not less—representative of the environment in which male homosexuality evolved eons ago. Seen that way, it’s not the bachelor uncle who is poorly adapted to the world, but rather the modern Western world that has evolved into an unwelcoming place.<br /><br />For more insights into human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-5456113390248553753?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com0tag:blogger.com,1999:blog-27543730.post-71509867960025471092010-01-29T12:25:00.002-05:002010-01-29T12:35:04.438-05:00A warm glow in Bangkok<br />Say you are traveling in a foreign country, trying to find your way through the bustling capital city. Not Paris or London, some place a bit edgier. Bangkok. You don’t speak the language, and you’re a little frazzled. You walk into a café for some respite, and to your surprise you see a fellow you know from back home sitting at a corner table, sipping coffee. He’s hardly a friend, but you know him well enough to say hello. How do you feel? Well, after the initial surprise, you probably feel a warm glow as you walk up and greet him. You’re genuinely happy to see his familiar face in this strange place. 
He’s like an old friend.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/bangkok-759669.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 320px; FLOAT: left; HEIGHT: 240px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/bangkok-759667.jpg" /></a><br />Now, simply switch cities. You’re back at home and the same basic scenario takes place: You walk into a café, and there’s the same acquaintance, sitting at a corner table sipping coffee. How do you feel today? Well, if you’re like most people, you don’t feel much of anything. You recognize him, but no smile comes to your face. You might nod hello, but you’re really more focused on getting your morning coffee.<br /><br />Same face, similar scenario. So what’s going on here? Are you a couple of hypocrites? Well, don’t feel bad. First of all, he’s probably not feeling all that warmly toward you either. And what’s more, your own mixed feelings are probably beyond your control. That warm glow of recognition may be hard-wired into your neurons, but it’s also tightly entwined with other emotions, notably fears about personal peril and a yearning for safety.<br /><br />At least that’s the theory that a team of cognitive psychologists has recently been testing in the laboratory. According to Marieke de Vries of Radboud University Nijmegen, in the Netherlands, people naturally feel good when they see something recognizable and familiar. That’s because things that are familiar are—generally speaking—less risky. This is the same impulse that makes us buy the same soap or automobile over and over again: It’s worked in the past, so it’s likely a safe bet again today. With recognizable people, that positive feeling, that sense of comfort, often feels like a warm glow.<br /><br />But it may not be quite that straightforward. De Vries and her colleagues wondered: Wouldn’t the power of familiarity depend somewhat on the context? Specifically, isn’t it possible that mood might modify and shape the mind’s response to familiar and unfamiliar things? Is that what’s occurring when you feel a warm glow in Bangkok and a big yawn back home? They decided to explore this idea experimentally.<br /><br />Instead of using people’s faces, the scientists used abstract patterns of dots. Basically, what they did was familiarize volunteers with some patterns and not others; then they measured the volunteers’ responses when the familiar patterns appeared again later. But the researchers didn’t simply ask which patterns the volunteers liked and which they didn’t; in addition, they attached electrodes to the volunteers’ faces to detect subtle physiological signs of smiling. In other words, they measured the body’s visceral response to familiarity and novelty.<br /><br />But before doing this, they manipulated each volunteer’s mood. They asked some to think of sad events in their lives, and others of joyous events; and then they played mood-appropriate music to maintain the gloom or happiness. The idea was that mood “tunes” the mind toward safety concerns. That is, if our mood is good, we assume we must be in a safe place; if we’re feeling edgy or down, that must be because we’re threatened in some way. 
The researchers predicted that feeling blue (and therefore unsafe) would make familiarity an especially potent cue; feeling happy (and therefore safe) would make that cue much less significant.<br /><br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/starbucks-715675.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 320px; FLOAT: left; HEIGHT: 216px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/starbucks-715670.jpg" /></a>And that’s precisely what they found. As reported on-line in the journal <em>Psychological Science</em>, the volunteers who were melancholy smiled much more at the familiar patterns than did those who were upbeat. Think about that: Familiarity wasn’t all that important to people who were already feeling secure; they already had the safety of their local coffee shop. But people who were feeling uneasy and threatened experienced familiarity as very comforting—even when the familiar stimuli were nothing more than meaningless abstract patterns of dots. No wonder the face of an “old friend” can seem so welcoming in a Bangkok café.<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” also appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.<div class="blogger-post-footer"><img width='1' height='1' src='https://blogger.googleusercontent.com/tracker/27543730-7150986796002547109?l=www.psychologicalscience.org%2Fonlyhuman%2Findex.cfm' alt='' /></div>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com1tag:blogger.com,1999:blog-27543730.post-70850342454460715182010-01-27T14:40:00.003-05:002010-01-27T14:58:27.692-05:00Hyper-binding ain't for sissies<p><br />Imagine this hypothetical scenario: You’re at a cocktail party and the host introduces you to a stranger, whose name is Jeremy. It’s a crowded party, and as you chat with Jeremy, you’re also picking up snippets of another conversation nearby. Something about a big football game on Sunday. It doesn’t concern you, so you try to tune it out. You have a short but pleasant conversation with Jeremy, then go on to mingle with other guests.</p><p>What do you remember when you run into Jeremy the next day? Well, if you’re young, you will probably recognize Jeremy’s face and associate his face with his name. That’s normal social memory. But if you’re older, you may have a very different kind of association: You may inexplicably link Jeremy with the upcoming football game. That overheard chatter about football is an irrelevant piece of information—you don’t even like football much. But your mind has been distracted by it, and it has connected that unimportant tidbit with your newly forged memory of Jeremy. </p><p>This is just a theory, which scientists call “hyper-binding.” That’s really just a jargony way of saying that the elderly remember a lot of useless information by attaching it to important new learning. But according to new research from the University of Toronto, such seemingly haphazard learning might be a blessing in disguise for the elderly. 
Psychological scientists Karen Campbell, Lynn Hasher and Ruthann Thomas recently ran a laboratory version of the cocktail party conversation to see if the phenomenon is indeed unique to the elderly—and to explore its possible benefits. </p><p>The experiments were fairly technical, but here’s the gist: The researchers recruited two groups of volunteers, the first averaging about 19 years old and the second in their mid-60s. They showed all of them a series of pictures superimposed with irrelevant words. That’s like meeting Jeremy and hearing sports chatter at the same time. The volunteers were told to ignore the irrelevant words, and later on they were given a memory test for the pictures and words in different combinations. The goal was to compare the older and younger minds at work (a sketch of the logic of such a test appears at the end of this post). </p><p>The results were dramatic. As reported on-line this week in the journal <em>Psychological Science</em>, the older volunteers were clearly unable to ignore the distracting information even when they were instructed to. They stored away the irrelevant words by linking them tightly with their corresponding pictures in memory. What this suggests is that the elderly have weaker mental regulation and a broader “bandwidth,” taking in important and unimportant information indiscriminately. They store this new knowledge for later use, and what’s more, they do so without even being aware of it. </p><p>Wouldn’t such distractibility be a terrible hindrance? Wouldn’t it just clutter up the mind with a lot of junk information? Not so, say the Toronto scientists. In fact, it may well be an advantage for the elderly. Aging often brings with it some mild cognitive declines—and indeed the elderly were slower and less accurate in some parts of these memory experiments. But awareness of how events connect in everyday life—even seemingly irrelevant events—may play a critical role in certain kinds of reasoning and judgment. In this way, distractibility may surreptitiously bolster everyday problem-solving. </p><p>The fact is, we never really know for sure what information in our world is important or useless—not when we’re first encountering it. The elderly mind may not be as fleet as it once was, but by being unfiltered, it is perhaps making connections that aren’t literal or obvious, and can be insightful. It might even be the foundation of a novel kind of intuition that comes with aging, or perhaps even what we call wisdom.<a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/wisdom2-736590.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 150px; FLOAT: left; HEIGHT: 100px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/wisdom2-736580.jpg" /></a></p>
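<p>For readers who like a concrete model, here is a minimal sketch of the logic of such a re-pairing memory test. The hit counts are invented, and this is not the Toronto team’s code or materials; it just shows how hyper-binding would surface as a recognition advantage for intact picture-word pairs.</p><pre>
# A toy illustration (invented hit counts, not the study's data) of how
# "hyper-binding" shows up: pictures are probed with either their
# original superimposed word (intact pair) or a different studied word
# (re-paired), and the advantage for intact pairs is compared.
def intact_pair_advantage(intact_hits, repaired_hits, probes=30):
    """Difference in hit rate between intact and re-paired probes."""
    return intact_hits / probes - repaired_hits / probes

older = intact_pair_advantage(intact_hits=24, repaired_hits=15)
younger = intact_pair_advantage(intact_hits=17, repaired_hits=16)
print(f"older adults' intact-pair advantage:   {older:.2f}")   # 0.30
print(f"younger adults' intact-pair advantage: {younger:.2f}") # 0.03
</pre>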
<p>For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” also appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought</em>, will be published by Crown in September.</p>Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com1tag:blogger.com,1999:blog-27543730.post-28815019148276168252010-01-19T16:10:00.003-05:002010-01-26T10:51:28.045-05:00The Science of PrayerEveryone who is in any kind of serious relationship—with a partner, a child, a close friend—has been guilty of transgression at one time or another. That’s because we’re not perfect. We all commit hurtful acts, violate trust, and hope for forgiveness.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/prayer-754643.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 94px; FLOAT: left; HEIGHT: 126px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/prayer-754641.jpg" /></a><br />That’s simply a fact, and here’s another one: Nine out of 10 Americans say that they pray—at least on occasion. Florida State University psychologist Nathaniel Lambert put these two facts together and came up with an idea: Why not take all that prayer and direct it at the people who have wronged us? Is it possible that directed prayer might spark forgiveness in those doing the praying—and in the process preserve relationships?<br /><br />This is obviously not a new idea. Indeed it’s ancient, but Lambert and his colleagues decided to test it scientifically in two simple experiments. In the first, they had a group of men and women pray for their romantic partner. It was just a single prayer for their partner’s well-being, spoken privately in a quiet room. Others—the experimental controls—also went into a quiet room, where they simply described their partner, speaking into a tape recorder.<br /><br />Then they measured forgiveness. When someone hurts you, it’s human nature to want to strike back, retaliate—or to withdraw from the relationship. The scientists defined forgiveness as the diminishing of these initial negative feelings, and when they analyzed all the data, the results were clear: Those who had prayed for their partner harbored fewer vengeful thoughts and emotions: They were more ready to forgive and move on.<br /><br />This is remarkable, when you think that a single prayer made the difference. The researchers decided to run another test to double-check the findings. In this study, they had a group of men and women pray for a close friend every day for four weeks. Others simply reflected on the relationship, thinking positive thoughts but not praying for their friend’s well-being. The researchers also added another dimension: They used a scale to measure selfless concern for others—not any particular person but other people generally. They speculated that prayer would increase selfless concern, which in turn would boost forgiveness.<br /><br />And that’s just what they found. But why? How does this common spiritual practice exert its healing effects? The psychologists have an idea, which they described recently in the journal <em>Psychological Science</em>: Most of the time, couples profess and believe in shared goals, but when they hit a rough patch, they often switch to adversarial goals like retribution and resentment. These adversarial goals shift cognitive focus to the self, and it can be tough to shake that self-focus. Prayer appears to shift attention from the self back to others, which allows the resentments to fade.
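<br /><br />To make that causal chain concrete, here is a minimal sketch of the mediation idea, with made-up numbers that are not from the study: prayer lifts selfless concern, and forgiveness tracks that concern rather than prayer itself.<pre>
# A toy model (made-up numbers) of the proposed mediation: praying for a
# friend raises selfless concern, and forgiveness rises with concern.
def selfless_concern(prayed, baseline=3.0):
    # Hypothetical effect of four weeks of daily prayer for a friend.
    return baseline + (1.2 if prayed else 0.0)

def forgiveness(concern):
    # Forgiveness tracks selfless concern, not prayer directly.
    return 2.0 + 0.8 * concern

for prayed in (True, False):
    concern = selfless_concern(prayed)
    print(f"prayed={prayed}: concern={concern:.1f}, "
          f"forgiveness={forgiveness(concern):.1f}")
</pre>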
<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” also appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book, <em>On Second Thought: Outsmarting Your Mind’s Hard-Wired Habits</em>, will be published by Crown in September.Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com3tag:blogger.com,1999:blog-27543730.post-56496148865106675942010-01-08T12:20:00.005-05:002010-01-08T12:42:52.651-05:00Revisiting the Green MonsterWhen South Carolina Governor Mark Sanford was caught red-handed returning from a tryst with his Argentine mistress last June, he told the Associated Press that he had met his “soul mate.” His choice of words seemed to suggest that having a deep <a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/sanford-765026.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 136px; FLOAT: left; HEIGHT: 91px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/sanford-765024.jpg" /></a>emotional and spiritual connection with Maria Belen Chapur somehow made his sexual infidelity to his wife Jenny Sanford less tawdry.<br /><br />Jenny Sanford wasn’t buying it, and neither would most women. What the two-timing governor didn’t understand is that most women view emotional infidelity as worse, not better, than sexual betrayal. Publicly acknowledging a soul connection was probably the most insulting and hurtful thing he could have said to his wife of 20 years.<br /><br />The clueless governor is not alone. Research has documented that most men become much more jealous about sexual infidelity than they do about emotional infidelity. Women are the opposite, and this is true all over the world. Just why this is the case is not fully understood, although the prevailing theory is that the difference has evolutionary origins: Men learned over eons to be hyper-vigilant about sex because they can never be absolutely certain they are the father of a child, while women are much more concerned about having a partner who is committed to raising a family.<br /><br />New research now suggests an alternative explanation. The new studies do not question the fundamental gender difference regarding jealousy—indeed they add further support for that difference. But the new science suggests that the difference may be rooted more in personality—specifically in traits like self-reliance and insecurity.<br /><br />Pennsylvania State University scientists Kenneth Levy and Kristen Kelly doubted the evolutionary explanation because there is a conspicuous subset of men who are more like women. That is, they find emotional betrayal more distressing than sexual infidelity. Why would this be? The researchers suspected that it might have to do with trust and emotional attachment. Some people—men and women alike—are by nature more secure in their attachments to others, while others are more invested in their own autonomy and seemingly less in need of intimacy. Psychologists see this compulsive self-reliance as a defensive strategy—protection against deep-seated feelings of vulnerability. People high on this trait tend to be preoccupied with the sexual aspects of relationships rather than emotional intimacy.<br /><br />Levy and Kelly decided to explore a possible link between attachment style and jealousy style, and they did this by running a group of volunteers through some standard psychological tests. One questionnaire measured whether the volunteers were secure in their romantic relationships, or whether they instead were avoidant and noncommittal. A second questionnaire asked which they would find more distressing—knowing their partner was off having passionate sexual intercourse with someone else, or knowing that same partner had formed a deep emotional attachment with someone else.
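<br /><br />Here is a minimal sketch of the kind of cross-tabulation such a design yields. The responses below are invented for illustration; they are not the Penn State data.<pre>
# A toy cross-tabulation (invented responses, not the study's data):
# does attachment style predict which infidelity is more distressing?
from collections import Counter

# Each tuple: (attachment style, infidelity type judged more distressing)
responses = [
    ("avoidant", "sexual"), ("avoidant", "sexual"), ("avoidant", "emotional"),
    ("secure", "emotional"), ("secure", "emotional"), ("secure", "sexual"),
]

table = Counter(responses)
for style in ("avoidant", "secure"):
    print(f"{style}: {table[(style, 'sexual')]} chose sexual, "
          f"{table[(style, 'emotional')]} chose emotional")
</pre>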
<br /><br />They analyzed the data, and the conclusions were unmistakable. As the scientists reported on-line in the journal <em>Psychological Science </em>this week, avoidant types—those who prize their autonomy in relationships over commitment—were much more upset about sexual infidelity than emotional infidelity. And conversely, emotionally secure volunteers—including secure men—were much more likely to find emotional betrayal more upsetting.<a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/appalachian_trail-706139.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 320px; FLOAT: left; HEIGHT: 176px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/appalachian_trail-706115.jpg" /></a><br /><br />But here’s the interesting twist. Just as in all the earlier studies, Levy and Kelly found clear evidence of a gender difference in jealousy style. In other words, men are indeed preoccupied with sexual betrayal, and women the reverse, but not for the reasons we thought. Men fret about sexual betrayal because they are overly invested in the sexual side of their own relationships—and that superficiality is linked to their thin personal attachments. Not to put too fine a point on it, male jealousy is shaped by deep emotional insecurities. Jenny Sanford probably knew that already, and the governor’s soul mate is no doubt having her suspicions by now.<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert">“Full Frontal Psychology” blog </a>at True/Slant. Excerpts from “We’re Only Human” appear regularly in the magazine <em>Scientific American Mind</em>.Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com1tag:blogger.com,1999:blog-27543730.post-66339372565402289112009-12-23T11:37:00.001-05:002009-12-23T11:43:01.239-05:00Hearses, coffins and the meaning of life<br />In the darkly funny film classic <em>Harold and Maude</em>, Harold is a 19-year-old who is obsessed with death and dying. He repeatedly fakes his own suicide, drives around in a hearse, and attends strangers’ funerals as a pastime. At one of these funerals he meets Maude, a 79-year-old with the same morbid hobby, and in one of the most unlikely romances on film, the melancholy young man and the vivacious concentration camp survivor fall in love. Maude’s life ends with her suicide on her 80th birthday, but it’s not a depressing death. 
Indeed, the final scene shows Harold putting aside his morbid ways and embracing life anew.<br /><a href="http://www.psychologicalscience.org/onlyhuman/uploaded_images/harold-and-maude-729890.jpg"><img style="MARGIN: 0px 10px 10px 0px; WIDTH: 100px; FLOAT: left; HEIGHT: 143px; CURSOR: hand" border="0" alt="" src="http://www.psychologicalscience.org/onlyhuman/uploaded_images/harold-and-maude-729884.jpg" /></a><br /><em>Harold and Maude</em> is one of the cleverest films to wrestle with existential themes, but the interplay of morbidity and zest for life is a recurring theme in art and literature. And in real lives as well: People who have close brushes with death often report a sharpened appetite even for the ordinary stuff of daily life. Facing one’s mortality appears to give new meaning to being alive.<br /><br />But why would this be? It’s not obvious. One can imagine becoming negative and fearful, or even reckless, when faced with life’s fragility, but that doesn’t seem to happen. What cognitive crunching transforms morbidity into hope, mourning into joy? In other words, what was taking place in young Harold’s neurons when his soul mate’s death lifted his spirits out of the doldrums?<br /><br />Some new science offers one possible explanation for this cognitive phenomenon. A team of cognitive scientists at the University of Missouri, headed by Laura King, decided to look at the death-and-zest interplay in terms of mental heuristics. <em>Heuristic</em> is just scientific jargon for the ancient, deep-wired rules that shape many of our thoughts and actions, and the Missouri scientists were especially interested in two of these rules. The so-called scarcity heuristic states: If something is rare, it must be valuable. This explains, for example, why we prize gold, even though steel is much more useful. The flip side of the scarcity heuristic, often called the value heuristic, states: If we desire something very much, it must be scarce.<br /><br />Neither of these cognitive rules is necessarily correct or useful all the time, but they are both powerful—powerful enough to explain the common intertwining of morbidity and zest. Because scarcity and value are so tightly linked in the human mind, King and her colleagues reasoned, the mind might interpret death as a scarcity of life, which according to the theory should enhance life’s perceived value. They decided to test this idea in their laboratory.<br /><br />The experiments were fairly straightforward. In one, for example, the researchers had a large group of volunteers complete word-find puzzles—those grids of letters with words embedded in them. For some of the volunteers, the embedded words were death-related, like <em>tombstone</em> and <em>coffin</em>, while for others—the controls—they were pain-related, like <em>headache</em>. Then all the volunteers completed three widely used measures of life’s meaning and purpose. The findings were simple and unambiguous: Those with death on their mind found life more meaningful and, well, simply better. They valued life more when primed by funerals and hearses.<br /><br />So that’s the scarcity principle at work. But the scientists wanted to test their idea the other way around. That is, if it is indeed the heuristic mind finding meaning in death, then loving and embracing life should also enhance awareness of death’s constant presence. They tested this idea in an ingenious way. They approached strangers on the streets of Columbia, Missouri, and asked them to read a brief prose passage. Some read about how valuable the human body was if the organs were traded on the market—in the neighborhood of $45 million, the equivalent of “400 Porsches, 265 houses, or 45 luxury yachts.” The idea was to spark thoughts about life’s monetary worth. Others read about how the body was made up of common chemicals with a total value of about $4.50—the equivalent of “a Big Mac Value Meal at McDonald’s.”<br /><br />Then they had all the volunteers do a different word test, this one requiring word completions like coff__ and de__. These words could be completed with either death-related words like <em>coffin</em> and <em>dead</em>, or with neutral words like <em>coffee</em> and <em>deal</em>. The idea was to see how much the two different groups of volunteers were thinking about death and dying. And the findings, reported in the December issue of the journal <em>Psychological Science</em>, were again clear: As the value heuristic would predict, those who were imagining themselves as the $45 million bionic man were also focused on the inevitability of dying—much more than those primed to devalue life. Valuing life made it seem scarcer and thus more fragile.
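<br /><br />Here is a minimal sketch of how a completion measure like this can be scored. The word list and the two imaginary participants are made up for illustration; this is not the researchers’ material.<pre>
# A toy scorer (invented word list and responses, not the study's
# materials) for a word-completion measure of death accessibility.
DEATH_WORDS = {"coffin", "dead", "grave", "buried"}

def death_accessibility(completions):
    """Proportion of fragment completions that are death-related."""
    hits = sum(1 for word in completions if word.lower() in DEATH_WORDS)
    return hits / len(completions)

# Two imaginary participants: one primed to value life, one a control.
primed = ["coffin", "dead", "grave", "deal"]
control = ["coffee", "deal", "grape", "buried"]
print(f"primed:  {death_accessibility(primed):.2f}")   # 0.75
print(f"control: {death_accessibility(control):.2f}")  # 0.25
</pre>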
<br /><br />So the reality of death does not render life meaningless. Indeed, the opposite. And what’s more, when we embrace life, death is not pushed out of awareness; it lurks just outside of consciousness, easily accessible. That’s a psychological reality that Maude knew well from experience, and that 19-year-old Harold was just beginning to sense.<br /><br />For more insights into the quirks of human nature, visit the <a href="http://www.trueslant.com/wrayherbert/">“Full Frontal Psychology”</a> blog at True/Slant. Excerpts from “We’re Only Human” also appear regularly in the magazine <em>Scientific American Mind</em>. Wray Herbert’s book on the heuristic mind will be published by Crown in the fall of 2010.Wray Herberthttp://www.blogger.com/profile/02157965041515501630noreply@blogger.com0