The Neurocritic

Sunday, May 28, 2006

Some of what we know so far about hippocampal neurogenesis and depression is listed below (along with a speculation or two by The Neurocritic):

1. In rodents, electroconvulsive shock (ECS -- called ECT in humans, the "T" is for "therapy") is more effective than chronic antidepressant administration in increasing the number of new granule cells in the dentate gyrus region of the hippocampus (Malberg et al., 2000).

SPECULATION: it doesn't seem like those extra new neurons could compensate for the memory impairment observed in humans after ECT.

2. Other bad things that increase hippocampal neurogenesis include traumatic brain injury (TBI), ischemic insults, seizures, and caloric restriction (reviewed in Thomas & Peterson, 2003). We wouldn't want to use any of those as treatments for depression.

3. On the other hand, environmental enrichment (Kempermann et al., 1998) and exercise (Ernst et al., 2006) have been shown to increase neurogenesis. These are generally seen as good things that psychiatrists can recommend to their patients. However, it turns out that neurogenesis is not necessary for mice to obtain the cognitive and anxiolytic benefits of environmental enrichment (Meshi et al., in press).

4. The antidepressants that increase neurogenesis are not limited to those acting through serotonin-mediated pathways (Malberg et al., 2000; Santarelli et al., 2003). The effective drug classes include selective norepinephrine reuptake inhibitors, SNRIs (reboxetine); tricyclic antidepressants, TCAs, acting to inhibit reuptake of norepinephrine (desipramine) or of both norepinephrine and serotonin (imipramine); monoamine oxidase inhibitors, MAOIs (tranylcypromine); as well as selective serotonin reuptake inhibitors, SSRIs (fluoxetine). This has been viewed as an attractive reason for believing that neurogenesis somehow mediates treatment response, since all sorts of drugs produce the effect.

5. It does appear that hippocampal neurogenesis is necessary for the behavioral benefits of fluoxetine in the novelty-suppressed feeding task (Santarelli et al., 2003), although, as pointed out by Thomas and Peterson (2003), the mice in that study were not actually "depressed" (i.e., exposed to continued stress before getting Prozac).

The Neurocritic, although by no means an expert on hippocampal neurogenesis and depression (1), is nonetheless a skeptic, and in good company in the skeptical camp:

The concept that decreased neurogenesis might be the cause of depression is supported by the effects of stress on neurogenesis and the demonstration that neurogenesis seems to be necessary for antidepressant action. Data from the animal models tested to date show that decreasing the rate of neurogenesis does not lead to depressive behavior. Furthermore, evidence shows that an effective treatment for depression, transcranial magnetic stimulation, does not alter rates of neurogenesis. On the basis of these findings, it is suggested that neurogenesis might play a subtle role in depression but that it is not the primary factor in the final common pathway leading to depression.

Finally, the newly-generated baby neurons appear to have greater plasticity than their elders, BUT many of them have only a transient existence (Gould et al., 2001). Alas!

SUMMARY FROM THE NEUROCRITIC: Although it's all very trendy to consider neurogenesis as "The Reinvention of the Self" (see article in SEED), at this stage of the game, it's all very hyperbolic.

Friday, May 26, 2006

The Neurocritic has wondered why hippocampal neurogenesis has been touted as THE therapeutic sequela of antidepressant treatment ever since it was discovered as an interesting "side effect" of SSRI administration in rats. The hippocampus hasn't exactly been associated with mood regulation (the orbitofrontal cortex, "sad" cingulate, and amygdala are much more likely candidates). Yes, it's thought that depressed people are more stressed, and stress is bad blah blah glucocorticoids blah blah hippocampus... but other than that, is there any evidence that damaged hippocampi cause mood disorders? In focally-lesioned rodents? In amnesic humans? Could it be that hippocampal circuitry is more sensitive to a variety of insults (e.g., anoxic-ischemic events, excessive glucocorticoid levels, etc.), and that its integrity is therefore a sensitive marker for some of the pathological cellular effects of stress-induced depression? If so, fixing the ailing hippocampus itself may not be what causes remission of major depressive episodes.

And about that bureau genesis... what is that? A really brief bible placed by the Gideons in motel drawers? Oh, what? it's neurogenesis? . . . Never mind!

And about that neurogenesis. . . What do these new neurons do in adult humans? Are they fully functional?

Anyway, a new study touts neural progenitor cells as the key step for Prozac efficacy, as explained by this New Scientist article:

Prozac, one of the most common drug treatments for depression, acts by stimulating the growth of new neurons in the hippocampus. Grigori Enikolopov and his team from Cold Spring Harbor Laboratory in New York wanted to narrow down which steps in neurogenesis the drug was influencing. So they created mice with nuclei in their nerve cells that glow green during neurogenesis, making it easy to count and compare the number of developing neurons.

By tracking other factors associated with different stages of neurogenesis, Enikolopov's team found that only one step was actually influenced by Prozac: amplifying the neural progenitor cell, just downstream of the stem cell (see article in PNAS).

In the figure below, the blue arrow shows the step at which fluoxetine has its effect (Encinas et al., 2006).

...We have generated a reporter mouse line, which allows identification and classification of early neuronal progenitors. It also allows accurate quantitation of changes induced by neurogenic agents in these distinct subclasses of neuronal precursors. We use this line to demonstrate that the selective serotonin reuptake inhibitor antidepressant fluoxetine does not affect division of stem-like cells in the dentate gyrus but increases symmetric divisions of an early progenitor cell class. We further demonstrate that these cells are the sole class of neuronal progenitors targeted by fluoxetine in the adult brain and suggest that the fluoxetine-induced increase in new neurons arises as a result of the expansion of this cell class. This finding defines a cellular target for antidepressant drug therapies.

Monday, May 22, 2006

Soft signs are neurological abnormalities that are not readily localizable to a specific brain region, while hard signs provide some indication of the underlying brain systems or regions that are affected (Ismail et al., 1998).

Pre-existing, mild neurological deficits may have increased the risk of PTSD in a sample of Vietnam combat veterans, says a study published in the May 2006 issue of Archives of General Psychiatry. How did they surmise the pre-existing neurological state of the vets?

Gurvits et al. examined 45 neurological soft signs in Vietnam combat veterans and their combat-unexposed, identical co-twins. The unexposed co-twins of the combat veterans with posttraumatic stress disorder (PTSD) had significantly higher neurological soft sign scores than the unexposed co-twins of the veterans without PTSD. This result supports the conclusion that subtle neurological dysfunction represents an antecedent familial vulnerability factor for developing chronic PTSD on exposure to a traumatic event.

The authors argue that the stress → cortisol → hippocampal shrinkage → PTSD pathway [Sapolsky RM. Why stress is bad for your brain. Science. 1996;273:749-750] isn't as directly causal as the arrows might suggest. This is an expansion of their earlier work published in Nature Neuroscience.

It's incredibly impressive that they drew from a sample of 103 male identical twin pairs discordant for combat exposure in Vietnam. Forty-nine twin pairs participated in the current study: 25 pairs in which the combat veteran had PTSD, and 24 pairs in which the combat veteran never had PTSD.

Conclusions: These results replicate previous findings of increased NSSs in Vietnam combat veterans with PTSD. Furthermore, results from their combat-unexposed identical co-twins support the conclusion that subtle neurologic dysfunction in PTSD is not acquired along with the trauma or PTSD but rather represents an antecedent familial vulnerability factor for developing chronic PTSD on exposure to a traumatic event.

...to borrow a phrase from Bruce Bower, who wrote an article entitled The Bias Finders for Science News Online last month.

A test of unconscious attitudes polarizes psychologists

. . .

However, one measure -- the Implicit Association Test, or IAT -- has proved especially popular. Since its introduction in 1998, more than 250 IAT-related studies have been published. More than 3 million IATs have been completed on a Web site established by the test's major proponents -- Anthony G. Greenwald of the University of Washington in Seattle, Mahzarin R. Banaji of Harvard University, and Brian A. Nosek of the University of Virginia in Charlottesville. Other online venues run by organizations concerned about various types of discrimination also offer the IAT to visitors.

The huge IAT database contains troubling findings that have been highly publicized. For example, more than three-quarters of white and Asian test takers in the United States display an unconscious tendency to value white people over black people. Roughly half of black test takers show a pro-white bias as well. Many people who complete the IAT exhibit implicit inclinations for young versus old people and unconsciously favor men over women.

Such results challenge the traditional view in psychology that each person knows his or her social attitudes and stereotypes, Banaji says. People maintain unconscious preferences for certain social groups over others, even if they disavow those preferences when asked about them, in her view. In the post–Civil Rights era, few people admit to harboring ill will toward blacks or to acting in a racially discriminating style, but IAT results reveal a stubborn undercurrent of white favoritism with the potential to stoke bigoted behavior, in Banaji's view.

Virtually from the start, the test sparked a schism in social psychology. The IAT taps into much more than individuals' unconscious attitudes, critics contend. Familiarity with members of those groups, knowledge of cultural attitudes toward particular groups, and test-taking tactics influence IAT scores, they say.

Critics also argue that specific IAT scores are meaningless because they haven't been tied to relevant, real-world behaviors. No one should assume that he or she is unconsciously prejudiced against black people on the basis of an IAT score, these investigators hold.

Psychologist William von Hippel of the University of New South Wales in Sydney, Australia, has followed the IAT debate closely. "Rarely has a methodological tool garnered such strong adherents and detractors," von Hippel says. "The IAT should be vigorously researched and debated, but we still do not really understand what it reveals."

The graph below is from the previously-discussed article by Mitchell et al. After the fMRI session was over, the participants in that study completed a "liberal-conservative" IAT that used photos of the hypothetical persons presented for "mentalizing" judgments in the scanning session.

Participants were told that we were investigating their ability to extrapolate another person’s opinions, likes, and dislikes from a small amount of information about that person. Prior to scanning, participants were introduced to two target individuals, represented by face photographs downloaded from an Internet dating site. Each target face was accompanied by a short descriptive paragraph intended to create a sense of similarity between the participant and one target and dissimilarity between the participant and the other target. For one target (randomly determined for each participant), this paragraph described the person as having liberal sociopolitical views and participating in activities typical of many students at Northeast liberal arts colleges. For the other target, the paragraph described the person as a fundamentalist Christian with conservative political and social views who participated avidly in a variety of events sponsored by religious and Republican organizations at a Midwest university.

The authors used the IAT to retroactively assign subjects to "like liberal" and "not like liberal" groups. As the graph illustrates, only 3 subjects (out of 15 total) actually had RT effects indicating they might have a closer affinity to the conservative "other" (if you believe the IAT).

Mentalizing refers to our ability to read the mental states of other agents and engages many neural processes. The brain’s mirror system allows us to share the emotions of others. Through perspective taking, we can infer what a person currently believes about the world given their point of view. Finally, the human brain has the unique ability to represent the mental states of the self and the other and the relationship between these mental states, making possible the communication of ideas.

Human social interaction requires the recognition that other people are governed by the same types of mental states -- beliefs, desires, intentions -- that guide one’s own behavior. We used functional neuroimaging to examine how perceivers make mental state inferences when such self-other overlap can be assumed (when the other is similar to oneself) and when it cannot (when the other is dissimilar from oneself). We observed a double dissociation such that mentalizing about a similar other engaged a region of ventral mPFC linked to self-referential thought, whereas mentalizing about a dissimilar other engaged a more dorsal subregion of mPFC. The overlap between judgments of self and similar others suggests the plausibility of "simulation" accounts of social cognition, which posit that perceivers can use knowledge about themselves to infer the mental states of others.

As described by Frith and Frith (2006), this study

"report[ed] an elegant experiment that directly investigates the effect of similarity. Participants were told about two target individuals who were described as having liberal or conservative views. They were then asked to predict the feelings and attitudes of these two targets in various situations (e.g., "would he enjoy having a roommate from a different country"). Subsequently the political attitudes of the participants were also assessed. The results show a different pattern when thinking about a similar or a dissimilar other. Thinking about similar others was associated with activity in ventral mPFC (18, 57, 9 -- in the region labeled anterior rostral MFC in Amodio and Frith 2006), while thinking about a dissimilar other was associated with activity in a more dorsal region of mPFC (29, 45, 42 -- posterior rostral MFC).

This is strong evidence for segregation of function within the area of medial prefrontal cortex associated with mentalizing." [my emphasis]

It couldn't have anything to do with merely agreeing or disagreeing with "the other" now, could it? It really comes down to extracting the putative mental contents from the brain of a hypothetical individual, does it? Not different affective states engendered by the agreement or disagreement? Not semantic knowledge about how certain "types" of people may act in certain situations? Hmmmm?

Does the first bar graph (below left) mean that liberals are a little less hostile to conservatives than vice versa? Does the other bar graph (below right) mean that the “Not Like Me” area in liberals is equally activated by “self” and “conservative other”?? What DOES it all mean?

Friday, May 12, 2006

When the minds of non-autistic people are "idle", a network within the brain involved in social and emotional thought is, in fact, active. People often drift into daydreams at these times, but when we have to concentrate on a task, we suppress daydreaming.

A team from the University of California at San Diego used functional MRI to show that while this network is more active in non-autistic people when their brains are resting than when carrying out a cognitive test, there is no difference between the active and resting brains of people with autism (Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.0600674103).

THURSDAY, May 11 (HealthDay News) -- Stopped at a red light or waiting in a doctor's office, people's idle thoughts may focus on themselves, other people in their lives, nearby strangers or their plans for the day.

But a new brain-imaging study suggests the minds of autistic individuals do not engage in these "daydreams" about themselves or other people whenever their brains are free to wander.

. . .

The interconnected network of brain sites that supports daydreaming also "supports thinking about other people, emotional processing and the processing of familiar faces -- all things that we know are abnormal in autism at a behavioral level," said Daniel Kennedy, a graduate student in neurosciences and psychology at the University of California, San Diego, who conducted the work while at the Center for Autism Research at the Children's Hospital Research Center in La Jolla, Calif.

And the quality of information contained in this article is much better than most coverage I've seen in the popular press.

According to Kennedy, what people think of as daydreaming isn't a low-energy task, neurologically speaking.

"It's actually got a very high metabolism -- it's using lots of oxygen, glucose, the neurons are really firing," he said.

This activity is spread throughout a number of key sites in both the brain's executive centers in the forebrain, as well as areas toward the back of the brain. "It's a really distributed set of brain regions. That's why we call it a 'network,'" Kennedy said. "We call it a 'resting network;' other people have called it a 'default mode' of brain functioning."

In the normally functioning brain, this resting network tends to focus on the self or the self's interaction with others. "Some people think that maybe it has a lot to do with the construction of the self and self-awareness," Kennedy said.

He and [co-researcher Elizabeth] Redcay suspected the resting network might work differently in people with autism, however. First, they knew that studies of autistic brains had shown anatomical abnormalities in regions of the resting network. One area, the medial-frontal cortex, "actually grows too big and too fast" in people with autism, Kennedy said.

Autistic people also tend to have trouble with behaviors specifically linked to areas that make up the resting network -- social interactions, face processing and emotions.

. . .

In their experiments using functional (real-time) MRI scans, he and Redcay compared changes in brain-energy usage in 15 people with autism-spectrum disorders (ranging from autism to less-severe conditions such as Asperger's syndrome) vs. 14 healthy controls.

They first measured the brain activity of each study participant at rest. Then, they had each participant engage in the Stroop test -- a standard visual test of attention and cognition that has been used by experimental psychologists for decades. "Previous work has shown that autistics and non-autistics handle this task equally," Kennedy said.

As expected, the brains of healthy controls switched their focus of energy-usage away from the resting network to other cognitive centers as they struggled to solve the Stroop test.

"The resting network shuts down or 'deactivates,' so you can perform the task and let other regions take over," Kennedy explained.

But the researchers saw no such deactivation in the brains of autistic people. "The resting network shuts down in normal subjects, because it was already running high during rest. We didn't see a similar shutdown in autistic subjects -- because it wasn't ever there to begin with," he said.

In other words, it looks as if autistic people may not daydream -- at least not in the way non-autistic people do, Kennedy said.

"We also found that the more abnormal the neural activity in this resting network, the more abnormal their social behaviors," Kennedy said, suggesting that impairment of the resting network increases as autism becomes more severe.

So, what are autistic people thinking about when not engaged in specific tasks? Kennedy said that, so far, researchers have found it tough to get good answers to that question from severely impaired autistic individuals. "We do know, though, that they tend to have repetitive, stereotyped thoughts -- their attention is drawn to things like calendars, schedules, maps, computers -- rigid, concrete things," he said.

"We think that they probably have a much different internal process at rest," Kennedy said.

He stressed that the new findings don't answer the central question of what causes autism, although they do offer tantalizing clues.

"We know that autism is a neurodevelopmental disorder, so what's happening in the first couple years of life are going to be crucial to understanding what is causing autism," he said. "Right now, though, we know very little about this network in early development. In adults, these regions have an extremely high metabolism and use a high amount of energy. So, if that energy supply was cut off or impaired, you'd think that these regions might be the first to be affected," Kennedy said.

Non-neuroscientists and neuroscientists can actually learn something new (and informative) about autism and the brain from this article!

Or rather, "No-Word," because they still haven't posted the pheromone article. It's annoying that their manuscripts are released to the press so far in advance of when they're available to the rest of us without press credentials (you know, the public, e.g., scientists).

However, viewers of The L Word will be happy to hear that lesbians aren't just like heterosexual men, after all!

Monday, May 08, 2006

May 8, 2006 -- Lesbian women and heterosexual women respond differently to the scent of human pheromones, a new study shows.

. . .

The researchers studied brain scans of lesbian women, heterosexual women, and heterosexual men while those people smelled scents including two potential human pheromones.

Brain scans taken while smelling those pheromones were more similar for lesbian women and heterosexual men than for lesbian women and heterosexual women, the researchers report. Their study appears in the online early edition of Proceedings of the National Academy of Sciences. [Not yet, check back tomorrow.]

Last year, Savic's team published a study showing that homosexual men and heterosexual women had similar brain activity patterns when smelling those same human pheromone candidates. [See below.]

The testosterone derivative 4,16-androstadien-3-one (AND) and the estrogen-like steroid estra-1,3,5(10),16-tetraen-3-ol (EST) are candidate compounds for human pheromones. AND is detected primarily in male sweat, whereas EST has been found in female urine. In a previous positron emission tomography study, we found that smelling AND and EST activated regions covering sexually dimorphic nuclei of the anterior hypothalamus, and that this activation was differentiated with respect to sex and compound. In the present study, the pattern of activation induced by AND and EST was compared among homosexual men, heterosexual men, and heterosexual women. In contrast to heterosexual men, and in congruence with heterosexual women, homosexual men displayed hypothalamic activation in response to AND. Maximal activation was observed in the medial preoptic area/anterior hypothalamus, which, according to animal studies, is highly involved in sexual behavior. As opposed to putative pheromones, common odors were processed similarly in all three groups of subjects and engaged only the olfactory brain (amygdala, piriform, orbitofrontal, and insular cortex). These findings show that our brain reacts differently to the two putative pheromones compared with common odors, and suggest a link between sexual orientation and hypothalamic neuronal processes.

Lesbian and heterosexual women showed different patterns of brain activity while sniffing AND and EST, the study shows.

While smelling AND and EST, the brain activity pattern for lesbian women was closer to that of heterosexual men than heterosexual women, Savic and colleagues note.

However, the previously reported similarities between brain activity for heterosexual women and homosexual men while sniffing the pheromones were stronger than those between lesbian women and heterosexual men.

The pheromones didn't necessarily have a sexy smell. "None of our subjects reported sexual arousal" while whiffing any of the scents, the researchers write.

Heterosexual women found the male and female pheromones about equally pleasant, while straight men and lesbians liked the female pheromone more than the male one. Men and lesbians also found the male hormone more irritating than the female one, while straight women were more likely to be irritated by the female hormone than the male one.

All three groups rated the male hormone more familiar than the female one. Straight women found both hormones about equal in intensity, while lesbians and straight men found the male hormone more intense than the female one.

The brains of all three groups were scanned when sniffing male and female hormones and a set of four ordinary odors. Ordinary odors were processed in the brain circuits associated with smell in all the volunteers.

In heterosexual males the male hormone was processed in the scent area but the female hormone was processed in the hypothalamus, which is related to sexual stimulation. In straight women the sexual area of the brain responded to the male hormone while the female hormone was perceived by the scent area.

In lesbians, both male and female hormones were processed the same, in the basic odor processing circuits, Savic and her team reported.

Arginine vasopressin (AVP) and related peptides affect social behaviors in numerous species, but AVP influences on human social functions have not yet been established. Here, we describe how intranasal AVP administration differentially affects social communication in men and women, and we propose a mechanism through which it may exert those influences. In men, AVP stimulates agonistic facial motor patterns in response to the faces of unfamiliar men and decreases perceptions of the friendliness of those faces. In contrast, in women, AVP stimulates affiliative facial motor patterns in response to the faces of unfamiliar women and increases perceptions of the friendliness of those faces. AVP also affected autonomic responsiveness to threatening faces and increased anxiety, which may underlie both communication patterns by promoting different social strategies in stressful contexts in men and women.

Sunday, May 07, 2006

The Neurocritic is not an economist and is not fond of the field of neuroeconomics. Nonetheless, to understand the narrow definition of "dread" used by Berns et al. (2006) in their article, "Neurobiological Substrates of Dread," one needs to consult some old articles on economic theory:

"...the term 'savouring' refers to positive utility derived from anticipation of future consumption; 'dread' refers to negative utility resulting from contemplation of the future."

As one can discern from my previous two "meta-commentaries" on this article, I was bothered by the pretentiousness of a title claiming to have a neurobiological explanation of the very complex affective state known as "dread." The construct under study here would seemingly be better described as "anticipation," specifically anticipation of a well-defined cutaneous shock to the dorsum of the left foot.

The paper by Berns et al. aims to explain why people will sometimes choose to delay gratification and to speed up the receipt of unpleasant outcomes (most specifically, the latter... although I would imagine a corporation-sponsored study on the former will be forthcoming from some lab or other).1 Anyway, the act of waiting (or anticipation) can take on positive or negative dimensions.

George Loewenstein, "Anticipation and the valuation of delayed consumption." The Economic Journal 97: 666-684 (1987).

This paper presents a model of intertemporal choice that incorporates "savoring" and "dread," i.e., utility from anticipation of delayed consumption. The model explains why an individual with positive time preference may delay desirable outcomes or get unpleasant outcomes over with quickly, contrary to the prediction of conventional formulations of intertemporal choice. Implications of savoring and dread for savings behavior, empirical estimation of discount rates, and public policy efforts to combat myopic behavior are explored. The model provides an explanation for common violations of the independence axiom as applied to intertemporal choice. Copyright 1987 by Royal Economic Society.
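To make the savoring/dread idea concrete, here is a stylized numerical sketch (my own toy parameterization, not Loewenstein's actual functional form): while waiting, the agent accrues a fraction of the outcome's (dis)utility each period, and the outcome itself is exponentially discounted.

```python
def total_utility(outcome_utility, delay, discount=0.95, anticipation_weight=0.3):
    """Stylized intertemporal utility with an anticipation term.
    Hypothetical parameterization illustrating Loewenstein (1987);
    not his published model."""
    # Dread (or savoring) accrued each period while waiting:
    anticipation = sum(anticipation_weight * outcome_utility * discount**t
                       for t in range(delay))
    # The discounted (dis)utility of the outcome itself:
    consumption = outcome_utility * discount**delay
    return anticipation + consumption

shock = -10.0  # an unpleasant outcome
print(total_utility(shock, delay=0))  # -10.0: shock now, no dread
print(total_utility(shock, delay=5))  # ≈ -21.31: five periods of dread
```

Note the reversal: with the anticipation term set to zero, pure discounting makes the delayed shock strictly less bad than the immediate one, so the agent should always wait; with dread accruing during the wait, getting it over with quickly maximizes utility, which is exactly the behavior Berns et al. set out to explain.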

My problem with the Berns et al. study is with the interpretation of their results:

The manifestation of dread in the more posterior elements of the pain matrix informs our understanding of what dread is and how it impacts decision-making. ... Although dread is usually thought of as an emotion based on fear and anxiety (Berridge, 1999), our localization of dread to the posterior elements of the matrix suggests that dread has a substantial attentive component. (p. 756)

Here, the authors commit the logical fallacy known as "reverse inference" by inferring the participants' emotional state from the observed pattern of brain activity. They discount the role of the amygdala in "dread" because both moderate and extreme dreaders showed elevated hemodynamic responses there during the unpleasant interval of waiting for the shocks.

Taken together, the anatomical locations of dread responses suggest that the subjective experience of dread that ultimately drives an individual's behavior comes from the attention devoted to the expected physical response (SI, SII, the caudal ACC, and the posterior insula) and not simply a fear or anxiety response.

So anticipation of pain is "attention," not fear and anxiety. It's a little early to draw that conclusion.

There is much interest currently in using functional neuroimaging techniques to understand better the nature of cognition. One particular practice that has become common is ‘reverse inference’, by which the engagement of a particular cognitive process is inferred from the activation of a particular brain region. Such inferences are not deductively valid, but can still provide some information. Using a Bayesian analysis of the BrainMap neuroimaging database, I characterize the amount of additional evidence in favor of the engagement of a cognitive process that can be offered by a reverse inference. Its usefulness is particularly limited by the selectivity of activation in the region of interest. I argue that cognitive neuroscientists should be circumspect in the use of reverse inference, particularly when selectivity of the region in question cannot be established or is known to be weak.
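Poldrack's point can be made with a toy Bayesian calculation (the probabilities below are hypothetical): the posterior probability that a cognitive process was engaged, given activation in a region, depends critically on how selectively that region activates.

```python
def posterior(prior, p_active_given_process, p_active_given_not):
    """P(process engaged | region active), by Bayes' rule."""
    p_active = (p_active_given_process * prior
                + p_active_given_not * (1 - prior))
    return p_active_given_process * prior / p_active

# Hypothetical numbers: suppose "attention" tasks activate the region
# often, but so do many other kinds of tasks.
prior = 0.5
low_selectivity = posterior(prior, 0.8, 0.6)   # region activates broadly
high_selectivity = posterior(prior, 0.8, 0.1)  # region is selective

print(round(low_selectivity, 2))   # 0.57: barely above the prior
print(round(high_selectivity, 2))  # 0.89: activation is informative
```

When the region activates for many other processes (low selectivity), observing activation barely moves the posterior above the prior, which is precisely the worry about inferring "attention" from the posterior pain matrix.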

Friday, May 05, 2006

Brain research has proven that the word "dread" should no longer mean "fear" or "extreme uneasiness." Time to update those anachronistic dictionary entries!

Main Entry: 1dread
Function: verb
Etymology: Middle English dreden, from Old English drǣdan
transitive senses
1 a : to fear greatly  b archaic : to regard with awe
2 : to feel extreme reluctance to meet or face
intransitive senses : to be apprehensive or fearful

Main Entry: 2dread
Function: noun
1 a : great fear especially in the face of impending evil  b : extreme uneasiness in the face of a disagreeable prospect  c archaic : AWE
2 : one causing fear or awe

But seriously, did we really need an expensive fMRI study to tell us this:

For those who dread a colonoscopy or a root canal so much that they avoid it altogether, scientists have good news.

The first study ever to look at where sensations of dread arise in the brain finds that contrary to what is widely believed, dread does not involve fear and anxiety in the moment of an unpleasant event. Instead, it derives from the attention that people devote beforehand to what they think will be extremely unpleasant.

Given the choice of waiting for an adverse outcome or getting it over with quickly, many people choose the latter. Theoretical models of decision-making have assumed that this occurs because there is a cost to waiting—i.e., dread. Using functional magnetic resonance imaging, we measured the neural responses to waiting for a cutaneous electric shock. Some individuals dreaded the outcome so much that, when given a choice, they preferred to receive more voltage rather than wait. Even when no decision was required, these extreme dreaders were distinguishable from those who dreaded mildly by the rate of increase of neural activity in the posterior elements of the cortical pain matrix. This suggests that dread derives, in part, from the attention devoted to the expected physical response and not simply from fear or anxiety. Although these differences were observed during a passive waiting procedure, they correlated with individual behavior in a subsequent choice paradigm, providing evidence for a neurobiological link between the experienced disutility of dread and subsequent decisions about unpleasant outcomes.

Is the anticipation of pain really as bad as experiencing it? The Associated Press seems to think so:

WASHINGTON (AP) -- Anyone who's ever taken a preschooler to the doctor knows they often cry more before the shot than afterward. Now researchers using brain scans to unravel the biology of dread have an explanation: For some people, anticipating pain is truly as bad as experiencing it.

How bad? Among people who volunteered to receive electric shocks, almost a third opted for a stronger zap if they could just get it over with, instead of having to wait.

More importantly, the research found that how much attention the brain pays to expected pain determines whether someone is an "extreme dreader" -- suggesting that simple diversions could alleviate the misery.

The research, published Friday in the journal Science, is part of a burgeoning new field called neuroeconomics that uses brain imaging to try to understand how people make choices.

PRELIMINARY SUMMARY from The Neurocritic (until I can read the entire Science paper) is this quote from Susan Sontag:

"The best emotions to write out of are anger and fear or dread. If you have emotions like that you just sail."

Wednesday, May 03, 2006

The Neurocritic did not find any news media coverage of the topic of Sunday's posting on Quotidian Virtual Violence. Two speculations on the lack of media hoopla were (1) no press releases from the journal (Human Brain Mapping) or the authors' institutions (Aachen University and Michigan State); and (2) moderating comments in the article's Discussion,

One might speculate that a frequent training of aggressive neuronal pattern leads to the development of aggressive problem-solving scripts, hostile attribution biases, and normative beliefs approving of aggression as stated by social-cognitive theory [Bandura, 2001]. This proposition with its important implications, however, is not a direct conclusion of this study's findings.

and Conclusion,

Although we observed patterns of suppressed affective structures induced by virtual violent interactions, the current experiment does not prove whether the rehearsal of such a mechanism can promote aggressive behavior in real life.

Well, although the journal article was received by HBM on 29 August 2005 and accepted on 13 December 2005, it was featured on NewScientist.com on 23 June 2005, more than two months before it was even submitted. So old news after all!

YOU know that just round the corner is a man who wants to kill you. Your heart is pounding and your hands are sweating - even though this is only a video game. But what is happening in your brain?

A small study of brain activity in video-game veterans suggests that their brains react as if they are treating the violence as real.

What's most interesting about the whole affair is a comparison of the quotes in the final article (above) and what the authors said to the press last year, here,

It is impossible to scan people's brains during acts of real aggression so Mathiak argues that this is as close as you can get to the real thing. It suggests that video games are a "training for the brain to react with this pattern," he says.

"There is a causal link between playing the first-person shooting game in our experiment and brain-activity pattern that are considered as characteristic for aggressive cognitions and affects," said René Weber, assistant professor of communication and telecommunication at MSU and a researcher on the project. "There is a neurological link and there is a short-term causal relationship.

"Violent video games frequently have been criticized for enhancing aggressive reactions such as aggressive cognitions, aggressive affects or aggressive behavior. On a neurobiological level we have shown the link exists."

SUMMARY from The Neurocritic: Nice to see that the reviewers and/or editors at Human Brain Mapping take their jobs seriously!

This study aims to advance the media effects debate concerning violent video games. Meta-analytic reviews reveal a small but noticeable association between playing violent video games and aggressive reactions. However, evidence for causal associations is still rare. In a novel, event-related functional magnetic resonance imaging study, 13 male research participants were observed playing a latest-generation violent video game. Each participant's game play was recorded and content analyzed on a frame-by-frame basis. Onscreen activities were coded as either "passive/dead, no interactions"; "active/safe, no imminent danger/no violent interactions"; "active/potential danger occurs, violent interactions expected"; "active/under attack, some violent interactions"; and "active/fighting and killing, many violent interactions." Previous studies in neuroscience on aggressive thoughts and behaviors suggested that virtual violence would suppress affective areas of the anterior cingulate cortex (ACC) and the amygdala subsequent to activity variations at cognitive areas of the ACC. Comparison of game play activities with and without virtual violence in 11 participants confirmed the hypothesis. The rather large observed effects can be considered as caused by the virtual violence. We discuss the applicability of neuroscience methodology in media effects studies, with a special emphasis on the assumption of virtuality prevalent in video game play.

About Me

Born in West Virginia in 1980, The Neurocritic embarked upon a roadtrip across America at the age of thirteen with his mother. She abandoned him when they reached San Francisco and The Neurocritic descended into a spiral of drug abuse and prostitution. At fifteen, The Neurocritic's psychiatrist encouraged him to start writing as a form of therapy.