Archive for the ‘brain’ Category

You may have heard that multitasking is bad for you, but new studies show that it kills your performance and may even damage your brain. Every time you multitask you aren’t just harming your performance in the moment; you may very well be damaging an area of your brain that’s critical to your future success at work.

Research conducted at Stanford University found that multitasking is less productive than doing a single thing at a time. The researchers found that people who are regularly bombarded with several streams of electronic information cannot pay attention, recall information, or switch from one job to another as well as those who complete one task at a time.

A Special Skill?

But what if some people have a special gift for multitasking? The Stanford researchers compared groups of people based on their tendency to multitask and their belief that it helps their performance. They found that heavy multitaskers—those who multitask a lot and feel that it boosts their performance—were actually worse at multitasking than those who like to do a single thing at a time. The frequent multitaskers performed worse because they had more trouble organizing their thoughts and filtering out irrelevant information, and they were slower at switching from one task to another.

Multitasking reduces your efficiency and performance because your brain can only focus on one thing at a time. When you try to do two things at once, your brain lacks the capacity to perform both tasks successfully.

Multitasking Lowers IQ

Research also shows that, in addition to slowing you down, multitasking lowers your IQ. A study at the University of London found that participants who multitasked during cognitive tasks experienced declines in IQ score similar to those seen after smoking marijuana or staying up all night. For men who multitasked, IQ drops of 15 points lowered their scores into the average range of an 8-year-old child.

So the next time you’re writing your boss an email during a meeting, remember that your cognitive capacity is being diminished to the point that you might as well let an 8-year-old write it for you.

Brain Damage From Multitasking?

It was long believed that cognitive impairment from multitasking was temporary, but new research suggests otherwise. Researchers at the University of Sussex in the UK compared the amount of time people spend on multiple devices (such as texting while watching TV) to MRI scans of their brains. They found that high multitaskers had less brain density in the anterior cingulate cortex, a region responsible for empathy as well as cognitive and emotional control.

While more research is needed to determine if multitasking is physically damaging the brain (versus existing brain damage that predisposes people to multitask), it’s clear that multitasking has negative effects.

As one of the Sussex researchers put it: “I feel that it is important to create an awareness that the way we are interacting with the devices might be changing the way we think, and these changes might be occurring at the level of brain structure.”

The EQ Connection

Nothing turns people off quite like fiddling with your phone or tablet during a conversation. Multitasking in meetings and other social settings indicates low Self- and Social Awareness, two emotional intelligence (EQ) skills that are critical to success at work. TalentSmart has tested more than a million people and found that 90% of top performers have high EQs. If multitasking does indeed damage the anterior cingulate cortex (a key brain region for EQ) as current research suggests, doing so will lower your EQ while it alienates your coworkers.

Bringing It All Together

If you’re prone to multitasking, this is not a habit you’ll want to indulge—it clearly slows you down and decreases the quality of your work. Even if it doesn’t cause brain damage, allowing yourself to multitask will fuel any existing difficulties you have with concentration, organization, and attention to detail.

St. Jude Children’s Research Hospital scientists have linked disruption of a brain circuit associated with schizophrenia to an age-related decline in levels of a single microRNA in one brain region.

St. Jude Children’s Research Hospital scientists have identified a small RNA (microRNA) that may be essential to restoring normal function in a brain circuit associated with the “voices” and other hallucinations of schizophrenia. The microRNA provides a possible focus for antipsychotic drug development. The findings appear today in the journal Nature Medicine.

The work was done in a mouse model of a human disorder that is one of the genetic causes of schizophrenia. Building on previous St. Jude research, the results offer important new details about the molecular mechanism that disrupts the flow of information along a neural circuit connecting two brain regions involved in processing auditory information. The findings also provide clues about why psychotic symptoms of schizophrenia are often delayed until late adolescence or early adulthood.

“In 2014, we identified the specific circuit in the brain that is targeted by antipsychotic drugs. However, the existing antipsychotics also cause devastating side effects,” said corresponding author Stanislav Zakharenko, M.D., Ph.D., a member of the St. Jude Department of Developmental Neurobiology. “In this study, we identified the microRNA that is a key player in disruption of that circuit and showed that depletion of the microRNA was necessary and sufficient to inhibit normal functioning of the circuit in the mouse models.

“We also found evidence suggesting that the microRNA, named miR-338-3p, could be targeted for development of a new class of antipsychotic drugs with fewer side effects.”

There are more than 2,000 microRNAs whose function is to silence expression of particular genes and regulate the supply of the corresponding proteins. Working in a mouse model of 22q11 deletion syndrome, researchers identified miR-338-3p as the microRNA that regulates production of the protein D2 dopamine receptor (Drd2), which is the prime target of antipsychotics.

Individuals with the deletion syndrome are at risk for behavior problems as children. Between 23 and 43 percent develop schizophrenia, a severe chronic disorder that affects thinking, memory and behavior. Researchers at St. Jude are studying schizophrenia and other brain disorders to improve understanding of how normal brains develop, which provides insights into the origins of diseases like cancer.

The scientists reported that Drd2 increased in the brain’s auditory thalamus when levels of the microRNA declined. Previous research from Zakharenko’s laboratory linked elevated levels of Drd2 in the auditory thalamus to brain-circuit disruptions in the mutant mice. Investigators also reported that the protein was elevated in the same brain region of individuals with schizophrenia, but not healthy adults.

Individuals with the deletion syndrome are missing part of chromosome 22, which leaves them with one rather than the normal two copies of more than 25 genes. The missing genes included Dgcr8, which facilitates production of microRNAs.

Working in mice, researchers have now linked the 22q11 deletion syndrome and deletion of a single Dgcr8 gene to age-related declines in miR-338-3p in the auditory thalamus. The decline was associated with an increase in Drd2 and reduced signaling in the circuit that links the thalamus and auditory cortex, a brain region implicated in auditory hallucination. Levels of miR-338-3p were lower in the thalamus of individuals with schizophrenia compared to individuals of the same age and sex without the diagnosis.

The miR-338-3p depletion did not disrupt other brain circuits in the mutant mice, and the findings offer a possible explanation. Researchers found that miR-338-3p levels were higher in the thalamus than in other brain regions. In addition, miR-338-3p was one of the most abundant microRNAs present in the thalamus.

Replenishing levels of the microRNA in the auditory thalamus of mutant mice reduced Drd2 protein and restored the circuit to normal functioning. That suggests that the microRNA could be the basis for a new class of antipsychotic drugs that act in a more targeted manner with fewer side effects. Antipsychotic drugs, which target Drd2, also restored circuit function.

The findings provide insight into the age-related delay in the onset of schizophrenia symptoms. Researchers noted that microRNA levels declined with age in all mice, but that mutant mice began with lower levels of miR-338-3p. “A minimum level of the microRNA may be necessary to prevent excessive production of the Drd2 that disrupts the circuit,” Zakharenko said. “While miR-338-3p levels decline as normal mice age, levels may remain above the threshold necessary to prevent overexpression of the protein. In contrast, the deletion syndrome may leave mice at risk for dropping below that threshold.”
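Zakharenko’s threshold explanation can be sketched as a toy model: the circuit fails only when miR-338-3p falls below some minimum level, levels decline with age in both groups, and only animals that start lower ever cross the line. Every number below is invented purely for illustration; nothing here comes from the study.

```python
# Toy threshold model of the age-related onset idea (invented numbers).
THRESHOLD = 40.0  # hypothetical minimum miR-338-3p level that keeps Drd2 in check

def mir_level(age, baseline):
    """Assume miR-338-3p declines linearly with age from a starting baseline."""
    return baseline - 2.0 * age

def circuit_disrupted(age, baseline):
    """Drd2 is overexpressed, and the circuit disrupted, below the threshold."""
    return mir_level(age, baseline) < THRESHOLD

# Normal mice start higher and stay above the threshold as they age;
# the 22q11-model mice start lower and cross it in 'late adolescence'.
for age in (5, 15, 25):
    print(age, "normal:", circuit_disrupted(age, baseline=100.0),
          "22q11 model:", circuit_disrupted(age, baseline=68.0))
```

In this caricature both groups decline at the same rate; only the starting point differs, which is enough to delay symptom onset in one group and avoid it entirely in the other.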

The study’s first authors are Sungkun Chun, Fei Du and Joby Westmoreland, all formerly of St. Jude. The other authors are Seung Baek Han, Yong-Dong Wang, Donnie Eddins, Ildar Bayazitov, Prakash Devaraju, Jing Yu, Marcia Mellado Lagarde and Kara Anderson, all of St. Jude.

An implant that beams instructions out of the brain has been used to restore movement in paralysed primates for the first time, say scientists.

Rhesus monkeys were paralysed in one leg due to a damaged spinal cord. The team at the Swiss Federal Institute of Technology bypassed the injury by sending the instructions straight from the brain to the nerves controlling leg movement. Experts said the technology could be ready for human trials within a decade.

Spinal-cord injuries block the flow of electrical signals from the brain to the rest of the body, resulting in paralysis. It is a wound that rarely heals, but one potential solution is to use technology to bypass the injury.

In the study, a chip was implanted into the part of the monkeys’ brain that controls movement. Its job was to read the spikes of electrical activity that are the instructions for moving the legs and send them to a nearby computer. It deciphered the messages and sent instructions to an implant in the monkey’s spine to electrically stimulate the appropriate nerves. The process all takes place in real time. The results, published in the journal Nature, showed the monkeys regained some control of their paralysed leg within six days and could walk in a straight line on a treadmill.
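The real-time loop described above (an implanted chip reads motor-cortex spikes, a computer decodes them, and a spinal implant stimulates the matching nerves) can be caricatured in a few lines. Everything below is a hypothetical stand-in; the actual decoder and stimulation protocol are far more sophisticated.

```python
import random

random.seed(0)  # reproducible toy output

def read_cortical_spikes():
    """Stand-in for the brain implant: return a motor-cortex firing rate in Hz."""
    return random.uniform(0.0, 100.0)

def decode(rate, threshold=50.0):
    """Stand-in decoder: map the firing rate to an intended gait phase."""
    return "swing" if rate > threshold else "stance"

def stimulate(phase):
    """Stand-in for the spinal implant: choose which nerve group to pulse."""
    return "flexor burst" if phase == "swing" else "extensor burst"

# A few iterations of the loop; in the monkeys this runs continuously in real time.
for _ in range(3):
    print(stimulate(decode(read_cortical_spikes())))
```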

Dr Gregoire Courtine, one of the researchers, said: “This is the first time that a neurotechnology has restored locomotion in primates.” He told the BBC News website: “The movement was close to normal for the basic walking pattern, but so far we have not been able to test the ability to steer.” The technology used to stimulate the spinal cord is the same as that used in deep brain stimulation to treat Parkinson’s disease, so it would not be a technological leap to do the same tests in patients. “But the way we walk is different to primates, we are bipedal and this requires more sophisticated ways to stimulate the muscle,” said Dr Courtine.

Jocelyne Bloch, a neurosurgeon from the Lausanne University Hospital, said: “The link between decoding of the brain and the stimulation of the spinal cord is completely new. For the first time, I can imagine a completely paralysed patient being able to move their legs through this brain-spine interface.”

Using technology to overcome paralysis is a rapidly developing field:
Brainwaves have been used to control a robotic arm
Electrical stimulation of the spinal cord has helped four paralysed people stand again
An implant has helped a paralysed man play a guitar-based computer game

Dr Mark Bacon, the director of research at the charity Spinal Research, said: “This is quite impressive work. Paralysed patients want to be able to regain real control, that is voluntary control of lost functions, like walking, and the use of implantable devices may be one way of achieving this. The current work is a clear demonstration that there is progress being made in the right direction.”

Dr Andrew Jackson, from the Institute of Neuroscience at Newcastle University, said: “It is not unreasonable to speculate that we could see the first clinical demonstrations of interfaces between the brain and spinal cord by the end of the decade.” However, he said, rhesus monkeys used all four limbs to move and only one leg had been paralysed, so it would be a greater challenge to restore the movement of both legs in people. “Useful locomotion also requires control of balance, steering and obstacle avoidance, which were not addressed,” he added.

The other approach to treating paralysis involves transplanting cells from the nasal cavity into the spinal cord to try to biologically repair the injury. Following this treatment, Darek Fidyka, who was paralysed from the chest down in a knife attack in 2010, can now walk using a frame.

Study paves way for personnel such as drone operators to have electrical pulses sent into their brains to improve effectiveness in high pressure situations.

US military scientists have used electrical brain stimulators to enhance mental skills of staff, in research that aims to boost the performance of air crews, drone operators and others in the armed forces’ most demanding roles.

The successful tests of the devices pave the way for servicemen and women to be wired up at critical times of duty, so that electrical pulses can be beamed into their brains to improve their effectiveness in high pressure situations.

The brain stimulation kits use five electrodes to send weak electric currents through the skull and into specific parts of the cortex. Previous studies have found evidence that by helping neurons to fire, these minor brain zaps can boost cognitive ability.

The technology is seen as a safer alternative to prescription drugs, such as modafinil and Ritalin, both of which have been used off-label as performance-enhancing drugs in the armed forces.

But while electrical brain stimulation appears to have no harmful side effects, some experts say its long-term safety is unknown, and raise concerns about staff being forced to use the equipment if it is approved for military operations.

Others are worried about the broader implications for the general workforce as this unregulated technology advances.

In a new report, scientists at Wright-Patterson Air Force Base in Ohio describe how the performance of military personnel can slump soon after they start work if the demands of the job become too intense.

“Within the air force, various operations such as remotely piloted and manned aircraft operations require a human operator to monitor and respond to multiple events simultaneously over a long period of time,” they write. “With the monotonous nature of these tasks, the operator’s performance may decline shortly after their work shift commences.”


But in a series of experiments at the air force base, the researchers found that electrical brain stimulation can improve people’s multitasking skills and stave off the drop in performance that comes with information overload. Writing in the journal Frontiers in Human Neuroscience, they say that the technology, known as transcranial direct current stimulation (tDCS), has a “profound effect”.

For the study, the scientists had men and women at the base take a test developed by Nasa to assess multitasking skills. The test requires people to keep a crosshair inside a moving circle on a computer screen, while constantly monitoring and responding to three other tasks on the screen.

To investigate whether tDCS boosted people’s scores, half of the volunteers had a constant two milliamp current beamed into the brain for the 36-minute-long test. The other half formed a control group and had only 30 seconds of stimulation at the start of the test.

According to the report, the brain stimulation group started to perform better than the control group four minutes into the test. “The findings provide new evidence that tDCS has the ability to augment and enhance multitasking capability in a human operator,” the researchers write. Larger studies must now look at whether the improvement in performance is real and, if so, how long it lasts.
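The comparison behind that claim is a simple two-group design: full stimulation versus a brief sham. With made-up scores (these are not the study’s data), the size of such an effect is typically summarised by the difference in group means plus an effect size such as Cohen’s d:

```python
from statistics import mean, stdev

# Hypothetical multitasking-test scores (0-100); NOT the study's data.
tdcs_group = [72, 68, 75, 70, 74, 69, 73, 71]     # 2 mA for the full 36 minutes
control_group = [61, 65, 58, 63, 60, 64, 59, 62]  # 30 s of stimulation only

def cohens_d(a, b):
    """Effect size: difference in means scaled by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / pooled_var ** 0.5

print(f"mean difference: {mean(tdcs_group) - mean(control_group):.1f} points")
print(f"Cohen's d: {cohens_d(tdcs_group, control_group):.2f}")
```

The “is the improvement real?” question the researchers raise is exactly about whether such a difference would survive larger samples and replication.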

The tests are not the first to claim beneficial effects from electrical brain stimulation. Last year, researchers at the same US facility found that tDCS seemed to work better than caffeine at keeping military target analysts vigilant after long hours at the desk. Brain stimulation has also been tested for its potential to help soldiers spot snipers more quickly in VR training programmes.

Neil Levy, deputy director of the Oxford Centre for Neuroethics, said that compared with prescription drugs, electrical brain stimulation could actually be a safer way to boost the performance of those in the armed forces. “I have more serious worries about the extent to which participants can give informed consent, and whether they can opt out once it is approved for use,” he said. “Even for those jobs where attention is absolutely critical, you want to be very careful about making it compulsory, or there being a strong social pressure to use it, before we are really sure about its long-term safety.”

But while the devices may be safe in the hands of experts, the technology is freely available, because the sale of brain stimulation kits is unregulated. They can be bought on the internet or assembled from simple components, which raises a greater concern, according to Levy: young people whose brains are still developing may be tempted to experiment with the devices and to try higher currents than those used in laboratories. “If you use high currents you can damage the brain,” he said.

In 2014 another Oxford scientist, Roi Cohen Kadosh, warned that while brain stimulation could improve performance at some tasks, it made people worse at others. In light of that work, Cohen Kadosh urged people not to use brain stimulators at home.

If the technology is proved safe in the long run though, it could help those who need it most, said Levy. “It may have a levelling-up effect, because it is cheap and enhancers tend to benefit the people that perform less well,” he said.

The social transmission of emotions has been reported in several studies in recent years. Research published in 2013, for example, found that joy and fear are transmissible between people, while a 2011 study showed that stress — as measured by an increase in cortisol — can be transmitted from others who are under pressure.1,2 Results of a new study that appeared in Science Advances suggest that pain may also be communicable.3

“Being able to perceive and communicate pain to others probably gives an evolutionary advantage to animals,” study co-author Andrey E. Ryabinin, PhD, a professor of behavioral neuroscience at Oregon Health & Science University, told Clinical Pain Advisor. Such awareness may trigger self-protective or caretaking behaviors, for instance, that facilitate the survival of the individual and the group.

In the current study, Ryabinin and colleagues investigated whether “bystander” mice would develop hyperalgesia after being housed in the same room as “primary” mice who had received a noxious stimulus. In one experiment, the paws of primary mice were injected with complete Freund’s adjuvant (CFA), which, as expected, induced persistent hypersensitivity that was apparent for 2 weeks. Bystander mice who had been injected with phosphate-buffered saline (PBS) similarly demonstrated hypersensitivity throughout the same 2-week period.

Bystander mice also displayed acquired hypersensitivity in another set of experiments in which primary mice experienced pain related to withdrawal from morphine and alcohol. This suggests that the transfer of hyperalgesia is not limited to the effects of inflammatory stimuli. In addition, the transfer was consistent across mechanical, thermal, and chemical modalities of nociception.

Tests revealed that nociceptive thresholds returned to basal levels in both primary and bystander mice within 4 days, and the transferred hyperalgesia was not accounted for by familiarity, as the effects were similar between mice that were not familiar with the others and those that were.

Finally, the authors determined that the transfer of hyperalgesia was mediated by olfactory cues (as measured by exposing naïve mice to the bedding of hypersensitive co-housed mice), and it could not be accounted for by anxiety, visual cues, or stress-induced hyperalgesia.

Future research is needed to pinpoint the molecular messenger involved in the transfer of hyperalgesia, and whether a similar process occurs in humans.

“Here we show for the first time that you do not need an injury or inflammation to develop a pain state; pain can develop simply because of social cues,” said Dr Ryabinin. These findings have important implications for the treatment of patients with chronic pain. “We cannot dismiss people with chronic pain if they have no physical pathology. They can be in pain without the pathology and need to be treated for their pain despite lack of injury.”

When a flock of geese fly into the air and a hunter takes aim, which bird is most likely to drop from the sky? A new study published in the journal Biology Letters shows that those birds with larger brains relative to their body size are less likely to be shot by hunters.

PhysOrg reports:

The researchers found that those birds with smaller brains (relative to the size of their bodies) were more likely to be shot and catalogued, as were males and larger birds in general. The team looked at a variety of factors, such as organ size, body mass, gender, species, and color, and found one that stood out very clearly from the rest: birds with larger brains were 30 times less likely to be shot and killed. This, the team suggests, indicates that hunting is very likely having an evolutionary impact on animals that are hunted by humans. They do not believe that hunters are specifically targeting smaller species; it is more likely that those with larger brains have learned to be wary of humans.

Brain size is of course not the only possible factor for which bird ends up on a hunter’s dinner table. But the ability to distinguish danger with more clarity than your compatriots certainly helps, and the researchers point out that brain size might be part of that ability.
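As a back-of-the-envelope illustration of what “30 times less likely” means, a relative risk can be read straight off shot-versus-survived counts in two brain-size groups. The tallies below are invented so that the ratio comes out to 30; they are not the study’s data.

```python
# Hypothetical tallies (NOT the study's data), chosen so the ratio is 30.
small_brain = {"shot": 60, "survived": 940}   # relatively small-brained birds
large_brain = {"shot": 2, "survived": 998}    # relatively large-brained birds

def shot_risk(group):
    """Fraction of the group that ended up shot."""
    return group["shot"] / (group["shot"] + group["survived"])

ratio = shot_risk(small_brain) / shot_risk(large_brain)
print(f"relative risk of being shot: {ratio:.0f}x")
```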

More than 200,000 U.S. soldiers serving in the Middle East have experienced a blast-related traumatic brain injury, making it a common health problem and concern for that population.

Traumatic brain injury (TBI) can have various harmful long-term neurological effects, including problems with vision, coordination, memory, mood, and thinking. According to the Centers for Disease Control and Prevention, TBI from a head injury is a leading cause of death and disability in the United States, and close to 5 million Americans—soldiers and non-soldiers alike—are currently living with a TBI-related disability. Current therapy for these patients involves supportive care and rehabilitation, but no treatments are available that can prevent the development of chronic neurological symptoms.

Researchers from the University of Iowa believe they may have identified a potential approach for preventing the development of neurological problems associated with TBI. Their research in mice suggests that protecting axons—the fiber-like projections that connect brain cells—prevents the long-term neuropsychiatric problems caused by blast-related traumatic brain injury.

In a recent study, the UI team led by Andrew Pieper, professor of psychiatry at the UI Carver College of Medicine, investigated whether early damage to axons—an event that is strongly associated with many forms of brain injury, including blast-related TBI—is simply a consequence of the injury or whether it is a driving cause of the subsequent neurological and psychiatric symptoms.

To answer that question, the researchers used mice with a genetic mutation that protects axons from some forms of damage. The mutation works by maintaining normal levels of an important energy metabolite known as nicotinamide adenine dinucleotide (NAD) in brain cells after injury.

When mice with the mutation experienced blast-mediated TBI, their axons were protected from damage, and they did not develop the vision problems or the thinking and movement difficulties seen in mice without the mutation that experienced blast-related TBI. The findings were published Oct. 11 in the online journal eNeuro.

“Our work strongly suggests that early axonal injury appears to be a critical driver of neurobehavioral complications after blast-TBI,” says Pieper, who also is a professor of neurology, radiation oncology, and a physician with the Iowa City Veterans Affairs Health Care System.

“Therefore, future therapeutic strategies targeted specifically at protecting or augmenting the health of axons may provide a uniquely beneficial approach for preventing these patients from developing neurologic symptoms after blast exposure.”

In confirming the critical relationship between axon degeneration and the development of subsequent neurological complications, the new study builds on previous work from Pieper’s lab. The researchers also have discovered a series of neuroprotective compounds that appear to help axons survive the kind of early damage seen in TBI. These compounds activate a molecular pathway that preserves neuronal levels of NAD, the energy metabolite that has been shown to be critical to the health of axons. Pieper’s team previously demonstrated that these neuroprotective compounds block axonal degeneration and protect mice from harmful neurological effects of blast-TBI, even when the compounds are given 24 to 36 hours after the blast injury.

In addition to Pieper, the research team included Terry Yin, Jaymie Voorhees, Rachel Genova, Kevin Davis, Ashley Madison, Jeremiah Britt, Coral Cinton, Latisha McDaniel, and Matthew Harper. Pieper also is a member of the Pappajohn Biomedical Institute at the UI.