Official site of Aharon Hersh Fried Associates

Monthly Archives: April 2016

Altruistic behavior is often seen as a hallmark of a civilized person. Defined as a selfless concern for the well-being of others, or behavior that benefits others at one’s own expense, altruism was, for a very long time, viewed from two opposite perspectives.

Some would argue that altruism is an integral part of human nature, something that is written in our genes. Others would say that altruism is a product of civilizing influences that started to appear in human society with the development of culture and/or religion. The question appears to be mostly philosophical rather than scientific, and indeed it was mostly discussed and analyzed in philosophical and theological circles. Surprisingly, a more definitive answer to this question may come from neuroscience. Indeed, recent research findings provide convincing evidence that, to a certain degree, we are biologically programmed to be good to, and care for, each other.

Altruism is not an exclusive domain of human culture: animals are known to be altruistic too. Animals fearlessly defend their young, even when they know the offspring belong to other members of the species. Many researchers do not view parental behavior as real altruism, though. In a more convincing experiment, scientists gave electric shocks to a rat each time its neighbor ate food. The neighbor eventually stopped eating! Should we view this as an example of a higher level of intelligence and brain development? Neuroscience provides a remarkable answer to this question.

Altruism and charity

In their seminal work published in 2006, Moll and co-authors used functional MRI to investigate human brain activity while participants made decisions about charitable donations with real money. Anonymous charitable giving is universally seen as an example of pure altruism, since individuals donating money can hardly expect any benefits, favors or financial gains in return.

The main aim of this experiment was to map the neural pathways involved in decisions based on self-interest versus altruistic behavior. The participants were provided with a list of charitable organizations and their mission statements, and were asked to donate a small amount of money to organizations of their own choice while the scientists recorded their brain activity. The experiment also involved an unusual feature: any money not donated would be given to the participant as a personal monetary reward. Thus, decisions to donate to, or to oppose, a cause came into direct conflict with self-interest.

Most participants consistently made costly decisions, donating on average 40% of their money. Participants also took longer to make costly decisions than non-costly ones, suggesting that such decisions engage moral emotions in judgment. Different regions of the brain were active depending on whether the participant’s decision was self-interested or selfless.
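The incentive structure behind these costly decisions can be sketched in a few lines of code. This is a purely illustrative model of the trade-off; the function name and amounts are invented, not taken from the study:

```python
# Hypothetical sketch of the payoff structure in a donation task of this
# kind: any money not given away stays with the participant, so a "costly"
# donation trades personal reward for the cause's gain. Names and amounts
# are invented for illustration.

def trial_payoff(endowment, decision, cost):
    """Return (participant_reward, charity_gain) for one trial."""
    if decision == "donate":
        return endowment - cost, cost  # costly: participant forgoes `cost`
    return endowment, 0                # keep: full endowment becomes reward

# Donating 4 units of a 10-unit endowment leaves the participant with 6:
assert trial_payoff(10, "donate", 4) == (6, 4)
# Keeping everything maximizes the personal reward:
assert trial_payoff(10, "keep", 4) == (10, 0)
```

In the study, participants chose the costly branch often enough to give away, on average, 40% of their money.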

The midbrain ventral tegmental area (VTA) and the dorsal and ventral striatum were activated by both pure monetary rewards and decisions to donate. Donating to social causes activated the VTA–striatum mesolimbic network. This suggests that both donation to societal causes and money earning engage the same anatomical system of reward reinforcement and expectancy.

The subgenual area (Brodmann’s area 25) was highly specific for decisions involving donations. This area plays an important role in social attachment. Unlike the midbrain VTA, it was activated in situations where monetary rewards were not expected. The ventral striatum (with the adjoining septal region) was more active for donations than for pure monetary rewards, and the anterior prefrontal cortex was involved in decisions purely concerning the benefit of others.

Leaving this anatomical description aside, what do these findings demonstrate? The areas of the brain that light up during altruistic donations are the same ancient parts of the brain that are activated in response to food, sex and material gains. The results suggest that altruistic behavioral traits are hard-wired in the brain, and that acting on them is even pleasurable.

Altruism in the pre-frontal cortex

There are many areas in the brain that are responsible for decision making and reasoning. They include the amygdala, somatosensory cortex, anterior insula and prefrontal cortex. The combined effect of processes happening in these areas influences our altruistic behavior. Some areas are more important for decision making, while others are involved in empathy, the capacity to feel the pain and emotions of others. But the most important area of them all was shown to be the prefrontal cortex.

Recent experiments have shown that the prefrontal cortex is responsible for behavioral changes and impulse control. In one study, researchers aimed to find out whether certain areas of the prefrontal cortex might be involved in blocking altruistic impulses.

The study participants were subjected to a noninvasive procedure called theta-burst transcranial magnetic stimulation (TMS). This procedure temporarily dampens activity in specific regions of the brain, allowing researchers to observe what happens when a specific part of the brain is not active.

Participants in whom the dorsolateral prefrontal cortex was dampened tended to be generous to people with higher incomes, i.e. those least in need of handouts. Participants in whom the dorsomedial prefrontal cortex was dampened tended to be more generous towards everyone. The findings demonstrate once more that altruism really is encoded in our brain. By nature, we are very altruistic indeed.

Apart from addressing a deeply philosophical question about our nature and morality, neuroscience appears to suggest potential new avenues for increasing empathy. This could have far-reaching practical applications, particularly for treating people who have experienced desensitizing situations, such as war or imprisonment. I won’t be surprised if one day we have pills aimed at changing our character for the better.

Maslow, Rogers, Satir and Erickson are just some of the scholars who have shaped and will continue to shape a core psychological paradigm – humanism.

In this article, I elaborate on the optimal conditions necessary to become the best people we can possibly be. Some psychologists refer to this ultimate state as self-actualization; I call it the optimal phenotype. I will also expound on the core features of what it is like to become self-actualized, as described by Maslow.

Our genetic underpinnings are only part of who we are. Genes create possibilities; our internal and external experiences have more to do with the structure of our autobiography.

Getting our biopsychosocial needs consistently met over time is an important accomplishment. We need essential survival needs, along with the needs for safety and security, affiliation and affection, and recognition and approval, to be in place before moving towards self-actualization.

According to Erikson, achieving mastery at each of the nine psychosocial developmental stages is another significant achievement. We learn to trust others, show autonomy, demonstrate initiative, form an identity, realize intimacy with others, show generativity by serving others, and experience integrity and transcendence at the end of our days.

Those who eventually achieve self-actualization share four common qualities. First, they have several very close friendships, as opposed to many “friends” on various social media sites. Second, they are intrinsically motivated, as opposed to those who seek external rewards for their actions. Third, they experience clarity about who they are, and who they are not, in the world. Finally, they frequently experience periods of intense joy and contentment.

We all have an opportunity to be and do the best we can, and the end result is absolutely amazing!

The brain is a pliable organ. Its pliability makes it versatile, allowing the body to respond easily to the external environment while supporting a variety of cognitive, emotional and motor functions. But, over time, it ages like the rest of us.

A child’s brain is the most malleable. The brain, in fact, grows dramatically in size during the childhood years, as reflected in a child’s rapid gains in intelligence. Brain volume then begins to diminish at around age 30.

Much like an automobile’s, the human brain’s health and maximal efficiency depend on many factors. The most profound among them is aerobic activity, or to put it more simply, exercise. According to recent studies, exercise greatly affects brain aging by slowing cognitive decline. Findings like these could be revolutionary for neurodegenerative diseases such as Alzheimer’s disease.

How does aging affect the brain?

As I’ve already said, the brain begins to shrink by age 30. This happens gradually at first, but the pace increases by age 60 or 70.

The diminishing size is brought about by enlargement of the lateral ventricles and widening of the brain sulci, which may later present as neurological deficits in the form of degenerative diseases. Also noticeable is the atrophy of the limbic system, which is largely responsible for memory. The pace of atrophy, however, is slower in healthy older individuals than in individuals with Alzheimer’s disease. By the age of 70, a significant decrease in the number of neurons can be noted, particularly in the neocortex. Neuronal loss can also be seen in other parts of the nervous system, such as the locus ceruleus, the substantia nigra, and the nerve cells and myelinated fibers of the spinal cord. These losses are subtle, accumulating gradually decade by decade, but the process accelerates after age 60.

One study found that verbal intelligence begins to decline very gradually at around age 60. More profound age-related effects are seen in learning, memory and problem solving, which may be due to a slowing of information-processing speed. By age 70, the ability to process, memorize and acquire new information, as well as to focus and recall names, diminishes. However, although recalling names and specific dates becomes difficult, the memory of the experience itself is preserved.

While these observations are backed by solid data and years of research, it is worth noting that the effects of aging on mental abilities vary tremendously among individuals. Certain figures in history, such as Goethe, Picasso and Humboldt, continued to be productive and creative until late in their lives. However, these personalities mostly continued works that had been started earlier in their lives, and little or no new work was started in old age. In such cases, the effects of old age may have been compensated for by above-average intelligence, a disciplined work environment and particular personal lifestyles.

Motor and gait changes are the most obvious manifestations of aging. By age 30, agility begins to decrease, brought about by diminishing neuromuscular control; this can be seen especially in athletes, who usually retire by age 35. Urinary incontinence is also seen in the aging population. Declining gait makes falls more common in the elderly, often compounded by worsening visual acuity and vestibular function.

The inability to compensate for changing posture also increases the frequency of falls in the elderly, which often occur during day-to-day activities such as walking or going down stairs. Age-related neurological diseases, such as stroke and the proteinopathies seen in Alzheimer’s disease, can also cause both cognitive and motor impairments in the elderly.

How does exercise affect the brain?

Various studies indicate the beneficial effects of aerobic exercise on the brain. One study showed that greater physical activity in the elderly is associated with greater brain volume in the frontal, temporal and parietal lobes and in the hippocampus, decreasing the risk of Alzheimer’s-related dementia by 50%. Individuals who already have mild Alzheimer’s-related cognitive impairment also showed increased brain volumes. A significant improvement in cortical connectivity and activation was observed in one study using functional brain MRI. In line with this, physically fit elderly individuals performed better on cognitive testing than their less fit counterparts.

Exercise also increases levels of brain-derived neurotrophic factor (BDNF). BDNF is predominantly found in the brain, but its levels there cannot be assessed directly; circulating levels of BDNF in blood serum give an approximate measure. BDNF has been widely studied in animals, indicating its role in neuroplasticity and in protecting against neuronal atrophy. One study has shown that BDNF levels are decreased in individuals with Alzheimer’s dementia, while increased BDNF levels have been observed in young, healthy individuals after short-term vigorous and long-term endurance exercise. This suggests that younger individuals who are consistently active, through exercise or any form of sport, may have a significantly decreased risk of developing dementia in old age.

It is also important to note that certain lifestyles and genetic predispositions affect the normal aging process of the brain. One study noted that increased total serum cholesterol in late middle-aged adults accelerates brain processes related to normal aging. Along with other factors, this increases the risk of developing Alzheimer’s dementia. Exercise, therefore, is one way to minimize this risk.

Implications

Exercise can become a cost-effective prophylactic therapy for neurodegenerative diseases such as Alzheimer’s disease and other dementias. The type of exercise regimen does not matter as long as it suits the physical requirements and interests of the individual. Aside from its cognitive effects, exercise also benefits cardiovascular and skeletal health. In general, routine physical activity in both young and old individuals is an important factor in decreasing mortality rates and the public health burden. Nevertheless, more studies are needed before specific interventions aimed at slowing down aging can be recommended for the general population.

The concept of dressing smart is about to change. It doesn’t matter if you’re wearing trainers, a suit or a chimpanzee onesie. Any garment can actually be very smart if made with the right technology, that is to say, if it is “intelligent clothing”.

While smart wearables like wristbands and watches are already popular gadgets amongst athletes and fitness enthusiasts, smart clothes are still in the initial stages of production. However, there are already a few smart wearable electronic garments out there which can track our heart rate, body temperature, performance, muscle activity and breathing.

In the longer run, the potential uses of smart clothes are as varied as our needs, our creativity and our imagination. Japan, the US and Europe top the list of countries researching and developing these garments.

Smart clothes for sport and health

The sports and fitness industry is one of the most interested in creating smart garments. The variety of such garments is rapidly growing, with shirts that track sleep patterns, trousers that warm our legs before exercise, bras that support the breast when most needed and shirts that offer feedback during training. There have also been prototypes of smart clothes that monitor posture and movement in order to improve body posture or assist in rehabilitation after injury.

One very remarkable design, still in development, is a bra that could scan for breast cancer.

According to experts, hospitals, the military and rescue services have been amongst the sectors most interested in the development of such garments, but the industry will soon expand towards individual consumers.

Smart clothes for babies

AT&T and Exmovere are working on baby pyjamas fitted with biosensors that transmit critical vital signs such as heart rate, skin temperature, moisture and movement, in order to help prevent cot death.
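To make the idea concrete, here is a minimal sketch of how such a garment’s software might flag worrying readings. The field names and thresholds are purely illustrative assumptions, not details of the AT&T/Exmovere designs:

```python
# Hypothetical sketch of how a biosensor garment's software might flag
# worrying vital signs. Field names and thresholds are illustrative
# assumptions only, not details of any real product.

NORMAL_RANGES = {
    "heart_rate_bpm": (100, 160),  # rough newborn resting range (illustrative)
    "skin_temp_c": (36.0, 37.5),
}

def alerts(reading):
    """Return the names of vitals that fall outside their normal range."""
    flagged = []
    for name, (low, high) in NORMAL_RANGES.items():
        value = reading.get(name)
        if value is not None and not (low <= value <= high):
            flagged.append(name)
    return flagged

assert alerts({"heart_rate_bpm": 130, "skin_temp_c": 36.8}) == []
assert alerts({"heart_rate_bpm": 80, "skin_temp_c": 38.0}) == ["heart_rate_bpm", "skin_temp_c"]
```

A real garment would of course transmit these readings to a phone or monitoring service rather than evaluate them locally, but the range-checking logic would look much the same.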

I wouldn’t be surprised if the next wave includes a range of smart clothes for pregnant or breastfeeding mothers. Such garments could feasibly be used to keep track of a baby’s development inside the womb, or of how much milk your baby is getting from the breast.

Is it all positive?

Smart clothes may very well have an extremely positive impact on our health since they will be able to directly assist us in accurately tracking the state of our bodies. Nonetheless, there are other potential consequences which should be taken into account such as the potential impact on our privacy and mental wellbeing.

Using smart clothes will doubtless become a very popular trend amongst those interested in body image, fitness and health, and probably for a host of other reasons including numerous lifestyle niches. For some people smart clothes could also quite literally become life savers.

However, we might find that these garments manifest as yet another source of information glut in a society that’s already flooded with it. Getting undressed at night might come to mean taking off our connections, as well as our clothes.

Anorexia nervosa affects millions of people throughout the world. It has a high mortality rate, and the therapies currently available are largely ineffective: only 10–30% of adults with anorexia recover with psychotherapy, and pharmacological treatments have low efficacy. The need for better treatments is obvious and urgent.

Research has revealed a number of changes in the brains of patients with anorexia. These include both structural and functional deficits, such as the loss of grey matter in areas that play important roles in the regulation of feeding behavior, reward, emotion and motivation. It is believed that anorexia may be associated with a dysregulation of inhibitory and reward systems, which lays the ground for compulsive and obsessive behaviors to arise.

Reduced activity in the prefrontal cortex, an area of the brain responsible, among other functions, for goal-oriented behavior and decision making, has been found in patients with anorexia nervosa. This reduced activity manifests as poor inhibitory control, which may explain some of the symptoms of anorexia, namely binge eating, purging, compulsive behaviors such as body checking and exercising, and obsessions with eating and weight.

Repetitive transcranial magnetic stimulation (rTMS) is a brain stimulation technique that uses magnetic pulses to induce electrical currents and activate specific parts of the brain. The application of rTMS to specific areas of the prefrontal cortex has been successfully used in other psychiatric disorders, including addictive behavior and depression. rTMS has also been shown to effectively reduce food cravings and binge eating in patients with bulimia nervosa.

Given its efficacy in these disorders, rTMS might also be effective in anorexia. This was the rationale behind a new study published in PLoS ONE, which aimed to determine whether rTMS could be a helpful therapy for anorexia. Preliminary studies had indicated that rTMS could reduce the symptoms of anorexia, specifically anxiety, feeling full and feeling fat, both after a single session and after repeated treatment.

To test the efficacy of rTMS, 60 patients with anorexia underwent one session of rTMS; their feeding behavior and decision-making patterns were tested before and after the intervention. Participants watched videos of people eating appealing food while the same items were available to them, and then rated their urge to eat those foods. For the decision-making assessment, participants had to choose between smaller amounts of money available immediately and larger amounts available at later time points.
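The decision-making task is a classic delay-discounting paradigm. A common textbook way to model it is hyperbolic discounting, where a delayed reward’s subjective value is V = A / (1 + kD), with D the delay and k an impulsivity parameter. This sketch is only an illustration of that standard model; the value of k is invented, not a parameter from the study:

```python
# Textbook illustration of delay discounting: a delayed reward's subjective
# value falls as V = A / (1 + k*D), where D is the delay in days and k
# measures impulsivity. k = 0.05 is an invented example value.

def discounted_value(amount, delay_days, k=0.05):
    """Subjective value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

def choose(immediate, delayed, delay_days, k=0.05):
    """Pick whichever option has the higher subjective value."""
    if immediate >= discounted_value(delayed, delay_days, k):
        return "immediate"
    return "delayed"

# 30 units in 30 days are worth 30 / (1 + 0.05*30) = 12 now, so 20 now wins:
assert choose(20, 30, 30) == "immediate"
# 30 units tomorrow are worth about 28.6 now, so waiting wins:
assert choose(20, 30, 1) == "delayed"
```

In this model, a more impulsive chooser corresponds to a larger k; a reduction in impulsive choices after rTMS would show up as behavior consistent with a smaller k.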

The results were promising: one session of rTMS reduced the urge to avoid food intake, the levels of feeling full and feeling fat, and impulsive decisions. These findings demonstrate that this brain stimulation technique could reduce the symptoms of anorexia by improving cognitive control over the compulsive characteristics of the disease.

Further clinical studies are still needed before rTMS can be routinely applied to anorexia patients, but this is an important indication of the potential of neurostimulation techniques in psychiatric therapy. rTMS is a non-invasive, safe and well-tolerated therapeutic approach, and results such as these may encourage further research and therapeutic trials using this methodology.

Although the changes induced by rTMS were described by the authors as just “a trend”, they are important because they show that the symptoms and decision-making abilities associated with anorexia can be improved with just one session of rTMS. It is possible that multiple sessions of rTMS can have even better results and, hopefully, become a viable treatment option for anorexia.

The brain is the body’s most mysterious organ. It functions in ways that experts are still trying to figure out. As much as we know about this organ, there is still much left to discover and learn. For instance, did you know that the brain functions differently in winter than it does in summer?

Brain function and the seasons

The conclusion that the brain works differently in winter than it does in summer came as a result of a study conducted by Gilles Vandewalle and Christelle Meyer of the University of Liege in Belgium. The researchers who worked on the study inspected the cognitive brain functions of 28 Belgians during each season of the year.

During each season, participants spent about 4.5 hours in the lab where they didn’t have access to the external world or seasonal cues such as daylight. For the purpose of the study, researchers scanned participants’ brains while they performed tasks, the primary aim of which was to test their ability to sustain attention as well as to store, update and compare information in their memories.

The results of this study were published in Proceedings of the National Academy of Sciences and they showed that participants’ performance didn’t change regardless of the season. However, scientists discovered that the neural cost of performing these cognitive tasks (i.e. the amount of brain activity involved in the performance) changed with the season.

For instance, levels of brain activity associated with working memory peaked in the autumn and they were lower around the spring equinox. On the other hand, levels of brain activity that are linked with sustaining attention peaked in June – around the summer solstice – and they were at their lowest in December around the winter solstice.

Although alterations in brain activity across the seasons were evident, scientists aren’t sure what mechanisms drive them. It is assumed that the levels of certain neurotransmitters, like serotonin, and of brain proteins involved in learning vary with the seasons as well.

This is the first study to show that brain function differs depending on the season.

Seasonal changes and the body

The findings of this study present additional evidence supporting the claim that the human body functions differently in the summer and winter months. Previous studies showed that seasonal changes are associated with alterations in other processes related to daily functioning.

For example, a Nature Communications study conducted by Chris Wallace of the University of Cambridge showed that the activity of many genes, including those of the immune system, changes with the seasons. The same study also found a link between mood and seasonal changes; for instance, some people suffer from Seasonal Affective Disorder (SAD) during the cold winter months.

Conclusion

Various scientific studies have confirmed that our bodies function differently as the seasons change. However, the study discussed in this article was the first to inspect how the brain functions during different periods of the year. Its findings show that although people’s performance remains unchanged, the underlying brain activity differs, with attention-related activity peaking in June and reaching its lowest point in December. It is not yet certain, however, what mechanisms contribute to these changes.

The best news from March is that spring has sprung in the northern hemisphere! Spring is a great time to go outside and get active, which, as March showed us yet again, will only do you good.

Health and healthcare research also brought us good news in the form of new diagnostic tools and new therapies. But, as always, there is also bad news. Here’s a selection of the best and worst news I came across in March. Comments are welcome!

THE BEST

Aging delayed by exercise

Exercise can greatly benefit our brain’s health by delaying the onset of cognitive decline. A report published in Neurology in March showed just how powerful exercise can be. The authors determined the effect of leisure-time physical activity on cognitive performance and found that cognitive decline was significantly less accentuated in physically active subjects. Low levels of physical activity were associated with worse executive function, semantic memory and processing speed, with a difference equal to that of 10 years of aging.

This is a great reminder to stay active.

Exercise increases gray matter volume

Another study assessing the health effects of exercise showed that the beneficial effect of physical activity on the brain may be associated with an increase in gray matter volume.

The article, published in the Journal of Alzheimer’s Disease, showed that physical activity was associated with larger gray matter volumes in the frontal, temporal and parietal lobes, as well as in the hippocampus, thalamus and basal ganglia. High levels of physical activity were also shown to decrease the gray matter volume loss associated with neurodegeneration.

tDCS for stroke recovery

Rehabilitation of movement after stroke requires practice and time for brain changes to occur. A new study published in Science Translational Medicine tested whether transcranial direct current stimulation (tDCS) could improve movement in stroke patients.

The authors found that patients who received tDCS did indeed show a greater improvement in movement, and these benefits were maintained for several months after motor training. According to these findings, brain stimulation may be an effective alternative therapy to improve clinical outcomes in stroke patients. Furthermore, it shows yet another application of tDCS.

A simple test to detect concussions

The symptoms of a concussion or traumatic brain injury are not always obvious and may sometimes be identifiable only after a few days. Early detection allows more effective treatment and may prevent the development of long-term problems. The presence of two proteins, GFAP and UCH-L1, in the blood has been proposed as a possible marker of brain injury.

A new study published in JAMA Neurology aimed to determine the diagnostic accuracy of these proteins over time and their applicability in clinical practice. It showed that both GFAP and UCH-L1 were indeed detectable in the blood of trauma patients within 1 hour of injury, allowing the detection of mild to moderate traumatic brain injury and intracranial lesions. This could therefore become a very simple and accurate new diagnostic tool.

Voluntary imitation is preserved in Alzheimer’s patients

Cognitive impairment is the most notorious effect of Alzheimer’s disease. One of its consequences is an altered communication capacity in advanced stages of the disease. However, in mild to moderate stages, communication and social interaction abilities may still be preserved. Automatic imitation, an involuntary predisposition to copy observed actions, is one of the features that seems to be preserved. Voluntary imitation mechanisms requiring attention to another person’s actions, on the other hand, seem to be more easily affected by the disease.

Research published in Frontiers in Aging Neuroscience aimed to determine whether this skill was indeed affected in Alzheimer’s patients. It was shown that Alzheimer’s patients in mild and moderate stages of the disease maintained an intact ability to reproduce observed movements, particularly when these were performed by a human agent instead of a computer. These results show that the high-level cognitive processes required for voluntary imitation are preserved in mild and moderate stages of Alzheimer’s disease, and that they may be used in interpersonal communication.

THE WORST

Transient amnesia induced by fatigue

Working long hours and being sleep deprived can easily induce fatigue – this is quite common in healthcare professionals who have to go through long working shifts. Slower reaction times, decreased performance, and impaired judgment are among the most common consequences of fatigue.

But a new report published in the journal Cortex adds another: transient amnesia. The report describes several cases of healthcare professionals who attended to attention-demanding episodes of care, with their decisions recorded in writing at the time but completely forgotten some hours later. Although rare, these cases show that prolonged wakefulness associated with intensive, intellectually demanding work can lead to transient memory dysfunction. This may have serious consequences in many professional settings and should be taken into account when long working hours are required.

Environmental chemicals and neurological diseases

Environmental factors are often associated with the development of numerous diseases. Chemicals released into the environment, namely pesticides, can be particularly harmful. A study published in Nature Communications aimed at pinpointing possible links between neurological diseases and chemicals commonly found in our environment and food.

To do so, the authors exposed neuronal cell cultures to hundreds of pesticides, fungicides and other chemicals, and evaluated the molecular changes those chemicals induced in neurons. They found that rotenone, a pesticide already associated with Parkinson’s disease risk, and the fungicides pyraclostrobin, trifloxystrobin, famoxadone and fenamidone, by stimulating oxidation and inducing structural changes in neurons, led to pathological alterations resembling those observed in brain samples from humans with autism, advanced age, Alzheimer’s disease and Huntington’s disease. This study highlights the potential impact on our brain’s health of chemicals we may ingest.

Zika and brain damage

The outbreak of the Zika virus has been all over the news recently, mostly due to its association with a possible increased risk of congenital microcephaly.

In March, The New England Journal of Medicine published a report that further supports this association. It describes the case of a pregnant woman and her fetus who were infected with the Zika virus during the 11th gestational week. Over 4 weeks, between the 16th and the 20th week of gestation, the fetal head circumference fell from the 47th to the 24th percentile, revealing a decrease in the rate of brain growth. Around the 20th week of gestation, substantial brain abnormalities were found, and genetic material from the virus was still present in the mother’s serum. Postmortem analysis of the fetal brain showed thinning of the cerebral cortex and the presence of Zika viral particles and genetic material, supporting the virus’s role in inducing severe damage to the fetal brain.

Birth control pills increase the risk of seizures

Birth control pills can have some undesired side effects including, for example, an increase in blood pressure and in the risk of cardiovascular diseases. Studies have also shown that women with epilepsy who use oral contraceptives report a higher frequency of seizures.

Taking that into account, a new study published in Epilepsy Research investigated the effect of ethinyl estradiol, the primary component of birth control pills, on epileptic seizures in mice. Ethinyl estradiol increased both the frequency and the severity of epileptic seizures. This is an important finding that women should take into account when choosing oral contraceptives as a birth control method.

The long-lasting effects of amphetamine use in adolescence

Amphetamine use can induce a number of behavioral and neurochemical changes by interacting with neurotransmitter release in the brain. New experimental research in rats investigated whether amphetamine exposure could induce long-lasting changes in neurochemical transmission in the prefrontal cortex, a brain region that plays a key role in behavioral control.

According to the article published in Neuroscience, the brains of rats exposed to amphetamines, starting either in adolescence or in adulthood, showed altered dopamine signaling in the prefrontal cortex associated with functional changes. These changes were more persistent after exposure to amphetamine in adolescence. This indicates that amphetamine use during adolescence, when the brain is still developing, can have long-lasting detrimental consequences.

In this article I present a selection of publications that came out in March. There were many interesting developments, both in fundamental neuroscience and neurology, and in the practical aspects of dealing with brain-related diseases and disorders.

On March 20th, the scientific community marked the birthday of Erwin Neher, who received the 1991 Nobel Prize in Physiology or Medicine for discovering the function of single ion channels in cells. Together with Bert Sakmann, Neher developed the patch clamp technique, which enabled the recording of the current through a single ion channel for the first time. The work contributed substantially to the fundamental understanding of nerve activity.

THE BEST

Alzheimer’s-preventing implant?

This idea has the potential to revolutionize the treatment of Alzheimer’s disease, and the proof of concept was just published this month.

In animal experiments, researchers implanted a capsule containing specifically modified cells under the skin of mice. The cells produce antibodies against amyloid-beta, a protein known to over-accumulate in the brains of patients with Alzheimer’s disease and eventually cause neurodegeneration. The capsule gradually releases the antibodies and thus successfully prevents the formation of amyloid-beta plaques. The development of a similar device suitable for human treatment may pave the way to significantly reducing the burden of Alzheimer’s disease and similar neurodegenerative conditions.

Pain relief due to meditation is opioid-free

Pain is a natural reaction to bodily harm that warns us of potential or existing damage. What eventually stops us from feeling the pain is the internal production of natural opioids. Cognitive approaches to reducing pain, such as distraction, acupuncture, hypnosis and even placebo, all work through this opioid-based mechanism.

When researchers attempted to find out whether the same is true for meditation, the result came as a surprise. In experiments where scientists blocked the body’s opioid receptors with a drug targeting them, meditation was still capable of providing pain relief. The findings point to the existence of yet another molecular mechanism of pain relief, which needs to be further investigated. Such studies might help in developing new painkillers, which are especially needed for people with chronic pain who have developed resistance to opiate-based drugs.

Exercise significantly slows down brain aging

The benefits of exercise are well known and widely publicized. New research data have convincingly demonstrated yet another advantage of being physically active: exercise slows down the aging of the brain. Data from the long-term Northern Manhattan Study show that physically active elderly individuals perform much better in cognitive tests than their sedentary counterparts. In fact, the difference in test results corresponds to about 10 years of difference in brain age!

Structure of Parkinson’s protein finally characterized

One of the reasons Parkinson’s disease is still so poorly manageable is the lack of well-studied, suitable molecular targets. The protein alpha-synuclein is the major culprit in the development and progression of this condition: it forms insoluble fibrils that disrupt the activity of brain cells. Unfortunately, due to the complexity of alpha-synuclein fibrils, their molecular structure has been poorly characterized.

This gap in knowledge was filled by a report published this month that outlines the high-resolution molecular details of alpha-synuclein deposits. The findings will help in the identification of suitable pharmaceutical targets and, eventually, in developing drugs directed at them.

Bacteria from the GI tract can reduce the severity of stroke

We know that the brain and the gastrointestinal (GI) tract “talk” – this is particularly obvious when it comes to the regulation of appetite and body weight. But it appears that the relationship might be much more complex than previously thought. Certain bacteria residing in the GI tract may modify our immune system in such a way that it can decrease the severity of stroke.

In a recently published study, researchers demonstrated that the severity of induced ischemic stroke in mice treated with antibiotics was reduced by 60%. Antibiotics can change the balance of different bacterial species in the GI tract. The bacteria, in turn, can modify the cells of the immune system that assemble on the meninges, the outer covering of the brain. These cells modify and direct the response to stroke. The discovery opens the way to reducing the severity of potential strokes in vulnerable patients through pharmaceutical or dietary interventions.

THE WORST

Western diet might facilitate development of Alzheimer’s disease

The classical Western diet, rich in fat, sugars and animal products, is known to be unhealthy and linked to obesity and associated chronic conditions, such as cardiovascular problems, strokes and some cancers.

New findings suggest that this diet can also increase the chances of developing Alzheimer’s disease at an older age. At least this is what was demonstrated in experiments on laboratory animals. Mice kept on a “Western diet” chow for ten months demonstrated a dramatic increase in the activity of microglia and monocytes in the brain. Both types of cells function as components of the brain’s immune system, and their increased activity is known to elevate susceptibility to Alzheimer’s disease.

Higher BMI linked to poorer memory

The findings above correlate well with a recent report from the University of Cambridge showing that excess body weight is associated with poorer memory. Higher BMI values were shown to correlate with poorer performance in tests of forming and retrieving episodic memories. This effect appears to be linked to previously identified structural and functional changes in the brains of obese people. These changes were found in the hippocampus, a key area of the brain responsible for memory and learning, and in the frontal lobe, which is involved in problem solving and decision making.

Common contraceptive increases the risk of seizures

A commonly used contraceptive drug, ethinyl estradiol, has been shown to increase the frequency of seizures in women suffering from epilepsy. Women with epilepsy taking this drug experience seizures 4.5 times more often. In addition, the seizures last longer, exposing the brain to potentially permanent damage.

Dopamine receptor agonists do not help in schizophrenia

At present, we have no effective treatments for the negative and cognitive symptoms associated with schizophrenia. Previous findings provided hope that treatment with low-dose dopamine-1 receptor (D1R) agonists might address this problem. In a study published this month, researchers used functional MRI to evaluate the effects of a D1R agonist on brain activity during a working memory task. Disappointingly, no treatment effect was observed.

Prolonged stress is damaging for memory

Extended periods of stress have been shown to damage memory. In a study published this month, researchers induced sustained stress in mice by repeatedly exposing them to an intruding aggressive alpha male. This is the kind of situation often encountered in human society when people are bullied by peers or an abusive boss at work. Chronically stressed mice demonstrated worsened performance in cognitive tests: for example, they could no longer navigate a previously familiar maze quickly enough. Importantly, the erosion of memory was linked to an increased level of inflammation in the brain, caused primarily by an increased level of macrophages.

These findings can potentially redirect research on treating chronic stress towards its immune components.

This is a very interesting and timely question because, as the number of brain training interventions grows, we realize that some approaches do in fact work, while others do not. What is the difference between them? What determines whether or not a brain training method will work? These were the main discussion topics of this session.

As Alvaro Fernandez, CEO and Co-Founder of SharpBrains, highlighted, one thing needs to be clarified before entering this discussion: how do we define whether a brain training intervention is “working”? As Fernandez stated, what consumers of brain training interventions are mainly looking for is improved performance, not treatment of disease. Because we are living longer, we are faced with the need to delay cognitive decline as much as possible. And, in fact, from a healthcare perspective, the best way to prevent symptoms of aging, cognitive decline, neurodegeneration, or even depression is to improve performance in a sustained manner. Better performance will delay decline and, consequently, delay the onset of symptoms. So, we can say that a brain training intervention works when it improves cognitive performance in a sustained manner.

Another aspect that is important to understand is what kind of interventions are useful. Optimal brain health is the result of a multitude of factors; it is influenced by nutrition, physical exercise, mental stimulation, among others. When any of these elements is defective, the brain starts to lose power. What each person has to do is to figure out what needs to be worked on.

What is brain training?

Brain training is also a factor that may directly influence brain health. Brain training is more than just mental activity; it is mental exercise. Alvaro Fernandez used a great example that illustrates the difference between mental activity and mental exercise: cab drivers vs. bus drivers. Cab drivers constantly need to figure out the best routes to get from one point to another, as well as to know where each point is located. Doing so acts as a type of mental exercise. Bus drivers, on the other hand, usually follow well-known, repetitive routes; although there is constant mental activity, there is little novelty and variety, and thus little challenge. This difference is reflected in the fact that cab drivers’ memory and orientation skills tend to improve over time.

This serves as an answer to the initial question – brain training works when it offers mental exercise, when it provides novelty, variety, and challenge to our brain.

Mental exercise is pretty much like physical exercise – we should train as many “muscles” as possible. We can call it cross-training – a training of different types of intelligence and capacities, such as emotional, executive or perceptual skills.

Professor Bruce Wexler, another speaker at the SharpBrains Virtual Summit session, added that for brain training to work, certain requirements must be met. First, a neurocognitive target for the specific intervention needs to be clearly determined, since defining a clear target will most likely lead to good results. In the context of disease, the pathological processes should be well understood; in the context of neuroenhancement, there must be consistent targeting of a neuronal process. Furthermore, an effective training program should be individualized, focused on each person’s needs; it should also be multi-dimensional and performed systematically.

Brain stimulation is on the rise

One of the most popular approaches to brain training is transcranial direct current stimulation (tDCS), a cheap and safe technique which has been increasingly used to enhance brain functions. But does tDCS make the brain work better? Can it be used to enhance cognitive states?

According to Professor Roy Hamilton, also a speaker in this session, depending on how you stimulate the brain, you may get different effects.

There have been studies using tDCS to enhance cognitive skills, with many interesting results. When it was used to enhance motor learning, for example, an accelerated rate of skill acquisition was observed in individuals receiving tDCS, and those effects were persistent, with performance remaining improved over time. Mental flexibility can also be improved, namely impulse control, cognitive control and even creativity.

Studies on working memory enhancement also showed that subjects receiving tDCS can improve their performance in working memory tasks such as committing number sequences to memory. Interestingly, this enhancement is greater when tasks are more difficult. This is actually a key point, since it shows that the nature of the training is relevant to its effects.

This brings us back to the initial question of what conditions determine whether brain stimulation will work. An important factor may be the type of training you’re engaged in while receiving brain stimulation. The demands of the specific task at hand seem to make a difference in whether or not there are relevant effects.

Another important factor is the baseline cognitive state of the subject. When initial skills are low, there is greater room for improvement, and a greater improvement is in fact observed. But when skills are already good to begin with, brain stimulation can potentially be counterproductive. In some cases, there is also the possibility that the enhancement of one aspect of cognition may lead to a decrease in another, acting as a kind of tradeoff between mental skills.

Although brain stimulation can be effective, there is still a lot to be learned. An important question that remains to be answered is how these interventions affect the brain’s chemistry and the production of neurotransmitters. There are complex interactions that need to be understood before the full potential of brain stimulation and brain training techniques can be realized.

Many older adults take proton pump inhibitors (PPIs) to treat gastrointestinal diseases. And, many older adults have dementia. Recently, a study in JAMA Neurology linked these two common features of the elderly, but questions remain about the validity of the results and about the real risk of PPI use.

The authors of the current study examined eight years of German data and evaluated associations between PPI use and new diagnoses of dementia. They report that the nearly 3,000 people who were older than 75 and regularly used PPIs had a significantly increased risk of developing dementia compared to people who did not regularly use PPIs. (The same group published a similar report of a longitudinal multicenter study in the European Archives of Psychiatry and Clinical Neuroscience; both studies reported an approximately 1.4-fold increased risk of developing dementia with regular PPI use.) The authors go so far as to posit that avoiding PPIs may prevent dementia.

This observed association between PPIs and dementia is supported by previous pharmacoepidemiological studies and mouse models that indicate that PPI use alters the development and metabolism of beta-amyloid plaques, a key factor in Alzheimer’s, a specific type of dementia. Still, many other factors contribute to the development of dementia, and the authors of the current study adjusted the estimated risks of dementia for several confounding factors: age, sex, other medications, history of stroke, depression, diabetes, and ischemic heart disease. But, they did not adjust for other well-known risks such as family history of dementia, hypertension, and alcohol use. Further, they did not address length of PPI use or dose in the association with dementia.

Data are unclear, and the associations are far from conclusive evidence of cause and effect. In fact, a recent case-control study (also conducted in Germany) reported that PPIs were associated with a decreased risk of dementia. And several reports of medication use in older adults add to the “which comes first” conundrum of PPIs and dementia: these reports claim that dementia is associated with an increased risk of inappropriate PPI use.

By 2040, nearly 100 million people worldwide are expected to suffer from dementia. So, an increased understanding of the risks associated with this condition is warranted. However, the current study leaves too many questions unanswered to claim that PPI use should be avoided in the elderly.

Still, the findings may serve as a reminder that many medicines are prescribed incorrectly or inappropriately – especially in the elderly – and health care providers should be diligent to ensure that all medications are used only for the correct indications and at the correct doses, while minimizing risks associated with all medication use.