As Earth rotates around its axis, the organisms that inhabit its surface are exposed to daily cycles of darkness and light. In animals, light has a powerful influence on sleep, hormone release, and metabolism. Work by Takaomi Sakai, a neuroscientist at Tokyo Metropolitan University, and his team suggests that light may also be crucial for forming and maintaining long-term memories.

The puzzle of how memories persist in the brain has long been of interest to Sakai. Researchers had previously demonstrated, in both rodents and flies, that the production of new proteins is necessary for maintaining long-term memories, but Sakai wondered how this process persisted over several days given cells’ molecular turnover. Maybe, he thought, an environmental stimulus, such as the light-dark cycle, periodically triggered protein production to enable memory formation and storage.

Sakai and his colleagues conducted a series of experiments to see how constant darkness would affect the ability of Drosophila melanogaster to form long-term memories. Male flies exposed to light after interacting with an unreceptive female showed reduced courtship behaviors toward new female mates several days later, indicating they had remembered the initial rejection. Flies kept in constant darkness, however, continued their attempts to copulate.

The team then probed the molecular mechanisms of these behaviors and discovered a pathway by which light activates cAMP response element-binding protein (CREB)—a transcription factor previously identified as important for forming long-term memories—within certain neurons found in the mushroom bodies, the memory center in fly brains.

“The fact that light is essential for long-term memory maintenance is fundamentally interesting,” says Seth Tomchik, a neuroscientist at the Scripps Research Institute in Florida who wasn’t involved in the study. However, he adds, “more work will be necessary” to fully characterize the molecular mechanisms underlying these effects.

The idea of a machine that can decode your thoughts might sound creepy, but for thousands of people who have lost the ability to speak due to disease or disability it could be game-changing. Even for the able-bodied, being able to type out an email by just thinking or sending commands to your digital assistant telepathically could be hugely useful.

That vision may have come a step closer after researchers at the University of California, San Francisco demonstrated that they could translate brain signals into complete sentences with word error rates as low as three percent, better than the error rates typical of professional speech transcription.

While we’ve been able to decode elements of speech from brain signals for around a decade, most solutions so far have fallen well short of consistently translating intelligible sentences. Last year, researchers achieved some of the best results yet with a novel approach that used brain signals to animate a simulated vocal tract, but only 70 percent of the words were intelligible.

The key to the improved performance achieved by the authors of the new paper in Nature Neuroscience was their realization that there were strong parallels between translating brain signals to text and machine translation between languages using neural networks, which is now highly accurate for many languages.

While most efforts to decode brain signals have focused on identifying neural activity that corresponds to particular phonemes—the distinct chunks of sound that make up words—the researchers decided to mimic machine translation, where the entire sentence is translated at once. This has proven a powerful approach; as certain words are always more likely to appear close together, the system can rely on context to fill in any gaps.
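How context fills in gaps can be sketched with a toy greedy decoder. Everything below is invented for illustration (the words, the signal scores, and the bigram probabilities); the actual system uses recurrent neural networks rather than an explicit word-pair table:

```python
# Toy sketch: per-word signal scores are ambiguous on their own, but a
# bigram prior over which words tend to follow each other can break ties.
# All scores and probabilities here are made up for demonstration.

def decode_sentence(candidates, bigram_prob, start="<s>"):
    """Greedily pick, at each position, the candidate word that maximizes
    (signal score) * (probability of following the previous word)."""
    sentence, prev = [], start
    for scored_words in candidates:  # [(word, signal_score), ...] per position
        word, _ = max(scored_words,
                      key=lambda ws: ws[1] * bigram_prob.get((prev, ws[0]), 0.01))
        sentence.append(word)
        prev = word
    return " ".join(sentence)

# Hypothetical noisy predictions: the second word is a dead tie on the
# signal alone, so only the surrounding context can resolve it.
candidates = [
    [("good", 0.9), ("could", 0.8)],
    [("night", 0.5), ("knife", 0.5)],
]
bigram_prob = {("<s>", "good"): 0.6, ("good", "night"): 0.5,
               ("good", "knife"): 0.001}

print(decode_sentence(candidates, bigram_prob))  # -> good night
```

A real decoder would search over whole sentences (for example with beam search) rather than committing to one word at a time, but the principle is the same: co-occurrence statistics disambiguate what the signal alone cannot.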

The team used the same encoder-decoder approach commonly used for machine translation, in which one neural network analyzes the input signal—normally text, but in this case brain signals—to create a representation of the data, and then a second neural network translates this into the target language.

They trained their system on brain activity recorded from four women as they read out a set of 50 sentences containing 250 unique words; the women had electrodes implanted in their brains to monitor seizures. This allowed the first network to work out which neural activity correlated with which elements of speech.

In testing, the system relied only on the neural signals and achieved error rates below eight percent for two of the four subjects, matching the accuracy of professional transcribers.
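Transcription error rates like these are conventionally measured as word error rate (WER): the word-level edit distance between the hypothesis and the reference, divided by the reference length. A minimal implementation:

```python
def word_error_rate(reference, hypothesis):
    """WER: minimum number of word substitutions, insertions, and deletions
    needed to turn the hypothesis into the reference, divided by the number
    of words in the reference."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between first i reference and first j hypothesis words
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # delete a reference word
                           dp[i][j - 1] + 1,         # insert a stray word
                           dp[i - 1][j - 1] + cost)  # substitute (or match)
    return dp[len(r)][len(h)] / len(r)

print(word_error_rate("the bird was singing", "the birds were singing"))  # 0.5
```

Two substitutions across a four-word reference gives a WER of 50 percent; a professional transcriber typically stays around five percent.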

Inevitably, there are caveats. Firstly, the system was only able to decode 30–50 specific sentences using a limited vocabulary of 250 words. It also requires people to have electrodes implanted in their brains, which is currently only permitted for a limited number of highly specific medical reasons. However, there are a number of signs that this direction holds considerable promise.

One concern was that because the system was being tested on sentences that were included in its training data, it might simply be learning to match specific sentences to specific neural signatures. That would suggest it wasn’t really learning the constituent parts of speech, which would make it harder to generalize to unfamiliar sentences.

But when the researchers added recordings to the training data that were not used in testing, error rates fell significantly, suggesting that the system is learning sub-sentence information such as words.

They also found that pre-training the system on data from the volunteer who achieved the highest accuracy, before training it on data from one of the worst performers, significantly reduced error rates. This suggests that in practical applications much of the training could be done before the system reaches the end user, who would only have to fine-tune it to the quirks of their own brain signals.

The vocabulary of such a system is likely to improve considerably as people build upon this approach—but even a limited palette of 250 words could be incredibly useful to someone unable to speak, and could likely be tailored to a specific set of commands for telepathic control of other devices.

Now the ball is back in the court of the scrum of companies racing to develop the first practical neural interfaces.

Dogs pay much closer attention to what humans say than we realised, even to words that are probably meaningless to them.

Holly Root-Gutteridge at the University of Sussex, UK, and her colleagues played audio recordings of people saying six words to 70 pet dogs of various breeds. The dogs had never heard these voices before and the words only differed by their vowels, such as “had”, “hid” and “who’d”.

Each recording was altered so the voices were at the same pitch, ensuring that the only cue the dogs had was the difference between vowels, rather than how people said the words.

After hearing the recordings just once, 48 of the dogs reacted when either the same speaker said a new word or the same word was said by a different speaker. The remainder either didn’t visibly respond or got distracted.

The team based its assessment of the dogs’ reactions on how long they paid attention when the voice or word changed – if the dogs moved their ears or shifted eye contact, for example, it showed that they noticed the change. In contrast, when the dogs heard the same word repeated several times, their attention waned.

Until now, it was thought that only humans could detect vowels in words and realise that these sounds stay the same across different speakers. But the dogs could do both spontaneously without any previous training.

“I was surprised by how well some of the dogs responded to unfamiliar voices,” says Root-Gutteridge. “It might mean that they comprehend more than we give them credit for.”

This ability may be the result of domestication, says Root-Gutteridge, as dogs that pay closer attention to human sounds are more likely to have been chosen for breeding.

The work highlights the strength of social interactions between humans and dogs, says Britta Osthaus at Canterbury Christ Church University, UK. “It would be interesting to see whether a well-trained dog would react differently to the command of ‘sat’ instead of ‘sit’,” she says.

Preliminary findings from a clinical trial of heavy drinkers suggest that the drug can weaken certain memories tied to the reward of imbibing, although the mechanisms aren’t fully clear.

By Catherine Offord

The anesthetic drug ketamine could be used to rewire heavy drinkers’ memories and help them cut down on alcohol consumption, according to a study published yesterday (November 26) in Nature Communications. In a clinical trial of people who reported consuming around 590 grams of alcohol—equivalent to nearly two cases of beer—per week on average, researchers found that a procedure that involved administering the drug while people were thinking about drinking durably reduced consumption.
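As a rough sanity check on that comparison (the bottle size and strength below are assumptions, not figures from the paper):

```python
# Back-of-the-envelope check of "590 grams of alcohol ~ two cases of beer".
# Assumed: a 330 ml bottle at 5% ABV; ethanol density is about 0.789 g/ml.
ETHANOL_DENSITY_G_PER_ML = 0.789
bottle_ml, abv = 330, 0.05

grams_per_beer = bottle_ml * abv * ETHANOL_DENSITY_G_PER_ML  # ~13 g
beers_per_week = 590 / grams_per_beer                        # ~45 bottles

print(f"{grams_per_beer:.1f} g per bottle, {beers_per_week:.0f} bottles/week")
```

At roughly 13 grams of pure alcohol per bottle, 590 grams a week works out to about 45 beers, which is indeed close to two 24-bottle cases.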

While it’s not clear how the method works at a neurological level, the study represents “a really exciting development,” Amy Milton, a behavioral neuroscientist at the University of Cambridge who was not involved in the work, tells STAT. She adds that the findings mark “the first time it’s been shown in a clinical population that this can be effective.”

The study was designed to manipulate the brain’s retrieval and stabilization of memories—in this case, those linking the sight and thoughts of alcohol to the reward of drinking it, study coauthor Ravi Das, a psychopharmacologist at University College London, tells Science News. “We’re trying to break down those memories to stop that process from happening.”

To do that, the team asked 30 of the participants to look at a glass of beer, followed by a sequence of images of alcoholic and non-alcoholic drinks. On the first day of tests, the session ended with participants being invited to drink the beer. On the second day, after viewing the beer and images, the screen cut off, and instead of drinking the beer, participants were given a shot of ketamine.

Among its various effects, ketamine blocks NMDA receptors—glutamate receptors thought to be needed to restabilize memories after they are retrieved—so the researchers hypothesized that administering the drug during memory retrieval would help weaken participants’ associations between the sight or contemplation of alcohol and the reward of drinking it. Their results somewhat support that hypothesis. Nine months following the several-day trial, the volunteers reported cutting their drinking back by half.

“To actually get changes in [participants’] behavior when they go home and they’re not in the lab is a big deal,” Mary Torregrossa, a neuroscientist at the University of Pittsburgh who was not involved in the work, tells Science. But she notes that it’s not clear whether it was the ketamine or some other part of the procedure that led to the effect.

Another 60 participants, split into two control groups, received slightly different procedures that involved either the beer cues or the ketamine, but not the two paired together; these groups still showed, on average, a 35 percent decrease in alcohol consumption after nine months. The participants were recruited through online ads—meaning that the researchers may have selected for people already interested in reducing their consumption.

Whatever the mechanisms behind the effect, the results so far suggest the method is worth investigating, David Epstein, an addiction researcher at the National Institute on Drug Abuse, tells Science News. “If a seemingly small one-time experience in a lab produces any effects that are detectable later in real life, the data are probably pointing toward something important.”

Catherine Offord is an associate editor at The Scientist. Email her at cofford@the-scientist.com.

Boosting brain function is key to staving off the effects of aging. And if there is one thing every person should consider doing right now to keep their brain young, it is to add extra virgin olive oil (EVOO) to their diet, according to research by scientists at the Lewis Katz School of Medicine at Temple University (LKSOM). EVOO is a superfood, rich in cell-protecting antioxidants and known for its multiple health benefits, including helping put the brakes on diseases linked to aging, most notably cardiovascular disease. Previous LKSOM research on mice also showed that EVOO preserves memory and protects the brain against Alzheimer’s disease.

In a new study in mice published online in the journal Aging Cell, LKSOM scientists show that yet another group of aging-related diseases can be added to that list—tauopathies, which are characterized by the gradual buildup of an abnormal form of a protein called tau in the brain. This process leads to a decline in mental function, or dementia. The findings are the first to suggest that EVOO can defend against a specific type of mental decline linked to tauopathy known as frontotemporal dementia.

Alzheimer’s disease is itself one form of dementia. It primarily affects the hippocampus—the memory storage center in the brain. Frontotemporal dementia affects the areas of the brain near the forehead and ears. Symptoms typically emerge between ages 40 and 65 and include changes in personality and behavior, difficulties with language and writing, and eventual deterioration of memory and ability to learn from prior experience.

Senior investigator Domenico Praticò, MD, Scott Richards North Star Foundation Chair for Alzheimer’s Research, Professor in the Departments of Pharmacology and Microbiology, and Director of the Alzheimer’s Center at Temple at LKSOM, describes the new work as supplying another piece in the story about EVOO’s ability to ward off cognitive decline and to protect the junctions where neurons come together to exchange information, which are known as synapses.

“EVOO has been a part of the human diet for a very long time and has many benefits for health, for reasons that we do not yet fully understand,” he said. “The realization that EVOO can protect the brain against different forms of dementia gives us an opportunity to learn more about the mechanisms through which it acts to support brain health.”

In previous work using a mouse model in which animals were destined to develop Alzheimer’s disease, Dr. Praticò’s team showed that EVOO supplied in the diet protected young mice from memory and learning impairment as they aged. Most notably, when the researchers looked at brain tissue from mice fed EVOO, they did not see features typical of cognitive decline, particularly amyloid plaques—sticky proteins that gum up communication pathways between neurons in the brain. Rather, the animals’ brains looked normal.

The team’s new study shows that the same is true in the case of mice engineered to develop tauopathy. In these mice, normal tau protein turns defective and accumulates in the brain, forming harmful tau deposits, also called tangles. Tau deposits, similar to amyloid plaques in Alzheimer’s disease, block neuron communication and thereby impair thinking and memory, resulting in frontotemporal dementia.

Tau mice were put on a diet supplemented with EVOO at a young age, comparable to about age 30 or 40 in humans. Six months later, when mice were the equivalent of age 60 in humans, tauopathy-prone animals experienced a 60 percent reduction in damaging tau deposits, compared to littermates that were not fed EVOO. Animals on the EVOO diet also performed better on memory and learning tests than animals deprived of EVOO.

When Dr. Praticò and colleagues examined brain tissue from EVOO-fed mice, they found that improved brain function was likely facilitated by healthier synapse function, which in turn was associated with greater-than-normal levels of a protein known as complexin-1. Complexin-1 is known to play a critical role in maintaining healthy synapses.

Dr. Praticò and colleagues now plan to explore what happens when EVOO is fed to older animals that have begun to develop tau deposits and signs of cognitive decline, which more closely reflects the clinical scenario in humans. “We are particularly interested in knowing whether EVOO can reverse tau damage and ultimately treat tauopathy in older mice,” Dr. Praticò added.

New research has found that people who are illiterate, meaning they never learned to read or write, may have nearly three times greater risk of developing dementia than people who can read and write. The study is published in the November 13, 2019, online issue of Neurology®, the medical journal of the American Academy of Neurology.

According to the United States Department of Education, approximately 32 million adults in the country are illiterate.

“Being able to read and write allows people to engage in more activities that use the brain, like reading newspapers and helping children and grandchildren with homework,” said study author Jennifer J. Manly, Ph.D., of Columbia University Vagelos College of Physicians and Surgeons in New York. “Previous research has shown such activities may reduce the risk of dementia. Our new study provides more evidence that reading and writing may be important factors in helping maintain a healthy brain.”

The study looked at people with low levels of education who lived in northern Manhattan. Many were born and raised in rural areas in the Dominican Republic where access to education was limited. The study involved 983 people with an average age of 77. Each person went to school for four years or less. Researchers asked each person, “Did you ever learn to read or write?” Researchers then divided people into two groups; 237 people were illiterate and 746 people were literate.

Participants had medical exams and took memory and thinking tests at the beginning of the study and at follow-up appointments that occurred every 18 months to two years. Testing included recalling unrelated words and producing as many words as possible when given a category like fruit or clothing.

Researchers found that, of the people who were illiterate, 83 of 237, or 35 percent, had dementia at the start of the study. Of the people who were literate, 134 of 746, or 18 percent, had dementia. After adjusting for age, socioeconomic status, and cardiovascular disease, people who could not read and write had nearly three times the chance of having dementia at the start of the study.

Among participants without dementia at the start of the study, during follow-up an average of four years later, 114 of 237 people who were illiterate, or 48 percent, had dementia. Of the people who were literate, 201 of 746 people, or 27 percent, had dementia. After adjusting for age, socioeconomic status and cardiovascular disease, researchers found that people who could not read and write were twice as likely to develop dementia during the study.
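The unadjusted percentages in the two paragraphs above can be recomputed directly from the reported counts. Note that the "nearly three times" and "twice as likely" figures are estimates adjusted for age, socioeconomic status, and cardiovascular disease, so the crude ratios below come out lower:

```python
# (cases with dementia, group size), as reported in the study
baseline = {"illiterate": (83, 237), "literate": (134, 746)}
followup = {"illiterate": (114, 237), "literate": (201, 746)}

for label, groups in (("baseline", baseline), ("follow-up", followup)):
    ill_cases, ill_n = groups["illiterate"]
    lit_cases, lit_n = groups["literate"]
    ill_rate = ill_cases / ill_n  # dementia prevalence, illiterate group
    lit_rate = lit_cases / lit_n  # dementia prevalence, literate group
    print(f"{label}: {ill_rate:.0%} vs {lit_rate:.0%}, "
          f"crude ratio {ill_rate / lit_rate:.2f}")
# baseline: 35% vs 18%, crude ratio 1.95
# follow-up: 48% vs 27%, crude ratio 1.79
```

The gap between the crude ratios here and the adjusted estimates in the text reflects the covariates the researchers controlled for.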

When researchers evaluated language, speed, spatial, and reasoning skills, they found that adults who were illiterate had lower scores at the start of the study. But their test scores did not decline at a more rapid rate as the study progressed.

“Our study also found that literacy was linked to higher scores on memory and thinking tests overall, not just reading and language scores,” said Manly. “These results suggest that reading may help strengthen the brain in many ways that may help prevent or delay the onset of dementia.”

Manly continued, “Even if they only have a few years of education, people who learn to read and write may have lifelong advantages over people who never learn these skills.”

Manly said future studies should find out whether putting more resources into programs that teach people to read and write helps reduce the risk of dementia.

A limitation of the study was that researchers did not ask how or when literate study participants learned to read and write.

The study was supported by the National Institutes of Health and National Institute on Aging.

Story Source:

Materials provided by American Academy of Neurology. Note: Content may be edited for style and length.

Down syndrome is a genetic condition that can affect a person’s memory or ability to learn — intellectual impairments researchers traditionally thought were untreatable and irreversible.

But now, researchers from the University of California San Francisco and Baylor College of Medicine say they’ve reversed the impairments in mouse models of Down syndrome — potentially foreshadowing an ethically fraught future in which doctors can do the same for humans with the condition.

All people with Down syndrome share one thing in common: an extra copy of chromosome 21. For that reason, much of the research on Down syndrome has focused on genetics.

But for this new study, published Friday in the prestigious journal Science, researchers focused on the protein-producing cells in the brains of mice with Down syndrome. That led them to the discovery that the animals’ hippocampus regions produced 39 percent less protein than those of typical mice.

Further study led the researchers to conclude that the presence of an extra chromosome likely prompted the animals’ hippocampal cells to trigger the integrated stress response (ISR), which decreased protein production.

“The cell is constantly monitoring its own health,” researcher Peter Walter said in a press release. “When something goes wrong, the cell responds by making less protein, which is usually a sound response to cellular stress. But you need protein synthesis for higher cognitive functions, so when protein synthesis is reduced, you get a pathology of memory formation.”

By blocking the activity of PKR, the enzyme that prompted the ISR in the mouse model’s hippocampal cells, the researchers found they could not only reverse the decreased protein production but also improve the animals’ cognitive function.

Of course, just because something works in mice doesn’t mean it’ll work in humans.

However, when the researchers analyzed postmortem brain tissue samples of people with Down syndrome, they found evidence that the ISR had been activated. They also obtained a tissue sample from a person with Down syndrome who only had the extra copy of chromosome 21 in some of their cells — and those cells were the only ones with ISR activated.

“We started with a situation that looked hopeless,” Walter said. “Nobody thought anything could be done. But we may have struck gold.”