Dr. Julie, a.k.a. Scientific Chick, brings you insights into what's happening in the world of life sciences. Straight from the scientific source, relevant information you should know about, in plain language.

Monday, April 27, 2009

You’ve heard of Atkins. You’re familiar with Weight Watchers. You probably know about the South Beach Diet. Well, there’s a new diet in town, and it’s called the “Longevity Diet”. It’s inspired by a relatively new hot topic in the life sciences called caloric restriction. As the name implies, it essentially means eating less, and it’s hailed not so much as a weight loss strategy as an anti-aging solution. While I’m usually excited about new, simple ways of changing daily habits to live a healthy life, I’m not sold on this one yet. You may remember that I like cheese. You may also remember that I like brownies. So obviously, I’m not too excited to hear that eating less is extra healthy. Especially since most caloric restriction studies suggest you have to cut back anywhere from 30% to 60% of what you eat to see an effect.

So far, most of the really convincing studies showing caloric restriction slowing down the aging process have been carried out in model organisms that fit in your pocket: rats, mice, worms, all the way down to the tiny yeast. I personally would be reluctant to extrapolate those findings to humans. Surely worms don’t have the same kind of relationship with brownies that I do. However, one recent study looks at caloric restriction in healthy humans, and it’s hard not to take notice.

Researchers from Germany took 50 normal-to-overweight elderly subjects (sorry, Mom, in this case, “elderly” means 60ish, but the important thing is to be young at heart!) and divided them into groups. One group was told not to change their eating habits, and one group was put on 30% caloric restriction for 3 months. Before the study and after the 3 months, everyone’s memory was tested using simple tests like remembering a list of words. Well, I’m very sorry to say, but after 3 months, the group who ate less did significantly better on the memory tests. Sad but true.

During the study, the volunteers were monitored for many different biological indicators (such as cholesterol, insulin, inflammation, and cellular stress), in hopes of identifying the mechanisms responsible for the effects of caloric restriction. The one mechanism that really stood out and showed a solid correlation with the memory improvements is insulin: the group on caloric restriction had lower insulin levels. Insulin is a hormone responsible for taking sugar out of your blood and storing it in your liver and muscles for when energy is needed, but it also plays an important role in keeping your brain healthy. When you have less insulin circulating in your body, you become more sensitive to it, so your body (including your brain!) responds more efficiently when insulin is released. This may be why the group who ate less performed better on the memory tests.

So, throw out the cheesecake? I’m going to wait a little before I draw any solid conclusions, as the debate over caloric restriction is still very much ongoing. One side claims significant benefits like longevity, healthy aging and protection against age-associated diseases (think Alzheimer’s). The other side criticizes the methods and models used, as well as the contradictory results, and points to the downsides of caloric restriction, especially during the reproductive years. Not eating enough can also lead to the breakdown of muscle (and remember, your heart is a muscle), which is very important to consider if you have an active lifestyle. Interestingly, in this study, the authors show that the caloric restriction group lost a significant amount of weight, but did not lose body fat. Healthy? I think the jury is still out, but hopefully more well-controlled human studies will shed some light on this potentially exciting and easy way to fend off the effects of the ticking clock.

Saturday, April 18, 2009

As a teenager, like many of you I’m sure, I neglected sleep. It seemed that partying all night and then working hard at school all day was the obvious solution to an overbooked schedule. I was rudely awakened a few years later when a good night’s sleep became necessary to merely function the next day (also when I realized what it really meant to have an overbooked schedule). While we know that sleep is essential for survival (at some point, coffee just doesn’t cut it), we know very little about why sleep is so important, and what happens to our brain while we sleep. In a recent issue of Science, a team of researchers made a contribution to this field using sleep-deprived fruit flies.

In the study, researchers looked at synapses, the junctions between brain cells (called neurons). Synapses are important because they relay information from one neuron to the next. Through synaptic connections, neurons form networks, and these networks underlie many complex brain functions like perception and thought.

The researchers cleverly engineered fruit flies to make their synaptic connections fluorescent (for those of you who read my first post, this is an excellent use of GFP). Subsequently, the researchers took images of the flies’ brains, and were able to count the brightly fluorescent synapses. The study first established that flies that hang out with other flies (this is called social enrichment) have more synapses than lonely flies. While this finding is interesting on its own, the researchers didn’t stop there. They took the socially enriched flies and divided them into two groups. The first group of flies was allowed to sleep as much as they wanted for 48 hours, while the second group was sleep deprived for 48 hours. Which group of flies do you think had more synapses after the experiment?

Well, the study shows that the flies that slept had far fewer synapses than the sleep-deprived ones. Does that surprise you?

Initially, I thought this was a little counter-intuitive. With all this talk about sleep being important for performance and memory, I would have thought that the flies that slept would have had more connections between brain cells. This study suggests exactly the opposite, and shows that sleep acts to downscale the synapses that are created while the flies are awake and experiencing new things. When you think about it, this finding makes sense. If there were no way to “reset” those synapses, every time you learned or experienced something new, you would get more and more connections between your brain cells. Eventually, we can imagine it would be a complete mess up there, and connections might saturate, leaving no room for anything new. Downscaling your synapses at night while you sleep also helps eliminate the unimportant connections, thereby making the stronger synapses stand out.

So what’s the take-home message? Enough procrastinating on the internet, go to bed!

Monday, April 13, 2009

I recently wrote about natural selection, or the survival of the fittest, in bacteria, which is a pretty ruthless process. When it comes to humans, I’m thankful that we care for the sick and the disabled. Having had a few common health issues myself, I know that in the wild, I probably would not have made it past 15 or 16 years old (ever had mono? I can’t imagine hunting mammoths with mono). That being said, I always assumed that compassion and care for the ill was a relatively new concept, made possible by advances in civilization. A new paleopathology (that’s the study of past diseases) finding suggests compassion may have much earlier roots.

A group of Spanish bone hunters found a very interesting cranium at the Sima de los Huesos site in Spain. The cranium belongs to a child who died between the ages of 5 and 12 and who lived at least 530 000 years ago! The cranium has been almost fully reconstructed and clearly shows many signs of malformations. The researchers were able to link those signs to a disease that still exists today called craniosynostosis. This disease can have multiple causes and results in cranial deformities (such as an asymmetrical face) and mental retardation. In this case, the pathology would have been present before birth. So what we have here is a child who lived 530 000 years ago and was visibly abnormal, affected in a way that meant he or she probably could not keep up with the group. The amazing finding of this paper is that this child survived to at least 5 years of age, and probably closer to 10. This suggests not only that the population did not act against an individual who was different or sick during infancy (i.e. they didn’t kill a sick baby, as some other populations have been known to do), but also that they cared for the disabled. Not something I would have expected of our hunter-gatherer ancestors.

An interesting question that springs to mind here is, if humans from the Neanderthal era showed some form of compassion, what about animals? I’ll definitely be keeping my eyes out for more research in this area. This kind of science is my favorite: answer a question, and many more arise (unless we’re talking about my thesis project: in that case, all I want are answers).

As I mentioned, I always thought that caring for the ill was a very recent human behavior, but now I’m not so sure. In any case, I’m just glad to know that our ancestors were not complete jerks.

Sunday, April 5, 2009

If you’re planning to start a family, or make an addition to your already existing family, you probably know that it’s not a good idea to drink alcohol while you’re pregnant. If you consume alcohol during your pregnancy, your baby may develop Fetal Alcohol Spectrum Disorder, or FASD, which results in a whole range of possible consequences including facial malformations and mental retardation.

There may also be much subtler, lesser-known consequences of prenatal alcohol exposure, such as an increased risk of adolescent alcohol abuse. Two researchers from New York recently added a piece to this puzzle.

The researchers fed pregnant moms either a diet containing ethanol or a regular diet, and then studied how the offspring reacted towards ethanol once they grew up to be teenagers or adults. Oh, and by pregnant moms I mean pregnant rat moms! The study’s first finding is that the offspring from the ethanol-fed moms enjoyed the taste of ethanol a lot more than the offspring of the regular moms. This was measured using a “lick” test: the teenage rats were given solutions of ethanol in increasing concentrations and the researchers counted how much the rats licked each solution compared with water. To try to better understand this finding, the researchers then did the same test but for a bitter-tasting liquid. They found that teenage rats exposed to prenatal ethanol had a much better tolerance for the taste of the bitter liquid. Since rats are known to think that alcohol tastes both bitter and sweet, the researchers then tested whether the teenage rats from ethanol-fed moms also thought sugar tasted sweeter than the other rats did, but there was no difference there. So this means that rats exposed to ethanol in the womb find that ethanol tastes less bitter than regular rats do, and just as sweet, tipping the balance towards tasty and away from aversive.

The rats were then tested on how they perceived the smell of ethanol. Sure enough, the teenage rats from ethanol-fed moms seemed to appreciate the smell of ethanol more than the normal rats. Furthermore, their appreciation of the smell also led them to consume more ethanol!

Taken together, this means that exposing rats to ethanol in the womb leads them to drink more as teenagers by making ethanol taste and smell better. The good news is that these findings do not persist into adulthood. Even so, this study puts forward very relevant findings, especially since ethanol exposure during gestation is the best predictor of teenage alcohol abuse. The research offers a simple explanation: FASD teenagers may consume more alcohol simply because to them, beer tastes like a triple-chocolate brownie fudge sundae with cookie dough (that’s the most delicious-sounding analogy I could come up with!). The researchers also point out that other drugs such as tobacco and marijuana have similar chemical smell and taste signatures, and therefore these findings may have broad implications for the link between maternal drug use and children’s and teenagers’ vulnerability to those drugs.

Personally, the enjoyment I derive from alcohol consumption doesn’t nearly outweigh the prospect of booze-crazy teenagers. When the time comes to grow a family, I won’t be taking any chances, thank you very much.

About Me

Dr. Julie is an Assistant Professor of Neurology at the National Core for Neuroethics and the Djavad Mowafaghian Centre for Brain Health at the University of British Columbia. She holds a PhD in Neuroscience.