Dr. Julie, a.k.a. Scientific Chick, brings you insights into what's happening in the world of life sciences. Straight from the scientific source, relevant information you should know about, in plain language.

Sunday, May 29, 2011

It’s been a while, science-loving friends, and I apologize. I could list all the things that have kept me away from this blog in the past month, but then I might scare away anyone who is considering the postdoc life. Instead, I will reward your patience with brain news hot off the press: a report from a neuroethics conference I just attended in Montreal called Brain Matters. As with previous conference reports, I will share my insights in bullet-proof format, as my foggy jet-lagged brain cannot write a coherent paragraph at the moment.

The conference brought together a large variety of professionals: neuroscientists, lawyers, bioethicists, philosophers, psychiatrists, you name it. As it turns out, psychiatrists know a joke or two.

We heard quite a bit about how the media handles neuroscience news. The consensus is that in most cases (but not all), the answer is poorly. The blame gets tossed around. Journalists hype research too much, but it’s not their fault, they need to sell papers. Researchers hype research too much, but it’s not their fault, they need to get funding. I voted to shift the blame onto grad students. I also thought we could solve the problem easily by making everybody read Scientific Chick. It didn’t take as well as I had hoped.

We also heard quite a bit about deep brain stimulation (DBS), a potential treatment for a variety of illnesses and conditions that involves sticking a stimulating electrode in the brain and leaving it there. Right now, this works relatively well for treating advanced cases of Parkinson’s disease. The problem is that it comes with side effects, and that people undergoing this type of treatment report things like “no longer feeling like themselves”. This brings us to an important question: What does it mean to feel like yourself? One of the most fascinating talks of the conference involved an in-depth discussion of personal identity and how it is or isn’t affected by brain interventions like DBS.

There was also a discussion of self-experimentation. Should willing neuroscientists be allowed to stick electrodes in their own brains to advance our knowledge of neuroscience? The speaker argued that we allow people to skydive and bungee jump without having them fill endless forms and run their proposal to do something crazy through an ethics board, so self-experimentation should be no different. I mean, do you want the Nobel Prize or not?

One of the most interesting talks was on placebos. The researcher argued that antidepressants work only marginally better than placebos in most cases (though not all cases), and so we should really ask ourselves whether the small improvement is worth the side effects. I thought of a genius business venture that involves selling sugar pills for every possible condition. Then I remembered that this already exists. It’s called homeopathy.

Speaking of placebos, everyone always assumes that they only work because you think you’re getting an active drug. Some researchers were skeptical about this, and so they carried out a fascinating study that involved giving placebos to people with irritable bowel syndrome while telling them, the entire time, that they were getting placebos. Guess what? They felt better anyway.

This blog post is already dragging on too long, but these were just a few of all the very interesting topics and discussions we had over two days. I hope to be back with regular science programming very shortly, so stay tuned for the latest and greatest!

Sunday, April 10, 2011

My Canadian readers are no doubt preparing for the upcoming federal election. Living in a democracy, we have the opportunity to vote for the candidate we think is most competent (even if sometimes it seems like all options are pretty grim). If we choose well, we may be rewarded: there is much evidence that intelligence and competence correlate with effective performance in politics. Unfortunately, research also shows that intelligence can’t be predicted from one’s appearance. Everyone knows that. That’s why we would never choose a competent candidate solely based on what they look like. Or would we?

To evaluate how much looks factor in when choosing a political candidate, Swiss researchers asked over 600 adults to rate which of two faces (in photographs) looked most competent. Little did the participants know, the two faces were actually of two candidates in a past French parliamentary election. As it turns out, over 70% of the participants ended up picking the candidate who had won the election.

To take things a little further, the researchers then carried out a similar experiment in children. They had over 600 children ranging from five to thirteen years old play a computer game that involved a sailing trip from Troy to Ithaca (sound familiar?). After the game, the children were shown the same two faces used in the adult experiment and were asked to choose who they would prefer to have as captain of the boat. Again, just over 70% of the children chose the election winner.

Interpreting these results can be a bit tricky, but keep in mind that we already know that competent people aren’t necessarily prettier. One thing is clear: the experiments tell us that adults and children use similar types of visual information when judging whether someone is competent or not. The researchers also conclude that voters don’t factor in enough information about the actual performance of candidates when heading to the ballot box, relying instead on what candidates look like. While I’m sure they are at least partially right, I think it’s a bit of a stretch to come to this conclusion given the simplicity of the study.

In any case, the results serve as a good reminder to think about our candidates and to take our civic duties seriously. After all, we wouldn’t judge a book by its cover…

Reference: Predicting elections: child’s play! (2009) Antonakis J and Dalgas O. Science 323(27):1183.

Monday, April 4, 2011

When I heard all the buzz about a recent Canadian study showing that identical twins don’t share the same DNA, I thought: there’s this week’s blog post. Easy peasy. I imagined the title of the article was probably something like “Identical twins don’t share the same DNA” or, even better: “Just kidding: everybody is unique after all”. Instead, to give you the low-down on this moderately exciting finding, I had to read through a paper called “Ontogenetic de novo copy number variations (CNVs) as a source of genetic individuality: studies on two families with MZD twins for schizophrenia”. So don’t ever say that I don’t love you.

The finding is very straightforward and fairly intriguing, but before I dish out the details, I have to remind you about two concepts you may not have heard since high school (but rejoice: it's sex-related): meiosis and mitosis. Meiosis is what happens when one cell with two copies of each chromosome divides to produce gametes – in our case, sperm cells and egg cells, each with a single copy of each chromosome. This process is necessary for sexual reproduction. Mitosis is what happens when one cell duplicates its chromosomes and then divides, leaving each daughter cell identical to the mother cell. This process is necessary during development and growth.

Now onto the article: the researchers were on the hunt for genes that are involved in schizophrenia, and as is often the case in these types of studies, they were looking at identical twins. The idea is that diseases can arise from genes (for example, Huntington’s disease), from the environment (for example, certain forms of cancer), or from a combination of both (for example, certain forms of Alzheimer’s disease). If identical twins have the same genes but only one of them gets a disease (say, schizophrenia), then researchers typically rule out genetics as the cause and examine what differences in the environment of the two twins may have caused the disease. See how that works?

In this study, the researchers were particularly interested in a DNA alteration called “copy number variations”. You see, DNA is never perfect, and sometimes cells have abnormal copies of big chunks of your DNA. These copy number variations can be harmless, but they can also cause certain diseases, so geneticists are paying close attention to them. In any case, the researchers found that supposedly identical twins had different sets of copy number variations when compared with their parents (meaning they didn’t inherit these DNA alterations from their parents). The cool thing about this finding is that the researchers can now determine when the alteration happened depending on who has the different copy number variation. If both twins have the same copy number variation, then we know it originated during meiosis (when the parents were generating eggs and sperm). If only one twin has the variation, then we know it originated during mitosis (during development).

While the article doesn’t focus all that much on the fact that identical twins don’t have exactly the same DNA (I hate to say it, but we kind of already knew that), this tidbit of information is relevant in two ways. First, it’s going to change how researchers carry out twin studies, because we can no longer assume that identical twins have the same DNA (as if we needed to complicate things further…). Second, this finding might lead to a new way of thinking about genetic diseases.

So there you have it: we are all unique individuals after all. Up next: your parents aren’t who you think they are. Your “parents” are really bug-eyed aliens from Neptune! (as always, bonus points for the correct reference in the comments…)

Sunday, March 27, 2011

“Sticks and stones may break my bones but words will never hurt me.” Sound familiar? You may have been taught this witty maxim to fend off bullies during the glorious years that are high school. Since then, bullying has become a big deal: its often-devastating consequences are more than ever in the public eye. We already know that childhood abuse in many different forms (sexual abuse, physical abuse, witnessing domestic violence) can have long-lasting impacts. For example, sufferers are found to be more susceptible to depression and suicide, and more likely to engage in fights, do drugs and use a weapon. But what about verbal abuse from peers?

To evaluate the effects of peer verbal abuse on the brain and behavior, a team of researchers studied over 800 young adults who had no history of any of the big confounding factors such as exposure to domestic violence, sexual abuse, or physical abuse. The participants were asked to fill out surveys about how much verbal abuse they experienced from peers at school as well as surveys with more general questions about mood, behavior and psychiatric symptoms.

The results show that the more peer verbal abuse people are exposed to during school, the more likely they are to experience anxiety, depression, anger and drug use. As it turns out, verbal abuse from peers is just as bad as verbal abuse from parents in generating these consequences. As well, the researchers found that peer verbal abuse that occurs during the middle school years (ages 11-14) has the most significant impact (compared with elementary school and high school). I find that surprising, as I remember high school being much worse than middle school, but apparently it has to do with the timeline of brain development, not my personal feelings about high school.

To dig a little deeper, the researchers selected 63 participants who had experienced varying degrees of peer verbal abuse and had them undergo a brain scan (MRI). They found that participants who had been exposed to a lot of peer verbal abuse displayed abnormalities in their corpus callosum, a big bunch of white matter fibers that connect the left and right sides of your brain. The researchers suggest that this abnormality may explain some of the behaviors and symptoms associated with the abuse (such as depression).

While this study convincingly highlights the impact of bullying on the brain and brain function, there are a few things to keep in mind. Repeat after me: correlation does not mean causation. That undergoing bullying is associated with abnormalities in the brain does not mean that bullying necessarily caused these abnormalities. More studies will be needed to uncover that link. As well, the study is retrospective, meaning the authors “go back in time” by asking the participants to remember events from years ago. This can sometimes lead to faulty recall or false associations. Lastly, I find it a bit strange that the researchers didn’t look at the hippocampus of the participants. You may remember that the hippocampus is a brain region important for memory, but it is also involved in emotions, and it has been shown to be susceptible to other forms of abuse. I’m hoping the bullying-hippocampus link will be looked at in a future study.

Overall, though, the study reminds us that bullying is an important and potent childhood stressor. Sticks and stones it is.

Monday, March 21, 2011

It's Scientific Chick's blogiversary! To celebrate two years of sciency goodness, treat yourself to some cupcakes:

I also take this occasion to launch my first-ever "Who are you?" thread. Since I've started this blog, I've been picking science articles that I thought were interesting, and writing about them in hopes that my excitement for science would be contagious. Two years later, it's time for me to think about how to make Scientific Chick better, and how to cater to my readers. This means I need to get to know you! So pretty please, indulge me by answering these easy questions in the comments:

1) Tell me about you. Who are you? Why are you here? Do you have a background in science? An inquisitive mind?

2) Tell me about what you like. What are your favorite stories? What topics are you most interested in? Do you enjoy a meatier science discussion, or are you satisfied with the big picture?

3) Tell someone you know about Scientific Chick. Do you have a friend or family member who you think would enjoy this blog? Let them know! Readership keeps me going. :)

Monday, March 7, 2011

Here’s how blog writing usually goes for me: I peruse science journals for a suitable story, read a few articles, pick one, write the blog post, copy the reference, and find a good picture. Then I spend anywhere from an hour to a couple of days trying to come up with a title. For some reason, finding the right title is always the hardest part. So this week, when I saw an article with a title that I could use as is for the blog post, I just knew I had to write about it.

The article, as you can infer from the title of this post, looks into the relationship between the attractiveness of a mating partner and stress in female birds. Most birds, like humans, tend to form monogamous pair bonds that last through the course of at least one reproductive event. Because of this “socially monogamous” system, if there are approximately the same number of males and females around, most birds will be able to find a partner, but inevitably, a big chunk of females will end up paired with males of “below-average” quality (but I’m sure they have lovely personalities). So the researchers wanted to know, how does that make the female birds feel?

As I’m sure you can imagine, rating the attractiveness of one bird over another is no easy feat for human scientists. What’s more, beauty is in the eye of the beholder, but you can’t send female finches a mass email survey to fill out online in their spare time (“On a scale of 1 to 7, 1 being Dwight Schrute and 7 being George Clooney, how would you rate Mr. John Finchy?”). Therein lies the beauty of this study: the researchers picked a very creative model to answer their question. They studied a type of finch that comes in two colors: red heads and black heads. Even though they are the same species, red-headed and black-headed finches are partially genetically incompatible (meaning they have a harder time producing offspring), and so these birds tend to have a preference for mating partners with the head color that matches their own (though if there are slim pickings, they will mate with a bird of the other color). Knowing this, the researchers set up an aviary with a whole mix of these birds (males, females, in combinations of red heads and black heads) that had not previously met, waited until every bird had paired off, and then assessed the females’ satisfaction with their mating partner based on two parameters: how long until she would agree to breed, and how much corticosterone (a stress hormone) she had in her blood (don’t worry, a harmless procedure).

The results show that females that paired with a male of the wrong color laid eggs nearly one month later than the females paired with a male of the same head color. What’s more, females paired with incompatible (“below-average” quality) males had three to four times more stress hormones in their blood, and this went on for weeks. Who knew that attractiveness could have such an impact on stress levels?

As it turns out, a widespread strategy used by female birds to deal with unattractive mates is to… select alternative, extrapair fathers for their offspring. Dan Savage would have a field day if he knew about these little sneaky females…

Sunday, February 27, 2011

Having a big brain seems like a very desirable thing right now (it certainly wasn't "trendy" when I was in high school, though). Games like "Big Brain Academy" measure your success by the size of your virtual brain. In the real world, scientific studies right, left and center extol the virtues of anything ranging from exercise to learning a new language as ways of expanding your gray matter. It turns out that learning to manage your stress might also do the trick, as I found out from a recent article pointed out to me by my friend Fawn.

The article looks at mindfulness meditation, a practice that involves becoming aware of experiences in the present moment without judging oneself. Many studies have already shown that mindfulness-based stress reduction programs can ease symptoms of anxiety and depression and can improve sleep and attention. But how does it work? To answer this question, researchers studied what mindfulness meditation does to your brain (to learn about what mindfulness meditation does to your pain, see this post).

The study looked at a handful of participants enrolled in an 8-week Mindfulness-Based Stress Reduction course. This course entails one meeting per week, one full day of training in week 6, and daily homework to do at home (meditation exercises). The experiment was very simple: researchers took a picture of each participant's brain using magnetic resonance imaging (MRI) at two time points: before the course started, and once it was over (8 weeks later). They also took pictures of the brains of control subjects who didn't take the course (also about 8 weeks apart).

By now I'm sure you've guessed the results: yup, the participants who meditated had significantly bigger brains. One area of the brain in particular was bigger: the hippocampus, a region known for its role in memory, but also involved in emotions. The researchers hypothesized that the increase in gray matter in the brain of people who meditate may explain the improvement they experience in dealing with their emotions. This hypothesis is supported by the fact that people who suffer from certain emotion-related diseases and disorders like depression and post-traumatic stress disorder often have a smaller hippocampus.

While I'm a big believer in meditation (this blog is so biased!), there are two limitations of this study worth mentioning. First, the researchers only looked at about 14 participants in each group. That's a pretty small sample, so it will be interesting to see what later experiments looking at more subjects come up with. Second, the mindfulness-based stress reduction program is not only about meditating: it also involves social interaction at the weekly meetings, stress education, and gentle stretching, which the control participants didn't get. So it's quite possible that the effect described here (bigger brains) is not the result of meditation per se. At this point, we can't tease these factors apart.

Regardless of these limitations, though, the study drives home an important message: the adult brain can change in response to training. I for one find some comfort in that.

Wednesday, February 16, 2011

My friend Michael (a.k.a. this crazy guy – would you please send him to New Zealand?) recently blogged about undergoing what he perceived as an unnecessary medical imaging procedure. He was concerned that this exposure to radiation might impact his fertility. A recent study suggests that Michael should add cancer to his list of concerns.

We’ve known for a long time that radiation is bad news. Scientists studied atomic bomb survivors and found that those who were closest to the blast had a higher incidence of cancer than survivors who were farther from the blast. While the evidence is conclusive, the atomic bomb delivered a much higher dose of radiation than medical imaging procedures. To tease out whether low-dose radiation from medical imaging procedures also increases one’s risk of developing cancer, a team of researchers from McGill University analyzed a group of over 80,000 patients who were admitted to the hospital for a heart attack. They perused the medical records of these patients and noted who received medical procedures involving radiation and who didn’t, and then followed up by finding out who got cancer later on.

The researchers found that over 10,000 patients developed cancer later on. Interestingly, two-thirds of those cases of cancer were located in the abdomen, pelvis or thorax (presumably the areas that would be subject to medical imaging procedures aimed at the heart). After looking at each patient’s history of procedures, the researchers were able to determine that the more radiation one is exposed to, the higher the risk of developing cancer.

While the study looks at a large number of patients and shows a significant link between radiation exposure and cancer risk, the researchers were limited in that they only had the information available in the medical records. This means that while they controlled for variables like age and sex, they didn’t know everything about the patients: what they ate, how much exercise they did, what kind of environment they worked in. There may be a confounding variable that we don’t know about. As well, the researchers did not assess mortality as an end-point, and even write, “These patients most likely will die of cardiac-related causes”. So it’s important to remember that the scenario is not 1) patient has heart attack, 2) patient undergoes medical procedures, 3) patient gets all better heart-wise but develops cancer because of the procedures, 4) patient dies of cancer. It’s likely much, much more complicated than that.

That said, any medical procedure is all about risks and benefits. We need to weigh the risks of cumulative exposure to radiation (you’ll be glad to know that exposure to radiation from a single test does not substantially increase your risk of cancer) against the value of the information that the medical imaging procedure will provide. Not always an easy task. In Michael’s case, the physician was clear: she was running the test to appease his wife. Now what is that worth to you?

Reference: Cancer risk related to low-dose ionizing radiation from cardiac imaging in patients after acute myocardial infarction. (2011) Eisenberg MJ et al. Canadian Medical Association Journal. [Epub ahead of print].

Sunday, February 6, 2011

It’s Super Bowl Sunday. Most of us spend the day eating nachos and wings, drinking beer, and acting rowdy in front of the television. Those of us who don’t have a television or snack foods in the vicinity (gasp!) may choose to spend the day looking up scientific articles with a mention of football and writing blogs. I’m going to let you guess what I did.

High school can be a dangerous place: many will go through those few years carefully balancing social life, self-esteem, some sort of learning and the inevitable characterization of every single person into a specific group (you might be surprised to find out that I fit in the “jock” category). For the athletes, high school can also be dangerous for something very precious: their brains.

In a recent study, researchers looked at the incidence of concussions in high school sports over eleven years (1997 to 2008). They wanted to know whether certain sports had higher rates of concussions, and whether the incidence of concussions varied by gender and over time. So they followed 25 high schools in a large public school district and recorded every instance of concussion for twelve sports: football, lacrosse, wrestling, soccer, basketball and baseball for boys, and field hockey, lacrosse, soccer, basketball, cheerleading and softball for girls.

The researchers reported a few interesting findings. They recorded nearly eleven million instances of a student playing a given sport, and out of those, identified 2651 instances of concussion. While boys accounted for just over half of the instances of students playing a sport, they accounted for three quarters of all concussions. Perhaps not surprisingly, football accounted for more than half of all instances of concussion. Baseball was the boys’ sport with the lowest incidence of concussions. For girls, soccer took the lead with the highest incidence of concussions, while cheerleading had the lowest incidence. Unfortunately for all my Canadian readers, the researchers left out our national sport, so I’m not sure how hockey would compare. But hey, we can talk about hockey when the Stanley Cup rolls around.

What surprised me the most in this study is that the overall rate of concussions increased significantly over time (a 4-fold increase between 1997 and 2008). Football showed the greatest increase in concussion rates over time, but it’s important to note that all twelve sports showed an increased concussion rate over time.

As for sex differences, the researchers found that for sports that are the same for girls and boys (like soccer and basketball), girls had a higher rate of concussions. However, in lacrosse, where the girls’ game has different rules, protective equipment and nature of play when compared with the boys’ game, girls had a lower concussion rate than boys.

There are several factors that could explain some of these results: an increase in concussion rates over time could be explained by a greater awareness of this medical phenomenon, and thus an increase in the reporting of concussions. Girls could be showing higher concussion rates for the sports they share with the boys because evidence shows that girls tend to be more willing to report injuries. However, even when all these factors are considered, the study highlights a need to prevent, detect and treat concussions across all sports, not just football. A concussion can be a serious brain injury, especially if complications develop, and repeated concussions are particularly dangerous, as they can lead to dementia.

In the heat of the Super Bowl, I don’t want to be a complete downer, though: being active during your teenage years can have numerous benefits, and can lead to habits that will last a lifetime and play an important role in preventing a whole load of diseases. So play away, but just make sure to protect that noggin (and parents: choose that extracurricular activity wisely)!

Monday, January 31, 2011

On Friday I attended a series of seminars on Alzheimer's disease at the University of British Columbia's Brain Research Centre. I think the idea was to showcase Canadian research in the field of dementia to woo politicians (also in attendance) and ask them for more funding. We heard about all aspects of Alzheimer's disease, from its history to its treatment, and in this post I will fill you in on the latest developments.

Alzheimer's disease is the number one public health problem in the developed world, with approximately 35 million cases worldwide. In Canada it represents a very expensive problem, estimated to cost 50 million dollars a day. In the time it takes you to read this post, there will be two more people diagnosed with Alzheimer's in Canada. As there are currently no approved treatments that affect the disease itself, there is an urgent need to keep our heads down and power through (bonus points for whoever can identify this reference in the comments) to find a cure.

The first "official" patient with Alzheimer's disease was a 51-year old woman named Auguste Deter. She was examined by Alois Alzheimer in 1901. She suffered from impaired memory, aphasia (a language disorder) and disorientation. Alzheimer kept meticulous records: we have a very detailed description of Auguste's condition, and even a sample of her handwriting (see picture). Even though the condition was described in great detail, Alois Alzheimer did not know what had caused Auguste's disease. Today, as one of the researchers at the seminar pointed out, we still don't know what causes Alzheimer's disease, but on a much higher level.

We do know that one of the main culprits in Alzheimer's disease is amyloid beta (Abeta), a protein that everybody's brain makes. In the brain of an Alzheimer's patient, though, too much of this protein is being made, and it aggregates in toxic chunks called plaques. The researchers present at the seminar predicted that vaccines against these plaques will fail. However, there are several candidate drugs that could prevent or treat these plaques in clinical trials right now.

Interestingly, researchers are also studying naturally occurring compounds: one of the speakers talked about his research looking at whether natural extracts can block the formation of plaques in a "petri dish" model of Alzheimer's (brain cells grown in a dish). He finds that ginger, cinnamon, turmeric, cranberry, rhubarb, blueberry, pomegranate and blackberry all help prevent the aggregation of Abeta. However, he warns that at this point, it is not practical to focus on eating these foods because the concentrations used in the lab are just not possible to recreate in a diet.

Beyond the molecular and biological underpinnings of Alzheimer's disease, researchers are also addressing the inevitable changes the world will need to undergo to accommodate a growing prevalence of dementia. For example, one speaker pointed out that many public places such as airports and even hospitals are very difficult to navigate even for cognitively healthy people: this represents a true disservice to people with Alzheimer's disease. Efforts are also being made to engage the public (so as to avoid more bad news like this one), and to provide resources for caregivers (such as the fantastic First Link initiative).

Overall, I'm disappointed to report that I didn't learn of any magical intervention that will rid us of Alzheimer's disease, but it's comforting to know that there is a big research community out there that takes this problem very seriously and is tackling it from many different angles.

Monday, January 24, 2011

For this week’s post, I had originally intended to kick off the (Vancouver) cycling season with a post about helmets. So I reviewed the recent evidence to see if I could find an interesting paper. Unfortunately, I ran into a problem of the “boring” kind: the evidence out there is pretty much what you think it is: helmets are good, they prevent injuries. While that’s relevant, it doesn’t make for a great post, because, well, you already know this.

Luckily, I stumbled upon a related story that looks at helmet usage among… fictional characters. The study, published in the journal Pediatrics, looks at safety practices depicted in movies over time. This may seem like a silly waste of time (or the project of your dreams, if you're a grad student), but we know that children tend to imitate what they see in movies (and that is why my eventual kids will never see the “Jackass” movies). Given that by age 18 the average child has spent two years in front of a screen, we might want to know a little more about the kinds of influences they may be getting from mass media.

The researchers started by identifying the 25 top-grossing G-rated (general audience) and PG-rated (parental guidance suggested) US movies for each year between 2003 and 2007. Of those 125 movies, they excluded movies that were animated, not set in present day, fantasy, documentary or not in English. That left them with 67 movies. The researchers then analyzed the safety practices in all the scenes that included characters with speaking roles either walking, driving or riding in a car, driving or riding in a boat, or riding a bike (for a grand total of 958 scenes).

The results show that in movies, just over half (56%) of motor-vehicle passengers wear seat belts, just over a third of pedestrians (35%) use crosswalks, three quarters of boaters (75%) wear personal flotation devices (or lifejackets), and a quarter (25%) of cyclists wear helmets.

Compared with similar studies carried out in the 1990s and early 2000s, there is a significant overall improvement in the depictions of safety practices. However, about half of the scenes still show unsafe practices. What’s more, movie characters rarely suffer the consequences of unsafe behavior. How many times have you seen someone get up after falling off a cliff and thought “Come on!”? The depictions of unsafe behavior combined with the absence of consequences for these behaviors may lead children to minimize dangers in real life, so parents, make sure you point it out when you see characters acting unsafely!

Now the study excluded quite a few movies for simplicity’s sake, and ended up with a fairly small sample, so it would be premature to generalize these results to all movies out there. I would be especially interested in finding out how animated movies fare, since they definitely cater to a younger crowd (Simba sure learned the consequences of *his* unsafe behavior). A later post, perhaps, if such a study exists!

Sunday, January 16, 2011

Eczema, a skin inflammation that affects 15 to 30 percent of children and 2 to 10 percent of adults worldwide, gets its name from an ancient Greek word that literally means "to boil out". I know firsthand why this word was chosen, as I suffer from contact dermatitis, a form of eczema caused by an allergic reaction (to nickel). The itch is not unlike that of bug bites, and the best depiction I've ever seen of what it feels like to be itchy features a whole ant hill (a nice touch).

As I'm sure you can imagine, having a child with this condition is a bucket load of fun. The ointments, the whining, the scratching, the scabs... (and, in my case, the sewing of little patches of fabric behind *every* jeans button... Thanks, mom!). So while eczema is not really a life-threatening condition, researchers are looking into it, because it's very closely tied to parental sanity.

We already know that eczema is not purely genetic: the environment you grow up in can influence your chances of developing the itchy rash. However, what we don't know is what components of the environment play an important role. A recent study attempts to add a piece to this puzzle by researching whether family pets can have an impact on the development of eczema.

Researchers followed over 600 children starting at one year of age. At the start of the study, the parents of each child were asked to fill out a survey about their home environment, and researchers took a dust sample from each house to test for allergens and such. Three years later, the researchers evaluated which children had developed eczema and which hadn't, and analyzed what contributing factors might have played a role.

It turns out that owning a dog is not only good for your blood pressure: children who lived in a house with a dog had a significantly lower risk of developing eczema by four years of age. What about cats? The situation was a bit trickier for cats: living with a cat increased a child's risk of developing eczema, but only if the child tested positive on a cat allergen sensitivity test (the skin-prick kind).

So get rid of Mittens and adopt Fido? Not so fast. First, these findings don't hold true for all allergy-related conditions. For example, dogs are thought to contribute to asthma. Second, what this study really does is highlight how complicated these conditions are: several different types of environmental exposures may impact allergies in different ways, so it's very hard to draw clean, straightforward conclusions and guidelines.

That said, I'm still going to blame my eczema on growing up in a dogless home. I wish I had known this tidbit of information way back when I was a kid: it might have helped me in my campaign to get a pet (admittedly, my heart was set on a horse).

Wednesday, January 12, 2011

It's that time of the year. That I-will-eat-better-and-exercise-lots time. It's that time when we kick-start New Year's resolutions with the best intentions, the best plans, the most motivation. Unfortunately, and I can tell you this from experience, some of us will fail. A recent article published in the journal Obesity sheds light on one important aspect of keeping some resolutions: delayed gratification.

Delayed gratification, as the name implies, refers to the ability to forgo an immediate reward (for example, delicious Cheesy Poofs) for a benefit that will come later (for example, rocking that little black dress). A lot of research shows that the better you are at delaying gratification, the better you do in life in general (you may have heard of the famous marshmallow study). To see if delayed gratification is linked to obesity, a team of researchers set out to test whether children who have a high body mass index (BMI) are less likely to delay gratification.

The researchers looked at data from an educational obesity intervention program. In this program, attended by obese or overweight children along with their siblings (the healthy-weight control group), children earn a point if they complete their weekly goals. They then have two choices: either spend that point immediately on a small toy prize (like a pencil) or save the point to use later on a larger prize worth more than one point (like a basketball). The measure of points saved and points spent is thought to be a valid model of delayed gratification. So the researchers looked at the relationship between how many points were saved by a child and that same child's BMI. The results show that a higher BMI is associated with fewer points saved, meaning the children who were overweight or obese had a harder time delaying gratification.

A strong aspect of this study is that the rewards were not food-related. This allowed the researchers to study delayed gratification as a general behavioral trait, and not specifically as it relates to eating. However, their sample was fairly small (59 children) and the duration of the study was fairly short (12 weeks), so it's difficult to say from this study alone whether delayed gratification plays a role in weight loss.

Overall, the research is relevant in that it suggests that working on delayed gratification (it's possible to "train" to get better at it) may help in obesity interventions. And for all you out there with eat-less-exercise-more resolutions, all I can say is "eyes on the prize"...

Reference: Bruce AS et al. (2010) Ability to delay gratification and BMI in preadolescence. Obesity [Epub ahead of print].

About Me

Dr. Julie is an Assistant Professor of Neurology at the National Core for Neuroethics and the Djavad Mowafaghian Centre for Brain Health at the University of British Columbia. She holds a PhD in Neuroscience.