Learning about the brain, body and how it all connects

Monthly Archives: October 2013

You’re running down a hallway; running away from someone? Running towards something? Your feet start to lift off the ground and the ceiling opens up. You float higher and higher, and you get the feeling you’re not alone. You turn to your left and it’s Bob Dylan, laughing and calling you “Mr. Tambourine Man”. Suddenly the balloon you were holding onto, carrying you up into the sky, turns into a tangerine and you start to plummet back to earth. Just before you slam into the ground you awaken; sweaty, sheets twisted, wondering what the hell that was all about.

Dreams are weird. Especially if you’ve eaten a lot of cheese the night before.

Or so says the common myth. From Charles Dickens to Arab Strap, cheese dreams have been a part of our popular culture for over 150 years. But is there actually any truth in this old wives’ tale?

A study conducted in 2005 by the British Cheese Board attempted to debunk this claim by giving 200 participants 20 grams (roughly 0.7 ounces) of cheese 30 minutes before they went to bed and asking them to record their dreams and quality of sleep. In the study, 67% of participants recalled their dreams, and none reported any nightmares, something the Cheese Board called a win.

Instead of night terrors, the researchers report that the cheese resulted in pleasant nighttime fantasies in most individuals. They even went so far as to test the varying effects different types of fromage had on an individual’s dream-state. From their conclusions, blue Stilton resulted in the most bizarre trips, affecting about 80% of participants and resulting in visions of talking animals, vegetarian crocodiles and warrior kittens. On the other end of the spectrum, Cheshire cheese produced the least memorable nights, with less than half of the participants being able to recall their dreams.

The study (again, initiated by the cheese industry) also claimed that eating cheese before bed actually helped people fall asleep. This is supposedly due to the relatively high tryptophan content in cheese, an amino acid involved in the production of melatonin (and serotonin), which plays an important role in our sleep-wake cycle.

However, it should be noted that there was no report of a control or placebo group in this experiment, such as participants who ate nothing or consumed a soy cheese sample (yum!) before bed. Thus, there’s no empirical evidence that it was actually the cheese causing these effects and that it was not just the natural sleep state for these individuals.

As for the dream link, there is only one academic paper that mentions the cheese-dream phenomenon, and that only anecdotally. However, one Internet theory I found (I know, I’m reaching here) proposed that the bacterial and fungal content in cheese, and in potent blue cheeses in particular, might be at the root of the increase in dream vividness. The idea is that different compounds found in fungi, like tryptamine or tyramine, might have psychoactive effects, influencing our brains’ chemical systems and thus our state of mind.

Tryptamine is a common chemical precursor for serotonin and other related alkaloids, some of which are involved in the hallucinogenic effects of psilocybin (“magic” mushrooms) and DMT. However, there’s no hard evidence that tryptamine is actually present in the Stiltons and Gorgonzolas of the world, and even if it were, it would be in extremely low doses. After all, when was the last time you felt high after eating cheese?

Conversely, tyramine is a monoamine that works by triggering the release of other neurotransmitters like adrenaline, noradrenaline and dopamine into the body. Another theory is that tyramine’s effect on noradrenaline release in an area of the brain called the locus coeruleus, a region important in our sleep-wake cycle, is altering our dream patterns.

Some antidepressants work by inhibiting the breakdown of monoamines (monoamine oxidase inhibitors – MAOIs), and it can be potentially dangerous to eat foods high in tyramine when on this medication as it can result in an excess of these chemicals in your brain and body. The medication mentioned in the old academic paper, pargyline hydrochloride, actually works as an MAOI, potentially explaining the bizarre effect eating cheese had on the patient. There are also reports of foods high in tyramine causing migraines in some individuals, particularly those on MAOIs; however, another study found no evidence of this link.

Finally, numerous other foods contain chemical compounds like tyramine and tryptophan that affect our neurotransmitter systems. These include cured meats, egg whites and soybeans, none of which have the dream-producing reputation of cheese. So for now, the link between cheese specifically and these nighttime apparitions appears untenable.

Then again, I did eat some cheddar last night, which might just explain Bob Dylan’s appearance in my nocturnal activities. According to the Cheese Board, cheddar was linked to visions of celebrities dancing in your head.

So I know I’ve written a lot about drugs and addiction recently (it is my profession after all!), but I have one more important thing to say on the nature of harm reduction for those who do choose to use drugs.

Some of you may have heard of several tragic and avoidable drug-related deaths that have occurred over the last couple of months at clubs and festivals in both the UK and the US. These have largely occurred in people who think they are taking MDMA, a relatively safe drug with low toxicity and a low risk for dependence, but who have instead consumed a much more dangerous analog of MDMA called PMA.

PMA is chemically similar to MDMA, but there are a few crucial changes that make it much more risky. With PMA, there is a much higher chance for toxicity, meaning it is very easy to overdose on. The drug can also cause hyperthermia or over-heating, a dangerous side-effect that can result in organ failure and death.

It appears that those who illegally produce and sell ecstasy are replacing the active ingredient (MDMA) with PMA, as it is easier and cheaper to produce, and it is this switch that is causing people to unintentionally consume this dangerous substance.

The group INPUD (International Network of People who Use Drugs) has released a statement warning people about PMA and urging them to be careful and educated if they do choose to use. If you or someone you know uses drugs like ecstasy, MDMA or amphetamines, please check out their recommendations on how to stay safe.

I wrote last week on the idea of having an “addictive personality”. This was meant in the context of common drugs of abuse, like alcohol, cocaine or heroin, but what about addictions to things other than drugs? Like your iPhone. Or the internet. Or Oreos.

The idea of food addiction is not a new one, and I’ve written on this trend before, both on Brain Study and in real science journals. But a new study released last week takes this claim to a whole new (and unsubstantiated) level, claiming that Oreos – and especially that all-enticing creamy center – are as addictive as cocaine.

I’ve written a rant that’s been published in The Guardian today critiquing the study and stating what exactly is so wrong with the research. I’ve also provided some much better links to articles on the topic.

In case you’re not up on the latest Science Blogging/Writing/Communication gossip (which I’m assuming most of you aren’t), well, shit has really hit the fan recently.

There have been a slew of misogynistic occurrences in the past week, with more stories of sexual harassment coming to the surface. Numerous scientists and bloggers have commented on the developing situation far more effectively than I will be able to here, and I’ll be sure to link to them as I go. But first, the unfolding story.

Last week, Danielle N Lee (or DNLee, as she commonly goes by online), a biologist and blogger for Scientific American at The Urban Scientist, was invited by a relatively unknown biology site to do some guest blogging. Pretty standard. They said they were unable to pay her for her services, and she politely declined. Also pretty standard. Then the blog editor responded by asking her if she was “an urban scientist or an urban whore”. NOT STANDARD.

Obviously, her response was one of outrage, and she very eloquently documented the entire exchange on her blog for Scientific American. She included video, pictures, excerpts from the original email exchange, and an epic take-down of the idiot who dared insult her in such a degrading, simplistic and childish way.

At this point, the story should have ended with DNLee coming out on top, the editor being fired (which he was), and all of us, shocked and appalled that such blatant unprofessional misogyny still exists, writing a few tweets and blog posts about it.

Unfortunately, it didn’t. Scientific American made the short-sighted and cowardly decision to take down DNLee’s post without telling her, claiming that their content was supposed to be solely science-based. Mariette DiChristina, the magazine’s editor-in-chief, explained this decision over Twitter, and the community erupted.

Re blog inquiry: @sciam is a publication for discovering science. The post was not appropriate for this area & was therefore removed.

The science Twitter and blog-o-sphere were a-buzz with outrage, and #standingwithdnlee started trending. (This entire exchange has been well documented on BuzzFeed and Jezebel, among other places.)

Stunned by the backlash, DiChristina and Scientific American officially apologized the next day, DNLee’s original piece was re-published, and again it seemed like we could all move on.

But then it REALLY got messy.

Flash back to one year ago. Monica Byrne, a writer and playwright, posted on her blog about an incident of sexual harassment she had experienced from an older male mentor that left her understandably shaken up. There was no overt sexual assault, just lots of inappropriate, unwanted innuendoes, which in some ways left her questioning herself and her reaction even more harshly. At the time, Byrne elected to leave the perpetrator unnamed. However, the recent incident involving DNLee caused her to change her mind, as the man in question was very high up in the Scientific American blogging network, and she suspected he had something to do with the initial silencing. So on Monday she decided to out him in an update on her original post.

This man is Bora Zivkovic, or BoraZ, otherwise known as “the Blogfather”. He is extremely influential in the world of science writing and is known to be generous with his connections and retweets. (I have never met or had contact with him personally, but he has retweeted for me, and those I’ve met who do know him have spoken of him highly.)

This reveal shocked the SciComm community more than learning that Darth Vader was Luke’s father. Many were quick to jump to Bora’s defense, questioning and mistrusting the victim as we are wont to do in this “slut-shaming” society we unfortunately inhabit. However, Bora himself admitted on his blog to the inappropriate behavior he had engaged in with Byrne and apologized for his actions.

Then someone else spoke up. Hannah Waters, another female scientist and blogger at Scientific American, responded to the fury surrounding Byrne’s accusation with her own report of “not-quite-harassment” that she had experienced at the hands of Bora.

Again there was shock and outrage. Again, Bora did not deny the claims.

No need to defend me. Kudos to @monicabyrne13 and @hannahjwaters for having the courage to speak up. I was wrong. I am sorry. I am learning.

And then came the final nail in the coffin. The two prior reports recounting Bora’s behavior describe uncomfortable, unprofessional and certainly unnecessary interactions. But do they qualify as harassment or abuse? Some were still unswayed, staunchly standing by their friend and idol.

But Kathleen Raven ended all of that today with a personal and painful account of her own experiences with sexual harassment from Bora and others. Her post came complete with explicit email snippets sent to her, which are jaw-dropping.

The disappointment and disgust at this formerly revered man are wide-ranging. But by far the best response I’ve read was penned by Hope Jahren, a professor at the University of Hawaii.

She writes:

The Worst Part Is Not.

The worst part is not when it all blows over just as you thought something was going to finally happen. When everything goes on as usual, except that your colleagues pass you in the hall with a wider berth. That when all the shock and outrage dies down, the only job that changed is yours. You used to be a valued mascot. Now you’re a traitor. You’ll never be Department Chair or Dean now that this has happened. How dare you throw all the Monopoly pieces in the air – we were letting you play! But that’s not the worst part.

The worst part is not when his wife and his employees come to you and say please don’t do this to us. Our mortgage, our children, our paychecks are at stake. When they ask you if you care about anything besides yourself. When they tell you the full story, which you never wanted to know. That there’s a rotten root of sickness and betrayal underneath it all. That this is your big chance to be the bigger person and walk away, proving that you are actually more compassionate than you seem. This is not the worst part. Although that part is pretty damn bad.

…

The worst part is the pivot. The click. When the switch flips. When you press down, turn the child-proof cap, and the thing breaks in your hands. When it dawns on you that this isn’t an interview, it’s a date. That there’s no study group, it’s a date. That this isn’t office hours, it’s a date. That it’s not a promotion, it’s a date. That it’s not a field trip, it’s a date. It’s a weird f*cked up date and you had no idea, you dumbass. You’re just as stupid as he thinks you are. Why are you carrying a backpack full of questions, homework, manuscripts, resumés and various other homely hopeful aspirations? All you needed to do was to show up. Show up for this weird f*cked up date. Sucker.

1. It Was Well-Written. Lordy lordy how well-written it was. Let’s all turn toward the East and say it together, loud enough to shake the walls where a certain book proposal is languishing on a certain desk. “HOPE JAHREN SURE CAN WRITE,” we bellow while choking back our collective sob. Someone should give that girl a goddam book deal.

2. It Didn’t Name Names. First Ofuck or Ofek or whoever-the-f*ck hate-spoke Danielle Lee and we were all like, String him up! How daaaaaare you! And the guys were all like, Let me at him! Then Borat or Boraz or Borehole sleazed up Monica Byrne and we were all like, Not Mr. Rogers! He’s a flesh-and-blood dude! He gave me peelings for my compost heap! He defragged my harddrive! Why universe, why? And the guys went kinda silent at that point (did you notice?). Then we looked at each other and said, Whoa this is complicated. Eventually we got to this place where we sure as hell don’t want him making decisions about women’s careers but we’d still probably perform CPR on him if we saw him lying in the street. Turns out he’s neither an angel nor a devil, just like all the other men I don’t know. Just like every sorry soul made flesh temporarily wandering this lonely dusty Earth.

Jahren generously allowed me to re-post some of her content, but I strongly encourage you to check out the full pieces here and here.

So that’s what’s buzzing around the ol’ science-sphere these days. Name-calling, misogyny, back-stabbing, sexual harassment, victim blaming, and some badass, brave, brilliant women who will not be silenced when they have something important to say.

—

Update: Bora has officially resigned from Scientific American. And Christie Wilcox, another scientist, writer and former protégé of Bora’s, has written an excellent piece from a position not of outrage but of numbness. Equally powerful perspective.

You’ll have to bear with me if this is a bit of a self-indulgent post, but I have some exciting news, Brain Study-ers: I’ve officially submitted my dissertation for a PhD in psychology!

In light of this – the culmination of three years of blood, sweat, tears and an exorbitant amount of caffeine – I thought I’d write this week on part of my thesis work (I promise to do my best to keep the jargon out of it!)

One of the biggest questions in addiction research is why some people become dependent on drugs while others are able to use in moderation. Certainly some of the risk lies in the addictive potential of the substances themselves, but still the vast majority of individuals who have used drugs never become dependent on them. This leads to the question: is there really such a thing as an “addictive personality”, and what puts someone at a greater risk for addiction if they do choose to try drugs?

We believe that there are three crucial traits that comprise much of the risk of developing a dependency on drugs: sensation-seeking, impulsivity and compulsivity.

Sensation-seeking is the tendency to seek out new experiences, be they traveling to exotic countries, trying new foods or having an adrenaline junkie’s interest in extreme sports. These people are more likely to first try psychoactive drugs, experimenting with different sensations and experiences.

Conversely, impulsivity is acting without considering the consequences of your actions. This is often equated with having poor self-control – eating that slice of chocolate cake in the fridge even though you’re on a diet, or staying out late drinking when you have to be at work the next day.

While impulsivity and sensation-seeking can be similar, and not infrequently overlap, they are not synonymous, and it is possible to have one without the other. For example, in research we conducted on the biological siblings of dependent drug users, the siblings showed elevated levels of impulsivity and poor self-control similar to that of their dependent brothers and sisters, but normal levels of sensation-seeking that were on par with unrelated healthy control individuals. This led us to hypothesize that the siblings shared a similar heightened risk for dependence, and might have succumbed to addiction had they started taking drugs, but that they were crucially protected against ever initiating substance use, perhaps due to their less risk-seeking nature.

The final component in the risk for addiction is compulsivity. This is the tendency to continue performing a behavior even in the face of negative consequences. The most classic example of this is someone with OCD, or obsessive-compulsive disorder, who feels compelled to check that the door is locked over and over again every time they leave the house, even though it makes them late for work. These compulsions can loosely be thought of as bad habits, and some people form these habits more easily than others. In drug users, this compulsive nature is expressed in their continued use of the substance, even though it may have cost them their job, family, friends and health.

People who are high in sensation-seeking may be more likely to try drugs, searching for that new exciting experience, but if they are low in impulsivity they may only use a couple of times, or only when they are fairly certain there is a small risk for negative consequences. Similarly, if you have a low tendency for forming habits then you most likely have a more limited risk for developing compulsive behaviors and continuing an action even if it is no longer pleasurable, or you’ve experienced negative outcomes as a result of it.

Exemplifying this, another participant group we studied were recreational users of cocaine. These are individuals who are able to take drugs occasionally without becoming dependent on them. These recreational users had similarly high levels of sensation-seeking as the dependent users, but did not show any increase in impulsivity, nor did they differ from controls in their self-control abilities. They also had low levels of compulsivity, supporting the fact that they are able to use drugs occasionally but without having it spiral out of control or becoming a habit.

We can test for these traits using standard questionnaires, or with cognitive-behavioral tests, which can also be administered in an fMRI scanner to get an idea of what is going on in the brain during these processes. Behaviorally, sensation-seeking roughly equates to a heightened interest in reward, while impulsivity can be seen as having problems with self-control. As mentioned above, compulsivity is a greater susceptibility to the development of habits.

In the brain, poor self-control is most commonly associated with a decrease in prefrontal cortex control – the “executive” center of the brain. Reflecting this, stimulant-dependent individuals and their non-dependent siblings both showed decreases in prefrontal cortex volume, as well as impairments on a cognitive control task. Conversely, recreational cocaine users actually had an increase in PFC volume and behaved no differently from controls on a similar task. Thus, it appears that there are underlying neural correlates to some of these personality traits.

It is important to remember that we all have flashes of these behaviors in differing amounts, and it is only in extremely high levels that these characteristics put you at a greater risk for dependence. Also, crucially it is not just one trait that does it, but having all three together. Most notably though, neuroscience is not fatalistic, and just because you might have an increased risk for a condition through various personality traits, it does not mean your behavior is out of your control.

When I tell people that I ‘do psychology’ I typically get one of three reactions. 1) People ask if I can read their thoughts. No, unless you’re a drunken guy in a bar, in which case, gross. 2) They begin to tell me about their current psychological troubles and parental issues, to which I listen sympathetically and then make it clear that I got into experimental psychology because I didn’t want to have to listen to people’s problems (sorry). Or 3) they ask me a very astute question about the brain that 9 times out of 10 I can’t answer. This last option is by far the most preferable and I’ve had several very interesting conversations come out of these interactions.

One such question I received recently was where handedness comes from in the brain. While it initially seems a basic question, I quickly realized that I had no idea how to answer it without dipping into pop psychology tropes about right- and left-brained people, tropes I definitely wanted to validate before I started trotting them out.

So what exactly is handedness? Does it really reflect differences in dominant brain hemispheres? Is it underlying or created, and what happens if you switch back and forth? Can a person truly be ambidextrous?

Handedness may indeed relate to your so-called ‘dominant’ hemisphere, with the majority of the population being right-handed and thus ‘left-brained’ (each hemisphere controls the opposite side of the body in terms of motor and sensory abilities). The dominant side of the body, by definition, is quicker and more precise in its movements. This preference originates from birth and is then ingrained by your actions, such as the practice of fine motor skills like handwriting.

Going beyond basic motor differences, handedness has been loosely related to the more general functions of each brain hemisphere as well. The left hemisphere is typically associated with more focused, detailed and analytical processing, and this type of thinking may be reflected in the precision movements utilized by the more typically dominant right hand. Conversely, greater spatial awareness and an emphasis towards systems or pattern-based observations are thought to reside primarily in the right hemisphere. (I highly recommend Iain McGilchrist’s RSA animation on the divided brain for a great overview.) However, it is important to note that these types of thought and behavior are by no means exclusive to one hemisphere or another, and the different areas of the brain are in constant communication with each other via signals sent through white matter tracts that traverse the brain, like the corpus callosum that connects the two hemispheres.

Contributing to the right-hand/left-brain theory, the left hemisphere is largely responsible for language ability, which has traditionally been used as another indicator of hemispheric dominance. It was initially thought that this control was switched in left-handed people, with the right hemisphere in charge of verbal communication; however, it has since been shown that this linguistic laterality doesn’t match up that neatly.

In the 1960s a simple test was devised to empirically determine a person’s dominant hemisphere for language. This, the Wada test, involves injecting sodium amobarbital into an artery supplying one side of the brain of an awake patient, temporarily shutting down that hemisphere’s functioning. This allows neurologists to see which abilities remain intact, meaning that they must be controlled by the opposite side. This test is especially important in patients undergoing neurosurgery, as ideally you would operate on the non-dominant hemisphere to reduce possible complications in terms of movement, language and memory. The Wada test revealed that many left-handers are actually also left-hemisphere dominant for language, and that in only a small proportion does language reside in the right hemisphere. Still other left-handers share language abilities across the two hemispheres.

So where does this appendage preference come from? Handedness is thought to be at least partially genetic in origin, and several genes have been identified that are associated with being left-handed. However, there is evidence that it is possible to switch a child’s natural preference early in life. This often happened in cultures where left-handers were perceived as ‘evil’ or ‘twisted’, and attempts were made in schools to reform them, forcing them to act as right-handers. As mentioned above, when motor movements (and their underlying synaptic connections) are practiced, they become stronger and more efficient. Thus, individuals who were originally left-handed may come to favor their right hand, particularly for tasks like writing, as they were forced to develop these pathways in school. These same individuals may still act as left-handed for other motor tasks though, simultaneously supporting both the nature and nurture aspects of handedness. Notably, this mixed-handedness is different from ambidexterity, as both hands cannot be used equally for all actions. True ambidexterity is extremely rare and has been largely under-studied to date. However, it has been theorized that in ambidextrous individuals neither hemisphere is dominant, and in some cases this has been linked to mental health or developmental problems in children.

So the next time you meet a psychologist in a bar, instead of challenging them to guess what you’re thinking, ask them the most basic brain-related question you have. It will undoubtedly lead to a much better conversation!