Twenty years ago this year, former Massachusetts Institute of Technology researcher Dr. Barry Sears published Enter the Zone: A Dietary Road Map, a book that helped change the discussion about diet and health. Against the established nutritional powers—still clutching the "high-carb, low-fat" mantra—Sears championed what he called an "anti-inflammatory" dietary approach to combating obesity and a host of chronic diseases, including diabetes, cardiovascular disease and diseases of the brain. At the time, little was being said about the connection between diet and diseases like Alzheimer's and Parkinson's, but Sears pointed to trends in the modern American diet that were fueling the expansion of these and other conditions.

Considering that cases of diabetes have risen nearly 180% in the last 30 years, and that both the incidence and severity of Alzheimer's have skyrocketed over the same period, Sears's points are well worth revisiting. I corresponded with him recently about his latest book, The Mediterranean Zone, and his evidence-based position on what our diet is doing to our bodies and brains.

DiSalvo: In your past work and your latest book you say that there is a direct link between a series of dire health conditions, two of which are already epidemics—obesity and diabetes—and a third that is an epidemic in the making: Alzheimer's. What’s the line connecting the three?

Sears: The linkage between all three chronic conditions is increased inflammation in the adipose tissue, the pancreas, or the brain. What you see is cellular inflammation moving first from the adipose tissue to the pancreas and then to the brain. This is similar to the metastatic spread of a cancer. Virtually all chronic disease involves increased inflammation.

Most of us are familiar with "inflammation" as it relates to injuries, but here you’re talking about a particular sort of inflammation that you argue plays a major role (albeit unseen) in the health of our bodies and brains.

There are two types of inflammation. The first type is classical inflammation, the kind that hurts. That’s typically why you go to a doctor. The other type is below the perception of pain. This is cellular inflammation. Initially it causes disruption of hormonal signaling in the cells. However, since there is no indication of its presence, it will continue causing increased cellular damage until there is enough accumulated damage that you can call it chronic disease. It could be obesity, diabetes, heart disease, cancer, or Alzheimer’s, but they are all ultimately caused by cellular inflammation.

Is this type of inflammation reversible?

The best approach to increased inflammation is to follow an anti-inflammatory diet. This is one that maintains, at every meal, a balance of low-fat protein, low-glycemic carbohydrates (i.e., fruits and vegetables), and moderate amounts of fat low in both omega-6 and saturated fats (both of which can increase cellular inflammation). This is because the hormonal responses to any meal last only about five hours. Such a diet can be reinforced with anti-inflammatory supplements rich in omega-3 fatty acids, polyphenols (the chemicals that give fruits and vegetables their color), or ideally both. Clinical data suggest that cellular inflammation can be rapidly reduced with this dietary approach. The secret is to follow it for a lifetime.

In your latest book you talk quite a bit about the distinction between omega-6 and omega-3 fatty acids, and the problems associated with an imbalance between the two. What is the main issue here?

Omega-6 fatty acids are the building blocks to make pro-inflammatory hormones, whereas omega-3 fatty acids are the building blocks to make anti-inflammatory hormones. You need a balance of both to maintain a healthy immune response. However, an excess of omega-6 fatty acids or a deficiency of omega-3 fatty acids causes an increase in cellular inflammation. This increase in cellular inflammation is accelerated in the presence of elevated insulin levels, which come from a high-carbohydrate diet.

Give us a sense of the imbalance in terms of how much omega-6 we’re consuming.

The average American now consumes 7-8% of their total calories as omega-6 fatty acids. This is nearly a 400% increase in the past century. A hundred years ago the average omega-6 to omega-3 fatty acid content of the American diet was about 2:1. Today, it is closer to 20:1.
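
A quick back-of-the-envelope check on those figures (my arithmetic, not Sears's): reading "nearly a 400% increase" as roughly a fivefold rise, omega-6 would have climbed from about 1.5% of calories to 7.5%. But the ratio moved from about 2:1 to about 20:1, a tenfold relative shift, so on these numbers omega-3 intake must also have fallen by roughly half over the same century:

$$\frac{20/1}{2/1} = 10 \approx \frac{5 \ \text{(omega-6 rise)}}{0.5 \ \text{(omega-3 decline)}}$$

In other words, the imbalance comes from both ends: much more omega-6 and somewhat less omega-3.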

And you say in your recent book that fast food is a major source of the imbalance.

The typical fast-food meal has an even higher level of omega-6 fatty acids than store-bought foods. That's because omega-6-rich vegetable oils are the cheapest source of calories known and make almost any food taste better (especially fried chicken or French fries). Industrialized beef, chicken, and pork products are also rich in omega-6 fatty acids, since those fats also drive the fattening process.

I've read research suggesting that a super saturation of omega-6 fatty acids in our diets may be linked to psychological-emotional issues, even higher rates of violence. Do you think there's credible evidence for this?

The answer is yes from both epidemiological studies and intervention studies using high-dose omega-3 fatty acids (especially in the treatment of depression, ADHD, and anxiety). I believe the increase in the cellular inflammation in the brain is causing disruption of neurotransmitter signaling patterns, thus preventing the transmission of the appropriate signals to the interior of the nerve cell.

Many people think that eating more fish, like salmon, will boost omega-3 levels, but you’ve said that there are problems with this approach.

First, fish are being hunted to extinction. Second, all fish are contaminated by pollutants we have put into the environment over the past two generations, ranging from mercury (from burning coal) to industrial toxins such as PCBs, dioxins, and flame retardants. All of these are known neurotoxins and carcinogens. Third, most people prefer lean fish, which are low in omega-3 fatty acids; salmon is the only popular fish that is rich in them. Finally, only the Japanese eat enough fish to maintain an appropriate balance of omega-6 to omega-3 fatty acids in their blood.

The solution is to use highly purified omega-3 concentrates derived from anchovies and sardines to reduce cellular inflammation. The first use of fish oil to reduce inflammation was reported in 1989 by Harvard Medical School investigators in The New England Journal of Medicine. Fortunately, today's omega-3 fatty acid concentrates have much higher purity and potency than those used in 1989.

What about farmed fish, particularly the restaurant standbys – farmed salmon and tilapia?

Farmed tilapia has very high levels of omega-6 fatty acids and very low levels of omega-3 fatty acids. That's because tilapia can grow on cheap vegetable oils instead of the more expensive crude omega-3 oils needed for farmed salmon. This was discussed in a 2008 article in the Journal of the American Dietetic Association. In fact, the omega-6 levels in a serving of farmed tilapia were similar to those in a fast-food hamburger.

Farmed salmon have higher levels of omega-3 fatty acids because they require them for growth. However, the crude fish oil used in salmon farming is rich in PCBs, so the PCB levels of farmed salmon are about five times those of wild salmon. And with the rising price of crude fish oil, the omega-3 levels in farm-raised salmon are dropping as cheaper alternatives are used. These include vegetable oil (two-thirds of the oil used in Norwegian farm-raised salmon is now vegetable oil rich in omega-6 fatty acids) as well as algae, insects, barley protein and trimmings from seafood processing plants.

If someone wants to tackle inflammation and turn things around, where do they begin?

Try to remember what your grandmother told your parents: eat small meals, balanced in calories, throughout the day; never eat more low-fat protein at a meal than fits in the palm of your hand; and never leave the table without eating all your vegetables. Her final advice was to take a tablespoon of cod liver oil (rich in omega-3 fatty acids) before you left the house. Who knew she was at the cutting edge of 21st-century anti-inflammation nutrition? If you want more detail on modern anti-inflammatory diets, I recommend going to www.zonediet.com.

Stress Makes You Want It, But Will You Enjoy It?

Life is lived in loops. Here’s one you may know: we experience stress; to relieve the stress we do something pleasurable; when that pleasure exhausts itself, we experience more stress. Sound familiar? Psychologists tell us that when we run in this loop long enough, we may encounter something called “anhedonia” – the inability to experience pleasure from things we’d normally enjoy. Does binging on chocolate do it for you? Binge long enough and it probably won’t.

To this stinging realization we can add another, and—apologies ahead of time—it’s also a bit of a pill. Researchers have shown that not only does stress predispose us to wanting pleasure, it makes our desire for it drastically out of proportion to our enjoyment. The reward never reaches the level of our want.

To demonstrate this, researchers recruited two groups of study participants—all of whom were chocolate lovers—for some fun with water and sweets. (The chocolate-lover part will make sense in a moment.)

Members of the first group were made to experience stress by holding their hands in ice water (a well-tested means of inducing stress in psych research) while they were observed by the researchers. Those in the other group placed their hands in lukewarm water. After a little while, both groups were told to squeeze a handgrip that, they were instructed, would give them a nice stout whiff of chocolate.

As you might predict, the stressed group squeezed considerably harder for their chocolate – three times harder. Having received their dose of reward, the groups were then asked to rate their satisfaction. You might think that the group desiring the chocolate with three times the intensity would rate it proportionally higher, but in the end the groups’ ratings were really no different. Using stress to spike desire did nothing to increase enjoyment.

Quoting Dr. Tobias Brosch, professor of psychology at the University of Geneva and one of the study’s authors: “Stress seems to flip a switch in our functioning: If a stressed person encounters an image or a sound associated with a pleasant object, this may drive them to invest an inordinate amount of effort to obtain it.”

Which explains why stress is a consistent trigger in everything from failing to stay on diets to addiction relapses. Whatever the object of our desire, feeling stress makes us think we need it like we need air to breathe.

The same dynamic has held true in animal studies with our friends the rats. It seems that if you get rats hooked on cocaine, and then get them clean from their addiction, you can induce a rapid relapse back into addiction by stressing them out with icy cold water. The same principle applies to us: stress is the trigger that makes us want “it” more.

One takeaway: keep a close eye on how you are reacting to stress -- and the earlier the better. Your best chance of affecting your "loop" is well before you've reached the point of smoldering hot desire.

Five Minutes with Author Dan Ariely on How to Manage Your Time

If you’ve spent any time in the psychology and self-improvement sections of any bookstore, you know that Dan Ariely quite literally wrote the book on human irrationality. With bestsellers like Predictably Irrational and The Honest Truth About Dishonesty, Ariely is a go-to source for knowledge about why we do what we do, even when doing it just doesn’t make much sense.

Now Ariely, professor of psychology and behavioral economics at Duke University, has turned his attention to another topic that vexes the best of us: time management. He’s part of a team that created a smartphone app called Timeful, which I’d describe as an intuitive self-management tool that goes a few steps beyond typical schedulers and time planning apps. I recently spent some time chatting with Ariely about time management and related topics.

DiSalvo: Time management -- we’re obsessed with it, yet we generally seem really bad at mastering it. Where do you think people often fail in getting it right?

Ariely: I think we are obsessed with time management precisely because we are so bad at it, but the reality is it’s no wonder we are bad at it because it’s a really, really hard thing to do. Not only is it hard to manage multiple things, but you also have dynamic changes throughout the day in which you have some hours where you are more alert and have high cognitive capacity and some hours where you are more tired. And then things change dynamically like deadlines or additional requests come in, so the prioritization problem is incredibly hard. In fact, I think it’s not humanly possible. So what do we do? We do our best. We say “Ok, what’s the next best thing to do?”

And is technology helping or hurting us in this regard?

Technology is actually making things worse. Take a To Do list for example. I personally have over 729 things on my list in Evernote, and one of those To Dos is to go in and organize all of the other things and figure out what I should do and what I should abandon. The problem is that it’s such a big task, it’s probably not going to happen. The world is rich, there are lots of things we can do, and we have an easier and easier time of putting things on our plate, and searching through all of those things and figuring out what is the next best thing to do is incredibly tough. We do a terrible job at it and the consequences are that we are stressed, unhappy and not as productive as we could be.

Which explains why when I talk to people about how they manage their time, the overriding vibe I get is stress. Figuring out how to make best use of time is stressful, and dwelling on that, it seems to me, can become its own time-consuming monster.

If you were a farmer who worked from sunrise to sunset, doing a few basic tasks with no real questions, life would be very simple. But we live in an incredibly wonderful age with lots of things vying for our time, more than we can handle, and on top of that we aren’t limited to sunrise to sunset. All of this richness, while wonderful, also creates very strong constraints.

This is why we started Timeful. Imagine that your life is like a factory with a productivity function – we need to figure out, from all of the jobs that can be done, which are the ones that are most productive to do right now. This depends on deadlines, and reporting requirements, and whether you need maintenance or need to be reinvigorated, etc., and once we understand all of those factors we can do a much better job of managing them. The stress is inevitable because of the richness of our lives, and our challenge is to harness technology to help us figure it out.

I’ve heard you say, in so many words, that our sense of the world “not being on our side”—not acting in our best interests—is sort of correct. Our attention is flooded with distractions on all sides. How do we defend our attention, and our time, against these unrelenting forces?

It’s true that every organization wants our attention. Not only do they control our shopping environment, but they control our phones (just think about every app that’s trying to get your attention), and they compete. Sometimes this competition yields improved results, but sometimes it creates negative outcomes. It’s really all about trying to understand the “attention economy.” And, of course, right now Android and Apple control the rules for the attention economy to a large degree. Sure, we can turn things off ourselves, but we still have a lot of apps trying to get us to do different things all the time.

We don’t always understand how limited our attention is, or how switching attention makes things even harder. For example, when you check your email and then go back to your work, it takes an extra 15 minutes before you can actually focus again. Shifting attention to a different mindset is not quick. This is something we need to understand.

Your app, Timeful, is easy to use, straightforward, and integrates well with other programs. My question (to myself and now to you) is, what’s going to make me want to use it more than I did the other apps that I started and then eventually stopped using?

The question is whether the app gives you sufficient benefits, which basically means taking the burden of scheduling off people’s lives. There are lots of things that don’t fit the structure of a calendar. For example, if you have to do laundry, you might need to do it any day this week, sometime in the evening when you have time, but it doesn’t have to be on any particular day from 7 to 9. So it’s useful for it to be on your list of things to do, and the app can suggest when you should do it. As we learn more, we’ll be able to be more helpful in suggesting and making people’s lives less stressful.

One of the features that we find most useful is called Good Habits. As an example, you can enter that you want to run three times a week and call your mother once a week and maybe do a five-minute meditation. After setting it once, we can recommend when you should do it depending on your schedule.

Since we’re not far into the New Year, give me your thoughts on New Year’s resolutions. Worth it?

I think New Year’s resolutions are great. You know, we tend to think of ourselves in binary terms. Either good or bad, and once we start being bad we figure what the hell, we might as well enjoy it. If you think about dieting, for example, someone will be on a diet and then eat a muffin and say “Oh well, I’m not really a dieter, I might as well enjoy it.” New Year’s gives you a chance to start with a clean slate.

The real issue is how can we create rules that we won’t break too quickly? How can we make the rules specific enough that we’ll follow them? A rule like “going on a diet” isn’t helpful because it’s too general. We need something specific like “no dessert during the week” or something along those lines. And then, how do we make the rules not too restrictive so that when we do break one occasionally it doesn't collapse us into bad behavior? These are all things we need to keep in mind to make the most of our resolutions.

Deconstructing the Power of Overconfidence

Overconfidence. The word itself is irritating. Knowing that it often works is infuriating.

We’d prefer to believe that, like pride, overconfidence will reliably trip and stumble its way to a predictable fall and clear the path for those with level-headed confidence, gilded with humility, to climb the ladder. But that isn’t even how it usually goes.

Psychology research has often asked why that is, and churned up a few possible answers. A 2012 study concluded that even when overconfidence produces subpar results, its charm still wins the day. We might expect someone with more confidence than ability to underperform when pressed. The study tested that expectation and found it more or less accurate—but also found that it really doesn’t matter. Overconfidence may not deliver when objectively tested, but it has a knack for seducing people to such a degree that they ignore the results in favor of keeping the golden child on a pedestal.

If you had to isolate why this happens, it appears to come down to a matter of status—a commodity that overconfidence is expert at creating and nurturing. When managed well, the social status conferred by overconfidence has an aura just shy of magical, capable of keeping our attention diverted from measurable results.

That’s a jarringly paradoxical conclusion when you consider the average person’s gut reaction to "that overconfident jerk." How can we be both repulsed and seduced by the same thing? The question gets stranger in light of another study showing how even rudeness gets a pass if a person’s overconfidence has already alchemized sufficient status.

In one of the study’s experiments, participants watched a video of a man at a sidewalk café who put his feet on another chair, tapped cigarette ashes on the ground and rudely ordered a meal. These participants rated the man as more likely to “get to make decisions” and able to “get people to listen to what he says” than did participants who saw a different video of the same man behaving politely. The same results prevailed across the study’s other experiments—people tended to rate the rule breakers as more in control and more powerful than people who toed the line.

And what’s the essential ingredient in believing oneself to be above the rules? Overconfidence, of course. (This may also help explain why rude sales associates outsell others at luxury stores.)

Those studies circle the question of why we’re prone to falling for the chutzpah of overconfidence, but say little about why the overconfident are so good at pulling it off. The most recent study on the subject has an answer that’s not likely to lessen our irritation, but, irritatingly, it makes some sense, and can be summarized like this: Belief sells, whether it’s true or not. In the case of overconfidence, the belief in one’s ability—however out of proportion to reality—generates its own infectious energy. Self-deception is a potent means of convincing the world to see things your way.

Participants in the study (a group of college students) were asked to rate their own and their peers' abilities at the beginning of a 6-week course. About half were under-confident and a little less than half were overconfident (with the small balance of students accurately judging their abilities). They were then asked to reassess at the end of the course, after everyone had a chance to perform and real results were there for all to see.

The study showed that at the start of the course, overconfident students received higher peer ratings—and at the end of the course, regardless of how well or poorly they did, the overconfident students were still rated higher than others. In effect, the students' actual performance hardly mattered compared to the seductive signals sent by those believing they were the best.

This study added the wrinkle of gauging participants’ level of self-deception about their abilities, and found a strong correlation between the social sway of overconfidence and depth of self-deception. As the researchers summed it up, “Our findings suggest that people may not always reward the more accomplished individual but rather the more self-deceived.”

We may not like that conclusion, but it’s difficult to argue that it isn’t in evidence around us every day. People who don’t believe in themselves—whether that belief is well-grounded or not—aren’t likely to convince others to buy in. That’s partly what the psychological dynamic of self-efficacy is about: If you expect others to think you’re capable, you’d better believe it yourself. That's as true for healthy confidence as it is for its inflated alter-ego.

What the latest study and elements of the others are telling us is that self-deception is an especially potent brand of status fertilizer. When packaged with personality, it makes others want to believe even when the results would counsel otherwise. And though that doesn’t make the topic any less infuriating, it does throw some light on what makes overconfidence effective, despite its reputation.

Be honest. Do you draw conclusions about someone based on his or her profile photo?

Whether it’s on dating websites or Facebook or any other social media venue, the power of a single photo is immense. Looking at a person’s profile photo, we develop first impressions that frame the rest of what we see and read.

Psychology researchers want us to know something about our profile-photo centrism: the photos are lying to us, leading us to draw conclusions that likely have zero basis in reality.

"Our findings suggest that impressions from still photos of individuals could be deeply misleading," says psychological scientist and study author Alexander Todorov of Princeton University, an expert on the personality dynamics underlying first impressions.

Todorov and his research team conducted a series of studies to demonstrate how easily swayed we are by profile photos, and how even slight variations in photos can significantly change our opinions of a person’s personality.

Researchers asked participants in an online survey to view and rate headshots on personality characteristics, including attractiveness, competence, creativity, cunning, extraversion, meanness, trustworthiness, and intelligence. The photos were all taken in similar lighting, but headshots of some of the people were varied to show slightly different facial expressions.

The results showed that participants’ personality ratings of these slightly changed facial expressions varied just as much as their ratings of different people. In other words, virtually any change in photos of the same person altered personality impressions as much as viewing photos of different people.

In another study, the researchers asked participants to rate headshots shown in different contexts. The results in this case showed that participants’ ratings changed solely based on which context the photo appeared in. According to the research team, “(participants) tended to prefer one shot of an individual when they were told the photo was for an online dating profile, but they preferred another shot when they were told the individual was auditioning to play a movie villain, and yet another shot when they were told he was running for political office.”

The studies also examined how long it took for someone to make a personality judgment based on a profile photo, and found that strong preferences for specific images developed even when the photos were shown for a fraction of a second – a result that underscores just how sure we are that a profile photo tells a true story.

The takeaway from these studies is that our impressions formed by looking at profile photos are extremely malleable, no matter how sure we are that the photos are telling us something accurate about someone’s personality. That's worth keeping in mind especially on dating sites, where we're tempted to draw sweeping personality conclusions based on a passing glance at a photo.

Three more cases of Alzheimer’s disease will have been diagnosed by the time you finish reading this article. More than 5 million people have Alzheimer’s in the United States alone (44 million worldwide), and the rate of new diagnosis is about one patient every minute, with no cure on the horizon. Now a new study adds evidence to the argument that fish oil supplementation could be one of the best preventives we have against the disease--at least for people not at genetic risk of developing it.

Researchers from Rhode Island Hospital studied three groups of older adults, ages 55-90, using neuropsychological tests and brain magnetic resonance imaging (MRI) every six months. The groups comprised 229 adults with no signs of the disease; 397 diagnosed with mild cognitive impairment; and 193 with Alzheimer’s. All participants were part of the Alzheimer’s Disease Neuroimaging Initiative (ADNI), which began in 2003 and ended in 2010.

Results showed that adults taking fish oil, who had not yet developed Alzheimer’s, experienced significantly less cognitive decline and brain shrinkage than adults not taking fish oil. Cognitive decline was measured using the Alzheimer's Disease Assessment Scale (ADAS-cog) and the Mini Mental State Exam (MMSE). (Unfortunately, the study did not specify the amount of fish oil taken, nor the percentage of EPA and DHA in the supplements.)

These are promising results, but they come with one notable caveat: the benefits of taking fish oil only held true for people lacking the main genetic risk factor for developing Alzheimer’s, known as APOE ε4. The researchers think that people with APOE ε4 are incapable of metabolizing DHA, the fatty acid in fish oil thought to promote cognitive benefits.

The researchers add, however, that it’s still possible that starting fish oil supplementation during or before middle age could protect against developing Alzheimer’s even for people with the genetic marker. If you think of the gene for Alzheimer’s as a light switch, taking fish oil earlier in life could prevent the switch from being flicked on.

At least that’s the hope, and given the fact that Alzheimer’s—the sixth leading cause of death in the U.S.—still evades a cure, fish oil will continue to be a hot target of cognitive research as a possible shield against developing the disease.

Ancient wisdom traditions have long held that gratitude is a prerequisite for fulfillment. Focusing on what we have, instead of what we think we need, fortifies the mind against rampant desire that ultimately leaves us feeling empty.

The difficulty we face in living out that wisdom comes in the form of challenges to self-control – our perilous dance with instant gratification and temptation. Now new research suggests that gratitude can help us out here as well, by fortifying our patience and, with it, our decision-making chops.

Researchers tested this theory by putting study participants through a test of financial self-control after they were pre-conditioned to feel one of three emotional states: (1) Grateful, (2) Happy, or (3) Neutral. The pre-conditioning was achieved by having the participants write about a life experience that made them feel either grateful, happy, or left them feeling a lot of nothing.

The financial test was very basic: participants could either choose to receive $54 now or $80 in three days. Alternatively, they could negotiate to receive a lesser or greater amount now instead of more later (for example, a participant could choose to take a $60 payout now instead of an $85 payout later).
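
To put that baseline choice in perspective, here is the implied tradeoff (my arithmetic on the study's numbers, not a figure reported by the researchers): taking $54 now rather than $80 in three days means forgoing

$$\frac{80 - 54}{54} \approx 0.48,$$

a roughly 48 percent return for waiting just three days. No ordinary investment comes close, which is why researchers read the impatient choice as steep temporal discounting rather than sound finance.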

The typical reaction to these tests is that most people opt for less money upfront instead of waiting for more. Participants in this study who were pre-conditioned to feel happy or no particular emotion mostly reacted exactly as expected – they wanted the cash now.

But participants feeling grateful showed more restraint, with significantly more in this group opting to wait longer for the larger amount of money. And the more grateful participants reported feeling, the more patient they were.

“Showing that emotion can foster self-control and discovering a way to reduce impatience with a simple gratitude exercise opens up tremendous possibilities for reducing a wide range of societal ills from impulse buying and insufficient saving to obesity and smoking,” said study co-author Assistant Professor Ye Li from the University of California, Riverside School of Business Administration.

It’s not entirely clear why feeling more grateful increases patience, but it may simply be that gratitude is an emotional counterbalance to selfishness.

Juliann Wiese, an executive coach who consults with several top-tier companies, says the study buttresses something she’s found to be repeatedly true in her experience. "In my coaching practice, I've observed that once people shift their focus to what's already worthwhile in their lives--instead of what they think they are missing--their decision-making skills rapidly improve."

Just as people in this study were pre-conditioned to feel grateful, there’s likely benefit in putting ourselves in a grateful mindset to alter our perspective. According to Wiese: "Filtering decisions through a gratitude-centered perspective isn't only important for making better decisions on the job, but across all aspects of our lives. People are simply more patient and less prone to jumping at the first offer or fleeting hint of temptation when they’re grounded in gratitude.”

You’ve seen the advertisements all around the Web: the curve is coming to a TV near you. It seems at first glance a simple innovation, in some ways even a predictable one. Watching commercials for Samsung’s new line of televisions, I find myself wondering why it’s taken this long for curved screens to arrive. And it’s altogether possible that I’m asking that question because to my brain—and quite likely to yours—the curve simply fits.

Behavioral researchers have known this for some time; people consistently show a preference for curves over hard lines and angles. Whether the object is a wristwatch, a sofa, or a well-designed building, curves curry favor. Neuroscience, following the lead of behavioral science, is on the hunt for a neurally-hardwired preference for curvy elegance.

Of course, televisions with curved screens offer technical advantages beyond aesthetics. Curvature reduces reflection, making the viewing experience easier on the eyes, and simultaneously creates the illusion that the viewer is surrounded by the screen. The so-called “sweet spot” at the center of the illusion is a comfortable magnet for our attention. (Most of this has been known since the 1950s with the introduction of the first curved movie theater screen, the Cinerama.)

But aside from those advantages, studies suggest that merely viewing the curves of an object triggers relief in our brains – an easing of the threat response that keeps us on guard so much of the time. While hard lines and corners confer a sense of strength and solidity, they are also subtly imposing. Something about them keeps our danger-alert system revved.

Research shows that we subjectively interpret sharp, hard visual cues as red flags in our environment (with corresponding heightened activity in our brain’s threat tripwire, the amygdala). Even holding a glass with pronounced hard lines and edges has been shown to elevate tension across the dinner table. Curves take the perceptual edge off.

The softness of contour may also play out in our brains not unlike an emotionally satisfying song or poem. A new discipline known as neuroaesthetics—an ambitious vector between neuroscience and the fine arts—is exploring this idea, shedding light on why the love of curves is seducing technology manufacturers. Recent research under this new banner suggests that curved features in furniture, including TVs, trigger activity in our brain’s pleasure center. We derive a buzz from curves much as we do when viewing a beautiful work of art.

The overlap of visual impact from things as commonplace as chairs and tables and TVs, and emotional impact at such a high level (a level we’d normally reserve for art and music) suggests that the ordinary elements in our environments aren’t so ordinary after all. Skilled industrial designers have known this for quite some time, but now science is adding an explanatory dimension that makes the point all the more compelling.

And if it’s true, as the research indicates, that the curve is an emotional elixir for anxiety-prone brains, the latest trend is likely to take hold and transform our interface with all things digital. We may eventually look back on the non-curved days of technology the way we think of black and white television now. The term “flat-screen TV” will go the way of “rotary dial phone”.

Having said that, the true test of the curvy trend’s appeal won’t happen until the price point drops considerably. Curves may calm, but the prices on many curved screen TVs are anything but calming. We’ll have to wait a while for that to play out to know whether the brain's love of the curve translates into an enduring shift in technology.

How Your Blood Sugar Could Be Wrecking Your Relationships

We’ve all known people who should have to wear a flashing red DANGER! sign if they miss lunch, though even without the warning we instinctively know to steer clear if someone is running on empty. A grumbling stomach means a drop in blood sugar, and through excruciating experience most of us realize that means trouble. But could the blood sugar-anger connection lurk behind more relationship conflicts than we realize?

A new study probed that question with a research methodology as painfully funny as it was effective. Researchers rounded up 107 married couples for a 21-day couples’ boot-camp to draw a direct line between blood glucose (aka circulating blood sugar) and aggression.

First they asked the couples to complete a relationship questionnaire that evaluated their level of satisfaction with their marriages, which allowed the research team to control for variables like how rocky the marriage was to begin with. They also measured all of the participants’ blood glucose levels to set a benchmark, and continued to measure the levels throughout the 21-day study.

The researchers predicted that drops in blood sugar would consistently correlate with heightened aggression between the spouses. Aggression was defined in two ways: aggressive impulse and aggressive behavior. The distinction was meant to identify aggression in thought versus action, because aggression rarely happens in a vacuum—there’s usually a thought impulse that precedes it, even if that impulse doesn’t occur immediately before the action but compounds over time.

To test aggressive impulse, the researchers gave participants a voodoo doll and 51 pins, with instructions to place as many pins in the doll every night as needed to show how angry they were with their spouse. A light conflict day might get just a couple pokes, while a “cover the kids' eyes and ears” day might warrant the full 51 to the head.

To test aggressive behavior, the researchers had the spouses wear headphones while they competed against each other in 25-part tasks. After each task, the winner decided how loudly and for how long to blast the loser with a noise through the headphones.

At the end of the 21 days, with riddled voodoo dolls and ringing ears aplenty, the hypothesis was borne out. The lower the level of blood glucose, the more pins the spouses poked into the dolls, and the louder and longer they blasted their partners through the headphones.

The study provides a couple of worthwhile takeaways. First, quoting Brad Bushman, professor of psychology and communication at Ohio State University and lead study author, “Before you have a difficult conversation with your spouse, make sure you're not hungry." Simple to say, harder to do.

Second, and the reason why that’s such good advice, is that our brains are energy hogs. "Even though the brain is only two percent of our body weight, it consumes about 20 percent of our calories. It is a very demanding organ when it comes to energy," added Bushman. When the brain is short on energy, it’s also short on self-control, and the door is opened for aggressive impulses and behavior to take center stage. And if the study results are a true indication, we’re redlining our self-control more often than we realize.

I’d love to see a follow-up study that attempts to track these results against the blood sugar rollercoaster associated with fast food-laden diets. I have a suspicion that glucose-related aggression isn’t solely about how much or little food we eat, but also the sorts of food we eat. Just a hunch, but it stands to reason that shoveling in foods that cause our blood sugar levels to spike and crash day after day may also trigger spousal (and other) explosions. A little food for thought while you're sitting in the drive-thru.

When It Comes To Choosing Mates, Women And Men Often Get Framed

If I tell you that seven of ten doctors believe a medication is helpful, the positive weight of the seven endorsements will trump potential negatives. But if I tell you that three of ten doctors believe that a medication should be avoided, the weight of those three negative critiques will overpower potential positives. The information in either case is the same – the only difference is how it's framed.

Our susceptibility to the framing bias has been demonstrated in study after study (most notably by Nobel Prize-winning psychologist Daniel Kahneman), and now a new study by Concordia University researchers shows how framing influences our selection of love interests.

Hundreds of study participants were given positively and negatively framed descriptions of potential partners. For example:

"Seven out of 10 people who know this person think that this person is kind." [positive frame]

versus

"Three out of 10 people who know this person think that this person is not kind." [negative frame]

The researchers tested the framing effect across six attributes known from previous research to rank high in importance when choosing a mate: four that matter more to one sex than the other, and two that matter equally to both:

Attractive body (usually more important to men)

Attractive face (usually more important to men)

Earning potential (usually more important to women)

Ambition (usually more important to women)

Kindness (equally important to both)

Intelligence (equally important to both)

Participants evaluated both “high-quality” (e.g. seven out of 10 people think this person is kind) and “low-quality” (e.g. three out of 10 people think this person is kind) prospective mates for each of these attributes, in the context of a short-term fling or a long-term relationship.

What the research team found is that more often than not, women were significantly less likely to show interest in men described with the low-quality frame, even though they were being presented with exactly the same information as they were in the high-quality frames.

"When it comes to mate selection, women are more attuned to negatively framed information due to an evolutionary phenomenon called 'parental investment theory,'" says study co-author and Concordia marketing professor Gad Saad, a noted researcher on the evolutionary and biological roots of consumer behavior.

"Choosing someone who might be a poor provider or an unloving father would have serious consequences for a woman and for her offspring. So we hypothesized that women would naturally be more leery of negatively framed information when evaluating a prospective mate.”

In particular, women were most susceptible to the framing bias when evaluating a man’s earning potential and ambition.

Men, on the other hand, fell prey to framing most often when evaluating a woman’s physical attractiveness.

While these results at first seem to reinforce stereotypes about how women and men seek mates, they make sense in light of what we know about evolutionary psychology. And they provide an important takeaway for both sexes: before you draw a final conclusion about a would-be mate, consider whether you’re being overly influenced by how their good or bad attributes have been framed, either by others or by the person him- or herself. Better to check your bias early than suffer its consequences later.

Could Cooperation and Corruption Originate with the Same Hormone?

We humans contend with quite a few wicked flip sides in our personal and interpersonal lives. Gratitude can transform into resentment. Concern can morph into apathy. Love can quickly become hate. New research digs deeper into a similar neurobiological duality that can, and frequently does, run rampant in groups: the Jekyll and Hyde of cooperation and corruption.

Researchers hypothesized that oxytocin—the same hormone that previous studies have linked to collaboration and altruism—can predispose us to acting dishonestly if we think doing so will benefit our group of choice. A “group” in this case means anyone to whom we feel some sense of obligation, be it family, coworkers, peers, political cronies or our Friday night craft beer buddies.

In our day-to-day lives, oxytocin is thought to play a big role in how closely bonded we feel to our group. It isn’t just the “cuddle hormone” (often discussed in studies about love and affection) but also the group-cohesion hormone.

To test the hypothesis, the research team gave one group of healthy male participants a dose of oxytocin via nasal spray and another group a placebo nasal spray (neither the participants nor the researchers knew which participants received which spray). The participants were then asked to toss a coin multiple times and make predictions on whether they’d flip heads or tails, and then self-report on the results. How well they did, they were told, would win or lose money for their fellow group members. How they reported—honestly or dishonestly—was kept anonymous, assuring the participants that how they chose to respond wouldn’t reflect back on them personally.

We might guess that participants would lie more often about the results only if they, individually, could benefit – but instead participants given oxytocin lied significantly more about the coin flip than the placebo group only if doing so gained money for their fellow group members. And they lied for the group even if they thought that the favor wouldn't be reciprocated.

To find out how participants would react if they thought they’d benefit individually, the researchers put another group through the same testing conditions but told participants that the results of their predictions would only win or lose them money, with no group benefit or loss attached. The results showed that oxytocin did not influence participants to lie any more than those in the placebo group.

In other words, oxytocin promoted lying for group but not individual benefit.

The study has a few limitations, the most obvious of which is that it used only male participants. Whether or not oxytocin would influence females toward group dishonesty is impossible to tell from these results.

But, at least for men, it seems that higher levels of oxytocin potently affect decisions to lie for the group’s benefit. This may help explain the “you go, I go, we all go” nature of fraternal groups. And the results highlight the role of group bonding in forging hard-to-crack corruption. Last year's hit movie, The Wolf of Wall Street, a true tale about a group of corrupt stock brokers making an obscene amount of ill-gotten money, and lying to ensure that no one got caught (at least for a while), comes to mind as a vivid illustration.

Your Brain Channel Has Launched!

Hi everyone! Just wanted to let you know that I've launched a new science video channel on YouTube called Your Brain Channel. We'll be featuring brief "News You Can Use" video segments on a range of science-related topics. Please check out the first couple of entries on the green tea-memory connection and testing the sleep debt theory. Plenty more videos to come, so check back often. Thanks!

The Connection Between Playing Video Games and a Thicker Brain

For all the negative news about the alleged downsides of playing video games, it’s always surprising to come across research that shows a potentially huge upside. A new study fills the bill by showing that heavy video game play is associated with greater “cortical thickness” – a neuroscience term meaning greater density in specific brain areas.

Researchers studied the brains of 152 adolescents, both male and female, who averaged about 12.6 hours of video gaming a week. As one might guess, the males, on average, played more than the females, but all of the participants spent a significant amount of time with a gaming console. The research team wanted to know if more time spent gaming correlated with differences in participants’ brains.

The thicker areas turned out to be two in particular: the dorsolateral prefrontal cortex (DLPFC) and the frontal eye fields (FEF). The prefrontal cortex is often referred to as our brain’s command and control center. It’s where higher order thinking takes place, like decision-making and self-control. Previous research has shown that the DLPFC plays a big part in how we process complex decisions, particularly those that involve weighing short-term objectives against long-term implications. It’s also where we make use of our brain’s working memory resources – the information we keep “top of mind” for quick access when making a decision.

The FEF is a brain area central to how we process visual-motor information and make judgments about how to handle external stimuli. It’s also important in decision-making because it allows us to efficiently figure out what sort of reaction best suits what’s happening around us. What we call “hand-eye coordination” falls within this process.

Together, the DLPFC and FEF are crucial players in our brain’s executive decision-making system. Greater “thickness” in these brain areas (in other words, more connections between brain cells) indicates a greater ability to juggle multiple variables, whether those variables have immediate or long-term implications, or both.

While this study doesn’t quite show that playing hours of video games each week causes these brain areas to grow thicker, the correlation is strong – strong enough to consider the possibility that gaming is sort of like weight lifting for the brain.

And that, even more than the video game connection, is what makes this study really interesting. It suggests that the popular terms “brain training” and "brain fitness" are more than marketing ploys to sell specialized software. If playing video games really is a form of exercise that beefs up our brain’s decision-making brawn, then it logically follows that we may be able to improve our brains not just functionally but physically, with practices designed for the purpose. Future research will continue exploring precisely that possibility.

Can Chocolate In A Pill Boost Heart Health?

We've been hearing about the alleged health benefits of eating dark chocolate for the last decade or so, including lower blood pressure and improved cholesterol levels. Those claims are about to be put to an exhaustive test in a study of 18,000 adults in Boston and Seattle. But instead of eating chocolate bars every day, the study participants will take capsules containing concentrated amounts of the bio-active chemicals in cocoa beans, known as cocoa flavanols.

If study results are consistent with previous studies showing health benefits of eating cocoa flavanols, it will be a semi-sweet outcome for chocolate lovers. Generally, the higher the level of cocoa, the less sweet the chocolate -- though even chocolate with 72% cocoa contains in the neighborhood of 240 calories per serving, including 10 grams of sugar and 18 grams of fat.

The study participants will theoretically get all of the good stuff without the extra calories from fat and sugar. Each participant will take two flavorless capsules a day containing 750 milligrams of cocoa flavanols (or dummy pills for those in the control group) for four years. Over that time, participants' heart health will be monitored to determine if the mega dose of cocoa does what previous, smaller studies indicate. To ingest the same amount of cocoa flavanols as the study participants would require eating almost five bars of dark chocolate a day.
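
Working backward from the article's own figures (my arithmetic, not a number reported by the researchers): if the daily capsules deliver 750 milligrams of flavanols, and that equals almost five bars of dark chocolate, then each bar supplies roughly

$$\frac{750\ \text{mg}}{5\ \text{bars}} \approx 150\ \text{mg of flavanols per bar}.$$

Set that against the roughly 240 calories per serving quoted above and the appeal of a flavorless capsule is clear.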

The latest research is being funded by Mars Inc., maker of M&Ms and other candies, and by the National Heart, Lung and Blood Institute. Mars's co-sponsorship of the study will raise red flags with critics, but it’s worth noting that the company has funded cocoa flavanol research since the 1990s, and much of what we know about the possible benefits of cocoa has emerged from Mars-supported studies.

How A Tiny Bit Of Procrastination Can Help You Make Better Decisions

Decision-making is what you might call “practical science.” Findings on how we make decisions have direct applicability to life outside the psychology lab, and in recent years quite a lot has been said about this most commonplace, yet complicated, feat of mind. A new study adds to the discussion by suggesting that a wee bit of procrastination can make us better decision-makers.

Researchers from the Columbia University Medical Center wanted to know if they could improve decision accuracy by inserting just a smidgen more time between peoples’ observation of a problem and their decision on how to respond.

The research team conducted two experiments to test this hypothesis. First, they asked study participants to make a judgment about the direction of a cluster of rapidly moving dark dots on a computer monitor. As the dots (called the “target dots” in the study) traveled across the screen, participants had to determine if the overall movement was right or left. At the same time, another set of brighter colored dots (the “distractor dots”) emerged on the screen to obscure the movement of the first set. Participants were asked to make their decisions as quickly as possible.

When the first and second set of dots moved in generally the same direction, participants completed the task with near-perfect accuracy. When the second set of dots moved in a different direction than the first, the error rate significantly increased. Simple enough.

The second experiment was identical to the first, except this time participants were told to make their decisions when they heard a clicking sound. The researchers varied the clicks to be heard between 17 and 500 milliseconds after the participants began watching the dots – a timespan chosen to mimic real-life situations, such as driving, where events unfold on timescales almost too brief to perceive.

The research team found that when participants’ decisions were delayed by about 120 milliseconds, their accuracy significantly improved.

"Manipulating how long the subject viewed the stimulus before responding allowed us to determine how quickly the brain is able to block out the distractors and focus on the target dots," said Jack Grinband, PhD, one of the study authors. "In this situation, it takes about 120 milliseconds to shift attention from one stimulus, the bright distractors, to the darker targets."

The researchers were careful to distinguish “delaying” from “prolonging” the decision process. There seems to be a sweet spot that allows the brain just enough time to filter out distractions and focus on the target. If there’s too little time, the brain tries to make a decision while it’s still processing through the distractions. If there’s too much time, the process can be derailed by more distractions.

If you’re wondering how anyone can actually do this with so little time to make a decision, the answer—suggested by this study—is practice. Just as the participants were cued by the clicks to make a decision, it would seem that we have to train ourselves to delay just long enough to filter distractions.

Said another way, doing nothing—for just a tiny amount of time—gives the brain an opportunity to process and execute. (In my book, Brain Changer, I refer to this as the "awareness wedge" because it's a consciously inserted "wedge" between the immediacy of whatever situation we're facing and our next action.)

This research also underscores just how dangerous it can be to add distractions to the mix—like using a phone while driving—when our brains already need a cushion of time to filter the distractions we routinely encounter.

The Good and Bad News About Your Sleep Debt
David DiSalvo | Sat, 15 Mar 2014 18:00:59 +0000
http://www.daviddisalvo.org/the-daily-brain/2014/3/15/the-good-and-bad-news-about-your-sleep-debt.html

Sleep, science tells us, is a lot like a bank account with a minimum balance penalty. You can short the account a few days a month as long as you replenish it with fresh funds before the penalty kicks in. This understanding, known colloquially as “paying off your sleep debt,” has held sway over sleep research for the last few decades, and has served as a comfortable context for popular media to discuss sleep with weary-eyed readers and listeners.

The question is – just how scientifically valid is the sleep debt theory?

Recent research targeted this question by testing the theory against a few things that sleep, or the lack of it, is known to influence: attention, stress, daytime sleepiness, and low-grade inflammation. The first three are widely known for their linkage to sleep, while the last—inflammation—isn’t, but should be. Low-grade tissue inflammation has been increasingly linked to a range of health problems, with heart disease high on the list.

Study participants were first evaluated in a sleep lab for four nights of eight-hour sleep to establish a baseline. This provided the researchers with a measurement of normal attention, stress, sleepiness and inflammation levels to measure against.

The participants then endured six nights of six-hour sleep (a decent average for someone working a demanding job and managing an active family and social life). They were then allowed three nights of 10-hour catch-up sleep. In bank-account terms, the six short nights ran up a 12-hour debt against the eight-hour baseline, while the three long nights deposited only six hours back. Throughout the study, participants’ health and ability to perform a series of tasks were evaluated.

Sleep debt theory predicts that the negative effects from the first six nights of minimal sleep would be largely reversed by the last three nights of catch-up sleep – but that’s not exactly what happened.

The analysis showed that the six nights of sleep deprivation had a negative effect on attention, daytime sleepiness, and inflammation as measured by blood levels of interleukin-6 (IL-6), a biomarker for tissue inflammation throughout the body — all as predicted. It did not, however, have an effect on levels of the stress hormone cortisol—the biomarker used to measure stress in the study—which remained essentially the same as baseline levels.

After three nights of catch-up sleep, daytime sleepiness returned to baseline levels – score one for sleep debt theory. Levels of IL-6 also returned to baseline after catch-up – another score in the theory’s corner. Cortisol levels remained unchanged, but that’s not necessarily a plus for the theory (more on that in a moment).

Attention levels, which dropped significantly during the sleep-deprivation period, didn't return to baseline after the catch-up period. That’s an especially big strike against the theory, since attention, perhaps more than any other measurement, directly affects performance. Combined with the many other draws on attention—like using a smartphone while trying to drive—minimal sleep isn’t just a hindrance, it’s dangerous, and this study tells us that sleeping heavy on the weekends won’t renew it.

Coming back to the stress hormone cortisol, the researchers point out that its relatively unchanged level probably indicates that the participants were already sleep-deprived before they started the study. Previous research has shown a strong connection between cortisol and sleep; the less sleep we get, the higher the level of the stress hormone circulating in our bodies, and that carries its own set of health dangers. This study doesn’t contradict that evidence, but also doesn’t tell us one way or the other if catch-up sleep decreases cortisol levels.

The takeaway from the study is that catch-up sleep helps us pay off some, but by no means all, of our sleep debt. And given the results on impaired attention, another takeaway is that it’s best to keep your sleep-deprived nights to a minimum. Just because you slept in Saturday and Sunday doesn’t mean you’ll be sharp Monday morning.

Balancing the Self-Control Seesaw
David DiSalvo | Sat, 01 Mar 2014 02:42:31 +0000
http://www.daviddisalvo.org/the-daily-brain/2014/2/28/balancing-the-self-control-seesaw.html

Imagine a seesaw in your brain. On one side is your desire system, the network of brain areas related to seeking pleasure and reward. On the other side is your self-control system, the network of brain areas that throw up red flags before you engage in risky behavior. The tough questions facing scientific explorers of behavior are what makes the seesaw tip too heavily to either side, and why balance is so difficult to achieve.

A new study from University of Texas-Austin, Yale and UCLA researchers suggests that for many of us, the issue is not that we’re too heavy on desire, but rather that we’re too light on self-control.

Researchers asked study participants hooked up to a functional magnetic resonance imaging (fMRI) scanner to play a video game designed to simulate risk-taking. The game, called the Balloon Analogue Risk Task (BART), has been shown in past research to correlate well with self-reported risk-taking such as drug and alcohol use, smoking, gambling, driving without a seatbelt, stealing and engaging in unprotected sex.

The research team used specialized software to look for patterns of activity across the brain that preceded someone making a risky or safe decision while playing the game.

The software was then used to predict what other subjects would choose during the game based solely on their brain activity. The results: the software accurately predicted people's choices 71 percent of the time.

What this means is that there’s a predictable pattern of brain activity associated with choosing to take or not take risks.

"These patterns are reliable enough that not only can we predict what will happen in an additional test on the same person, but on people we haven't seen before," said Russ Poldrack, director of UT Austin's Imaging Research Center and professor of psychology and neuroscience.

The especially intriguing part of this study is that the researchers were able to “train” the software to identify specific brain regions associated with risk-taking. The regions identified fell within what are commonly known as the brain’s “executive control” regions, which support functions like mental focus, working memory and attention. The patterns the software identified suggest a decrease in activity across these regions when someone opts for risk, or is simply thinking about doing something risky.

"We all have these desires, but whether we act on them is a function of control," says Sarah Helfinstein, a postdoctoral researcher at UT Austin and lead author of the study.

Coming back to the seesaw analogy, this research suggests that even if our desire system is level, our self-control system appears to ease up in the face of risk; less weight on that side of the seesaw naturally tips the balance toward desire.

And that’s under normal conditions. Add variables like peer pressure, sleep deprivation and drug and alcohol use to the equation--all of which further handicap self-control--and the imbalance can only become more pronounced.

That’s what the next phase of this research will focus on, says Helfinstein. "If we can figure out the factors in the world that influence the brain, we can draw conclusions about what actions are best at helping people resist risks.”

Ideally, we'd be able to balance the seesaw -- enabling consistently healthy discretion as to which risks are worth taking. While it's evident that too much exposure to risk is dangerous, it's equally true that too little exposure to risk leads to stagnation.

We are, after all, an adaptive species. If we're never challenged to adapt to new risks, we stop learning and developing, and eventually sink into boredom, which, ironically, sets us up to take even more radical risks.

How to Squeeze Snake Oil From Deer Antlers and Make Millions
David DiSalvo | Sat, 22 Feb 2014 04:05:48 +0000
http://www.daviddisalvo.org/the-daily-brain/2014/2/21/how-to-squeeze-snake-oil-from-deer-antlers-and-make-millions.html

If I offered to sell you a liquid extract made from the velvety coating of deer antlers, claiming that it will catalyze muscle growth, slow aging, improve athletic performance and supercharge your libido – I’d expect you'd be a little skeptical. But what if I added that a huge percentage of professional athletes are using the stuff, paying top dollar ($100 or more an ounce), and swearing up and down that just a few mouth sprays a day delivers all the benefits as advertised? Would you be willing to give it a try?

Ever since former Baltimore Ravens star Ray Lewis was reported a few months ago to have used deer antler spray (a claim he subsequently denied), the market for the stuff has exploded. Some estimates say that close to half of all professional football and baseball players are using it, along with a hefty percentage of college players, to say nothing of the army of weightlifters and bodybuilders who have made the spray a daily part of their routines.

TV journalism bastion 60 Minutes recently ran a special sports segment about "Deer Antler Man" Mitch Ross, the product's highest-profile salesman, the tsunami of buyers for oral deer antler spray, and its growing list of celebrity devotees. Without question, deer antler spray has captured the attention of the sports world and is rapidly pushing into mainstream markets.

Let’s take a look at the science behind the claims and try to find out what’s really fueling the surge in sales for this peculiar product.

The velvety coating of deer antlers is a chemically interesting material. For centuries it’s been used in Eastern healing traditions as a remedy for a range of maladies, and there’s an underlying rationale for why it theoretically could be useful for certain conditions. The velvet coating contains small amounts of insulin-like growth factor 1, or IGF-1, which has been studied for several decades as a clinically proven means to reverse growth disorders in humans. For example, in children born with Laron syndrome—a disorder that causes insensitivity to growth hormone, resulting in dwarfism—treatment with IGF-1 has been shown to dramatically increase growth rates. IGF-1 mediates many of the growth-promoting effects of the growth hormone released by the pituitary gland, and in sufficient amounts even synthetically derived IGF-1 can help boost physical growth.

That’s the reason certain forms of IGF-1 have been banned by the Food and Drug Administration (FDA) and the World Anti-Doping Agency: its effects are similar to those of human growth hormone and anabolic steroids. The forms these agencies have banned, however, are high-dosage, ultra-purified liquids administered by injection.

Why can’t the FDA and anti-doping agencies ban IGF-1 outright? For the simple reason that the chemical, in trace amounts, is found in things we eat every day: red meat, eggs and dairy products. Every time you eat a juicy ribeye or have a few eggs over easy, you’re ingesting IGF-1.

In the tiny amounts found in these foods, IGF-1 may have a cumulative, positive effect on muscle repair over time, but you’ll never be able to drink enough whole milk in a sitting to experience the anabolic effects you’d get from a syringe full of concentrated, purified IGF-1.

As I mentioned, the velvety substance on growing deer antlers also contains trace amounts of IGF-1, and (along with oddities like powdered tiger bone) has been sold in China for centuries as a traditional cure for several ailments. In traditional Chinese medicine, the antler is divided into segments, each segment targeted to different problems. The middle segment, for example, is sold as a cure for adult arthritis, while the upper section is sold as a solution for growth-related problems in children. The antler tip is considered the most valuable part and sells for top dollar.

The main source for the market explosion in deer antler spray is New Zealand, which produces 450 tons of deer velvet annually, compared to the relatively small amount produced by the US and Canada: about 20 tons annually. Deer can be killed outright for their antlers, but in New Zealand the more accepted procedure is to anesthetize the deer and remove the antlers at the base. The antlers are then shipped overseas to the growing market demanding them.

The reason deer antler velvet is usually turned into an oral liquid spray instead of a pill (although it is also sold in pill form around the world) is that the trace proteins in the substance are rapidly broken down by the digestive system, so only a fraction of the already tiny amount actually makes it into the bloodstream. In spray form, IGF-1 can potentially penetrate the mucosal membranes and enter the bloodstream intact more quickly. The spray runs anywhere from about $20 for a tiny bottle to $200 for two ounces. Standard doses are several sprays per day, so the monthly cost of using the product is exorbitant.

The question is: does using deer antler spray deliver the benefits its sellers claim? The alleged benefits include accelerated muscle growth and repair, tendon repair, enhanced stamina, slowed aging, and increased libido – a virtual biological panacea.

The consensus opinion of leading endocrinologists who have studied the substance, including Dr. Roberto Salvatori at the Johns Hopkins School of Medicine and Dr. Alan Rogol at the University of Virginia, is that the chances of it delivering on any of these benefits are slim to none. The reason is simply that there's far too little of the substance in even the purest forms of the spray to make any difference.

Think of it this way: if a steak contains roughly the same trace amount of IGF-1 as deer antler velvet, is there any evidence to suggest that eating steak can provide the same array of benefits claimed for deer antler spray? No, there’s not a shred of clinical evidence to support that claim.

And yet, thousands of people are paying close to $200 a bottle for the spray believing that it will deliver these benefits. With such high-profile celebrity connections as Ray Lewis and golf superstar Vijay Singh, there’s little wonder why the craze has picked up momentum. But in light of the scientific evidence, there’s no credible reason to pay $200 or any amount for a bottle of deer antler spray.

Aside from the lack of evidence for benefits, it’s unclear what the negative effects of using the product long-term may be. WebMD reports that compounds in the spray may mimic estrogen in the body, which could contribute to a variety of cancers or worsen conditions such as uterine fibroids in women. Elevated estrogen levels in men can throw off hormonal balance, leading to a thickening waistline and a host of related metabolic problems.

The takeaway is this: deer antler spray is the latest high-priced snake oil captivating the market. Not only will it cost you a lot of money and fail to deliver the promised benefits, it could also lead to negative health outcomes. Let the deer keep their antler velvet, and keep your cash in your wallet.

The Era Of Genetically-Altered Humans Could Begin This Year
David DiSalvo | Sun, 09 Feb 2014 23:20:41 +0000
http://www.daviddisalvo.org/the-daily-brain/2014/2/9/the-era-of-genetically-altered-humans-could-begin-this-year.html

By the middle of 2014, the prospect of altering DNA to produce a genetically modified human could move from science fiction to science reality. At some point between now and July, the UK parliament is likely to vote on whether a new form of in vitro fertilization (IVF)—involving DNA from three parents—becomes legally available to couples. If the measure passes, it would be the first law to allow pre-birth modification of human DNA, and another door to the future will open.

The procedure involves replacing mitochondrial DNA (mtDNA) to avoid destructive cell mutations. Mitochondria are the power plants of human cells, converting energy from food into a form our cells can use, and they carry their own DNA, separate from the nuclear DNA in our chromosomes where most of our genetic information is stored. Only the mother passes mtDNA to the child, and it occasionally contains mutations that can lead to serious problems.

According to the journal Nature, an estimated 1 in 5,000-10,000 people carry mtDNA with mutations leading to blindness, diabetes, dementia, epilepsy and several other impairments (the equivalent of 1,000-4,000 children born each year in the U.S.). Some of the mutations lead to fatal diseases, like Leigh syndrome, a rare neurological disorder that emerges in infancy and progressively destroys the ability to think and move.

By combining normal mitochondrial DNA from a donor with the nucleus from a prospective mother’s egg, the newborn is theoretically free from mutations that would eventually lead to one or more of these disorders. While never tried in humans (human cell research on mtDNA has so far been confined to the lab), researchers have successfully tested the procedure in rhesus monkeys.

Last March, the UK’s Human Fertilisation and Embryology Authority wrapped up a lengthy study of safety and ethical considerations and advised parliament to approve the procedure in humans. According to New Scientist magazine, parliament is likely to vote on the procedure by July of this year. If the procedure overcomes that hurdle, it will still take several months to pass into law, but the initial vote will allow researchers to begin recruiting couples for the first human mtDNA replacement trials.

The U.S. is not nearly as close to approving mtDNA replacement as the UK appears to be; the U.S. Food and Drug Administration will begin reviewing the data in earnest in February. Among the concerns on the table is whether the mtDNA donor could be considered a true “co-parent” of the child, and if so, whether she can claim parental rights.

Even though the donor would be contributing just 0.1 percent of the child’s total DNA (according to the New Scientist report), we don’t as yet have a DNA benchmark to judge the issue. Who is to say what percentage of a person’s DNA must come from another human to constitute biological parenthood?

Other scientists have raised concerns about the compatibility of donor mtDNA with the host nucleus and believe the push to legalize human trials is premature. By artificially separating mtDNA from the nucleus, these researchers argue, we may be short-circuiting levels of genetic communication that we're only beginning to fully understand.

These are but two of the many issues this procedure will raise in the coming months. One thing is certain: we’re rapidly moving into new and deeper waters, and chances are we're going to need a bigger boat.

Why Is Heroin Abuse Rising While Other Drug Abuse Is Falling?
David DiSalvo | Tue, 04 Feb 2014 21:36:26 +0000
http://www.daviddisalvo.org/the-daily-brain/2014/2/4/why-is-heroin-abuse-rising-while-other-drug-abuse-is-falling.html

Peter Shumlin, Democratic governor of Vermont, moved heroin addiction to the front burner of national news by devoting his entire State of the State address to his state’s dramatic increase in heroin abuse. Shumlin described the situation as an “epidemic,” with heroin abuse increasing 770 percent in Vermont since 2000.

At the same time, non-medical abuse of prescription opiates has slowly decreased. According to the SAMHSA 2012 National Survey on Drug Use and Health, the number of new non-medical users of pain killers in 2012 was 1.9 million; in 2002 it was 2.2 million. [It bears repeating that these stats are for non-medical abuse of prescription pain killers, not abuse of drugs obtained with a prescription.]

In the same time-frame, abuse of methamphetamine also decreased. The number of new users of meth among persons aged 12 or older was 133,000 in 2012, compared to about 160,000 in 2002.

Cocaine abuse also fell, to about 640,000 new users in 2012 from over 1 million in 2002. Crack abuse fell from over 200,000 new users in 2002 to about 84,000 in 2012 (a number that’s held steady for the last three years).

The statistics suggest that heroin has taken up the slack from fall-offs among other major drugs. (Only marijuana and hallucinogens like ecstasy have held steady or slightly increased among new users over the last decade; not surprising, since they’re the drugs of choice among the youngest users, and since pot has been angling toward legalization for the last few years.)

Most surprising in this sea of stats is that the drop in non-medical prescription opiate abuse overlaps with an increase in heroin abuse. The reason may come down to basic economics: illegally obtained prescription pain killers have become more expensive and harder to get, while heroin has become cheaper and easier to obtain. An 80 mg OxyContin pill runs between $60 and $100 on the street. Heroin costs about $9 a dose. Even among heavy heroin abusers, a day’s worth of the drug is cheaper than a couple hits of Oxy: at $9 a dose, even ten doses a day comes to $90, less than two pills at the lowest street price.

Laws cracking down on non-medical prescription pain killers have also played a role. The amount of drugs like Oxy hitting the streets has decreased, but the steady flow of heroin hasn’t hiccupped. Many cities are reporting that previous non-medical abusers of prescription pain killers—who are often high-income professionals—have turned to heroin as a cheaper, easier-to-buy alternative.

One conclusion that can be drawn from the stats is that prescription opiates are serving as a gateway drug for heroin, not so much by choice but by default. The market moves to fill unmet demand, and heroin is effectively filling the gap opened by legal pressure on, and the rising cost of, prescription opiates.

Another interesting stat is that among first-time drug users, the mean age of initiation for non-medical prescription pain killers and heroin is virtually identical: 22 to 23 years old. That would also support an argument that there’s a cross-over effect from drugs like Oxy to heroin (in contrast, the mean ages for first-time users of pot and ecstasy are 18 and 20, respectively).

Vermont’s heroin problem would seem to foretell things to come in the more affluent parts of the country. According to the U.S. Census Bureau, Vermont’s median household income, home ownership rate, and percentage of people with graduate and professional degrees are all higher than the national averages, and Vermont’s percentage of people living at or below the poverty level is significantly lower than the national average.

The bottom line: Vermont’s stratospheric heroin increase is happening where the money is, and the national drug abuse trends suggest that the same thing is happening across the country.