Faith, Hope, Love, Science.


Tag Archives: dementia

Fad diets come and go. One of the most popular fad diets of recent times is Paleo.

The Palaeolithic diet, also called the ‘Stone Age diet’, or simply ‘Paleo’, is as controversial as it is popular. It’s been increasing in popularity over the last few years, and has had some amazing claims made about it by wellness bloggers and celebrity chefs. Advocates like ‘Paleo’ Pete Evans, of MKR fame, claim that the Palaeolithic diet could prevent or cure polycystic ovarian syndrome, autism, mental illness, dementia and obesity [1].

So what does the published medical literature say? Is there really good research evidence to support the vast and extravagant claims of Paleo?

About 10 months ago, I started reviewing the medical research to try and answer that very question. My review of the medical literature turned up some interesting results, and so rather than post it just as a blog, I thought I would submit it to a peer-reviewed medical journal for publication. After a very nervous 9-month gestation of submission, review, and resubmission, my article was published today in Australian Family Physician [2].

So, why Paleo, and what’s the evidence?

Why Paleo?

The rationale for the Palaeolithic diet stems from the Evolutionary Discordance hypothesis – that human evolution ceased 10,000 years ago, and our stone-age genetics are unequipped to cope with our modern diet and lifestyle, leading to “diseases of civilization” [3-9]. Thus, only foods that were available to hunter-gatherer groups are optimal for human health – “could I eat this if I were naked with a sharp stick on the savanna?” [10] Therefore meat, fruits and vegetables are acceptable, but grains and dairy products are not [11].

Such views have drawn criticism from anthropologists, who argue that there is no blanket prescription of an evolutionarily appropriate diet, but rather that human eating habits are primarily learned through behavioural, social and physiological mechanisms [12]. Other commentators have noted that the claims of the Palaeolithic diet are unsupported by scientific and historical evidence [13].

So the Palaeolithic diet is probably nothing like the actual palaeolithic diet. But pragmatically speaking, is a diet sans dairy and refined carbohydrates beneficial, even if it’s not historically accurate?

Published evidence on the Palaeolithic Diet

While the proponents of the Palaeolithic diet claim that it’s evidence based, there are only a limited number of controlled clinical trials comparing the Palaeolithic diet to accepted diets such as the Diabetic diet or the Mediterranean diet.

Looking at the studies as a whole, the Palaeolithic diet was often associated with increased satiety independent of caloric or macronutrient composition. In other words, gram for gram, or calorie for calorie, the Paleo diets tended to make people fuller, and therefore to eat less. Of course, that may have also been because the Paleo diet was considered less palatable and more difficult to adhere to [14]. A number of studies also showed improvements in body weight, waist circumference, blood pressure and blood lipids. Some studies showed improvements in blood sugar control, and some did not.

The main drawback of the clinical studies of Paleo is that they were short, used different designs, and didn’t have enough subjects to give them any real statistical power. The strongest of the studies, by Mellberg et al, showed no long-term differences between the Palaeolithic diet and a control diet after two years [15].

The other thing to note is that, in the studies that measured them, there was no significant difference in inflammatory markers as a result of consuming a Palaeolithic diet. So supporters of Paleo don’t have any grounds to claim that Paleo can treat autoimmune or inflammatory diseases. No clinical study on Paleo has looked at mental illness or complex developmental disorders such as autism.

Other factors also need to be considered when thinking about Paleo. Modelling of the cost of the Palaeolithic diet suggests that it is approximately 10% more expensive than an essential diet of similar nutritional value, which may limit Paleo’s usefulness for those on a low income [16]. Calcium deficiency also remains a significant issue with the Palaeolithic diet, with the study by Osterdahl et al (2008) demonstrating a calcium intake of about 50% of the recommended dietary intake [17]. Uncorrected, this could increase a patient’s risk of osteoporosis [18].

To Paleo or not to Paleo?

The bottom line is the Paleo diet is currently over-hyped and under-researched. There are some positive findings, but these positive findings should be tempered by the lack of power of these studies, which were limited by their small numbers, heterogeneity, and short duration.

If Paleo is to be taken seriously, larger independent trials with consistent methodology and longer duration are required to confirm the initial promise in these early studies. But for now, claims that the Palaeolithic diet could treat or prevent conditions such as autism, dementia and mental illness are not supported by clinical research.

If you’re considering going on the Palaeolithic diet, I would encourage you to talk with an accredited dietitian or your GP first, and make sure that it’s right for you. Or you could just eat more vegetables and drink more water, which is probably just as healthy in the long run, but without the weight of celebrity expectations.

Comparison of the current Australian Dietary Guidelines recommendations [19] to the Palaeolithic diet [17]

For the last few years, I’ve worked as a doctor for a number of my local nursing homes. On my morning rounds, I would literally reintroduce myself to every second patient, because even though I’d seen them every week for the previous few months, they still couldn’t remember who I was.

And it’s not just because I have a less than memorable face. Most of my nursing home residents had dementia.

While there are many different causes of dementia, the one first described by Dr Alois Alzheimer in the early 1900s is the best known and most feared. It is also the most common, and is a significant drain on the nation’s economy, as well as on quality of life in the twilight years.

Recently, an article was published by a group of researchers from Yale University in the US which claimed to show that the attitude a person holds towards aging contributes to their chances of Alzheimer Disease. I first saw it yesterday on the social media feed of Dr Caroline Leaf, communication pathologist and self-titled cognitive neuroscientist. Dr Leaf is known for her scientifically dubious assumption that the mind changes the brain, not the other way around, and has previously publicly stated that dementia is caused by toxic thinking. On the surface, this article seems to vindicate her assumptions.

However, this article also made it onto Facebook’s trending list and was picked up by news sites all over the world (such as this article in the Australian http://goo.gl/RavbMl), so the interest wasn’t just from Dr Leaf, but also from the broader public. And I can understand why. No one wants to ‘grow old and senile’, or to ‘lose our marbles’. Any potential cure or prevention for Alzheimer Dementia is worth paying attention to.

I admit, the headline intrigued me too, both personally and professionally. I wasn’t aware that one’s attitude towards aging would contribute to Alzheimers, since Alzheimers is predominantly genetic, and the other associated risk factors have more to do with physical health (like diabetes, high blood pressure etc). Psychological stress is a risk factor for Alzheimers in mice, but good evidence in humans has been lacking [1].

So, do negative attitudes to aging really cause stress which then leads to Alzheimers, as the reports suggested, or is there a much better explanation?

The scientific article that the news reports were based on is A Culture-Brain Link: Negative Age Stereotypes Predict Alzheimer’s Disease Biomarkers [2]. This study was done in two stages. Volunteers were recruited from a larger study called the Baltimore Longitudinal Study of Aging. At entry point, the participants answered a questionnaire about their attitudes towards aging. This was about 25 years before the participants were actively studied.

The first study examined the change in volume of a part of the brain called the hippocampus (which plays an essential part in our memory system). The second part of the study examined the volunteers’ brains at autopsy for markers of Alzheimer Dementia, namely ‘plaques’ and ‘tangles’. The numbers of plaques and tangles were combined to form a single composite score, which was then compared to the baseline attitude-towards-aging score.

In the first study, the researchers reported that those people who held negative views of aging were more likely to have a smaller hippocampus which more rapidly decreased in size over time.

In the second study, the researchers reported that those people who held negative views of aging were more likely to have more plaques and tangles in their brain.

On the surface, this seems to suggest that holding negative views on aging contributes to the development of Alzheimer Dementia, and certainly this is how the different news agencies seemed to interpret the outcomes of the study. Though on deeper palpation, a number of questions arise about how the researchers did the study and chose to interpret the results.

For example, the aging attitude survey was only done once, which means there was a gap of 25 years or more between the questionnaire and the active studies. That’s a long time, and the attitudes of the volunteers may have improved or worsened over that period, but that doesn’t seem to have been considered.

Levy and her researchers also report that the average size of the hippocampus changed significantly when they averaged the size of the left and the right hippocampus. But when they analyzed the two sides separately, there was no significant change over time. So this makes me wonder about the validity of their analysis too – if the volume of each side separately doesn’t change much at all, then how can the average volume of the two sides change so much?

I’m not much of a statistician, but I wonder if the secret’s in their modeling. They used a linear regression model to compare their data to their hypothesis – a legitimate statistical method, but one which involves adjustment for other variables. If you do enough adjusting, you can reach statistical significance, but according to their own numbers, their Cohen’s d was 0.29, which is considered a weak effect overall.
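For readers unfamiliar with effect sizes, Cohen’s d is simply the difference between two group means divided by their pooled standard deviation. The sketch below is purely illustrative – the means, standard deviations and sample sizes are made-up numbers, not figures from the Levy study – but it shows why a d of 0.29 sits at the ‘small’ end of Cohen’s conventional benchmarks (roughly 0.2 small, 0.5 medium, 0.8 large).

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    # Effect size: difference in means, in units of pooled SD
    return (mean1 - mean2) / pooled_sd

# Hypothetical hippocampal volumes (mm^3) for two attitude groups:
d = cohens_d(mean1=3200, sd1=350, n1=30, mean2=3100, sd2=340, n2=30)
print(round(d, 2))  # → 0.29, i.e. a "small" effect by Cohen's benchmarks
```

The point is that a d of 0.29 means the group difference is less than a third of the natural spread within each group – the distributions overlap heavily.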

Then there’s the question of clinical significance. Even if the hippocampus did shrink in those who thought aging was negative two decades ago, was the shrinkage enough to contribute to the cognitive impairment seen in Alzheimer Dementia? When compared to other studies, probably not. Looking at Levy’s graph, the “negative” attitudes group changed about 150mm3 over the 10 year follow-up period, or about 5%. A recent study showed that the hippocampal size of subjects with mild memory loss is about 12% less than that of healthy age-matched controls [3].
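To put those percentages side by side: assuming a baseline hippocampal volume of around 3,000 mm3 (a round figure chosen for illustration, not one taken from the paper), the change in Levy’s graph works out to roughly 5%, well short of the deficit seen in people with mild memory loss:

```python
# Back-of-envelope comparison of effect magnitudes (volumes in mm^3).
baseline = 3000               # assumed typical hippocampal volume (illustrative)
negative_group_loss = 150     # ~change over the 10-year follow-up, per Levy's graph
shrinkage_pct = negative_group_loss / baseline * 100
mci_deficit_pct = 12          # deficit reported for mild memory loss [3]
print(f"{shrinkage_pct:.0f}% vs {mci_deficit_pct}%")  # → 5% vs 12%
```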

The same problems are seen in study 2 – Levy and her researchers reported an increase in the number of plaques and tangles in the “aging is bad” group. But her numbers are small, and not statistically strong. And again, the question of clinical significance arises. Plaques and tangles represent biomarkers of Alzheimer Dementia, not necessarily a diagnosis. Normal aging brains without dementia also have plaques and tangles, and it’s the number of tangles that seems more significant for developing cognitive impairment [4, 5], not the combined score that they used in this study.

And when all is said and done, all Levy and colleagues have shown is a correlation between attitude to aging and changes in the brain. But correlation does not equal causation. Just because two things are associated does not mean that one causes the other. There may be another variable or factor that causes both observations to co-occur.

In Levy’s case, the common connecting cause could easily be neuroticism, which they discussed as a covariate but did not say if or how they corrected for it. The other thing they did not examine in this study is the ApoE gene subtypes, which contribute significantly to the onset of Alzheimer Dementia [6]. The action of ApoE subtypes in the brain may well contribute to both negative attitudes and Alzheimers-type changes.

The bottom line is that Levy’s study shows a weak correlation between a single historical sample of attitude towards aging and, some two and a half decades later, some changes in the brain that are known to be markers for Alzheimer Dementia.

They’ve certainly NOT shown that stress, or a person’s attitude to aging, in any way causes Alzheimer Dementia. They did not correct for genetics in this study, which is the major contributor to the risk of developing Alzheimers. So the results mean very little as they stand, and further research is required to delineate any cause-and-effect relationship here.

So don’t stress. It’s far from proven that how you view the aging process determines your risk of dementia. There will be those like Dr Leaf who will trot out this cherry-picked little titbit of information in the future to try and justify their pretence that thought can change our brain and impact our mental health, but what the press release says and what the study shows appear to be two different things altogether.

I meet a lot of people in my job. Some are not particularly memorable, and some I truly wish to forget. But every now and then, I meet a person who’s memorable for all the right reasons. Kev was one of those people.

Once upon a time, Kev was a businessman – a corporate manager who started in the postal service in his late teens, gained experience, and moved into the Commonwealth Bank, where he quickly rose through the ranks to become a regional manager. Towards the end of his career, he changed industries to become the CEO of one of the smaller private hospitals in Brisbane in the 1980s.

After he retired, his wife developed dementia, and he cared for her at home for many years, before he became too weak. They both moved into a nursing home, but his wife succumbed a couple of years later.

When I met Kev in early 2013, he was dying. His heart and his lungs were failing, and he couldn’t walk ten metres without gasping or needing oxygen. He was gaunt and frail, and extremely thin. I was worried that if he fell, he might snap.

But his intellect remained untouched by the disease ravaging the rest of his body. He was quick-witted, jovial, and always polite. He was the consummate professional – always showing respect, and earning it. I could see why he was so good as a businessman. He was a pleasure to be around – so much so that I spent extra time with him every week just chatting, when I should have been finishing off my work.

In the week before he died, the last time I saw him, as I sat in his room listening to some more of his stories, he looked me in the eye and said,

“Don’t sweat the small stuff. You don’t have to do everything. Let people flow in the things they can do. There are more important things in life.”

He smiled as he looked at the photos on his wall of his wife and kids.

I smiled and shook his hand. “I’ll see you later, Kev”, I said. I never did see him again.

I still remember him now, skinny and breathless, but with a big smile on his face and a sparkle in his eyes every time I entered his room. And I remember his advice on living a life driven by values.

New Year’s Day is a time to start afresh, a celebration of new beginnings, a focal point to take stock and refocus. But if we’ve learnt anything at all from our previous attempts at New Year’s resolutions, it is that they don’t work. Don’t be misled by the occasional partial successes. I sometimes hit a golf ball straight, but that still doesn’t mean my golf swing is any good. New Year’s resolutions are the same – they are fundamentally flawed, in spite of the accidental successes that we sometimes have.

The truth is that ethereal statements and short-term goals for self-improvement don’t help us. We don’t need New Year’s resolutions; we need New Year’s re-evaluations.

Values are different to goals. A goal is like a destination, whereas a value is like a direction. Our individual values are like the direction of the breeze. It’s easier to sail with the breeze of our values than against it.

We often get goals and values confused. Goal orientation means that we move from place to place, sometimes travelling in the same direction as our values, but sometimes against them. When we live according to our values, the goals seem to set themselves as we live according to what we truly believe in, what truly motivates us.

A few things can act as guides to help us learn what our values are. What are your passions, or what makes you mad? Is it justice, or injustice? Is it relationships? Is it children, or family? The environment? What is it that gets your juices flowing?

Another way of understanding your values is to do the eulogy exercise. It’s a little morbid, perhaps. But simply, the eulogy exercise involves writing your own eulogy. What is it that you want others to remember you for? What do you want your epitaph to say?

The eulogy exercise helps us to plan our lives with the end in mind. When you’re on your death bed, will you regret not finishing that report, or will you regret not living according to your values, your deepest desires? Putting your values into perspective makes it much easier to let go of the things that aren’t truly important. It’s a lesson I’m continually working on too.

May 2014, and the rest of your life, be about the important things. Don’t sweat the small stuff.

“We can chart our future clearly and wisely only when we know the path which has led to the present.” Adlai E. Stevenson

I always thought history was boring, and I must admit, if you want to put me to sleep, start reading early Australian history to me. “Convicts … first fleet … zzzzzz.”

But as Stevenson wrote, the key to the future is the past. With autism, I don’t want to see a future as chequered as its past. In this series of essays, I want to help our community see a future in which autism is recognised and appreciated for its strengths. To properly lay the groundwork, I want to look at the history of autism. This will help provide context for the current understanding of autism, which will then give a framework for understanding the autistic person, and for a glimpse into the future as new research unfolds.

The autistic spectrum has been present for as long as humans have. But to our knowledge, one of the first specific descriptions of someone who met the characteristics of the autistic spectrum was in the mid-1700s. In 1747, Hugh Blair was brought before a local court to defend his mental capacity to contract a marriage. Blair’s younger brother successfully had the marriage annulled to gain Blair’s share of inheritance. The recorded testimony describes Blair as having the classic characteristics of autism, although the court described him at the time as lacking common sense and being afflicted with a “silent madness”.[1]

Isolated case reports appeared sporadically in medical journals. John Haslam reported a case in 1809, although with modern interpretation, the child probably had post-encephalitis brain damage rather than true autism. Henry Maudsley described a case of a 13 year old boy with Aspergers traits in 1879. There were no other reports of children with autism in the early literature, although at the turn of the 19th century, Jean Itard reported on the case of an abandoned child found roaming in the woods like a wild animal. This child, called Victor, displayed many features of autism, although he may have simply had a speech disorder. Either diagnosis was obscured by the effects of severe social isolation.[1]

Others described syndromes which shared autistic features, but without describing autism itself. The names given to each syndrome reveal how autistic features were regarded in the 19th century: Dementia Infantilis, Dementia Praecocissima, Primitive Catatonia of Idiocy.[1]

Around 1910, Eugen Bleuler, a Swiss psychiatrist, was researching schizophrenic adults (and as an aside, Bleuler was the first person to use the term ‘schizophrenia’). Bleuler used the term ‘autismus’ to refer to a particular subgroup of patients with schizophrenia, from the Greek word “autos,” meaning “self”, describing a person removed from social interaction – hence, “an isolated self.”[2]

But it wasn’t until the 1940s that the modern account of autism was articulated, when two psychiatrists in different parts of the world first documented a handful of cases. Leo Kanner documented eleven children who, while having variable presentations, all shared the same pattern of an inability to relate to people, a failure to develop speech or an abnormal use of language, strange responses to objects and events, excellent rote memory, and an obsession with repetition and sameness[3].

Kanner thought that the condition, which he labelled ‘infantile autism’, was a psychosis[1] – in the same family of disorders as schizophrenia, although separate from schizophrenia itself[2]. He also observed a cold, distant or anti-social nature in the parents’ relationship towards the child or the other parent. He thought this may have contributed (although he added that the traits of the condition were seen in very early development, before the parents’ relationship had time to make an impact)[3]. True to the influence of Freud on early 20th century psychiatry, Kanner said of the repetitive or stereotyped movements of autistic children, “These actions and the accompanying ecstatic fervor strongly indicate the presence of masturbatory orgastic gratification.”[3]

Despite the otherwise reserved, cautious discussion of possible causes of this disorder, the link with schizophrenia and “refrigerator mothers” took hold in professional and lay communities alike. In the 1960s and 70s, treatments for autism focused on medications such as LSD, electric shock, and behavioral change techniques involving pain and punishment. During the 1980s and 90s, the role of behavioral therapy and the use of highly controlled learning environments emerged as the primary treatments for many forms of autism and related conditions.[2]

Unbeknown to Kanner, at the same time as his theory of ‘infantile autism’ was published in an English-language journal, an Austrian paediatrician called Hans Asperger published a descriptive paper of four boys in a German-language journal. They all shared similar characteristics to Kanner’s children, but were functioning at a higher level. They shared some aggression, a high-pitched voice, an adult-like choice of words, clumsiness, an irritated response to affection, a vacant gaze, verbal oddities, a prodigious ability with arithmetic, and abrupt mood swings. Asperger was the first to propose that these traits were the extreme variant of male intelligence[4].

But the full impact of Asperger wasn’t felt until 1981, when British psychiatrist Lorna Wing brought Asperger’s original paper to the attention of the English-speaking world. By this time, autism had become a disorder of its own according to the DSM-III, the gold-standard reference of psychiatric diagnosis, but it was still largely defined by the trait of profound deficit. Asperger’s description of a ‘high-functioning’ form of autism resonated amongst the autism community, and a diagnosis of Asperger’s Syndrome became formally recognised in the early 1990s with the publication of the DSM-IV.

The most recent history of autism comes in two parts. The first was the revision of the DSM-IV into the DSM-5. For the first time, rather than being two separate diagnoses, Autism and Aspergers have been linked together as a spectrum and are collectively known as the Autism Spectrum Disorders (although autism self-advocates prefer the term ‘conditions’ to ‘disorders’).

The second part is a highly controversial chapter that will stain the history of autism research and scientific confidence into the next few decades. Chris Mooney, in a piece for Discover Magazine, sums it up nicely:

“The decade long vaccine-autism saga began in 1998, when British gastroenterologist Andrew Wakefield and his colleagues published evidence in The Lancet suggesting they had tracked down a shocking cause of autism. Examining the digestive tracts of 12 children with behavioral disorders, nine of them autistic, the researchers found intestinal inflammation, which they pinned on the MMR (measles, mumps, and rubella) vaccine. Wakefield had a specific theory of how the MMR shot could trigger autism: The upset intestines, he conjectured, let toxins loose in the bloodstream, which then traveled to the brain. The vaccine was, in this view, effectively a poison.”[5]

Inflamed by a post-modern distrust of science and a faded memory of what wild-type infectious diseases did to children, the findings swept through the internet and social media and led to a fall in vaccination rates (from about 95% to below 80% at its lowest)[6].

But the wise words, “Be sure your sins will find you out”, still hold true, even in modern science. In 2010, Wakefield was found guilty of Serious Professional Misconduct by the British General Medical Council, and was struck off the register of medical practitioners in the UK. In the longest ever hearing into such allegations, the GMC considered his conduct surrounding the research project, the medical treatment of his child subjects, and his failure to disclose his various conflicts of interest to be dishonest and professionally and clinically unethical[7]. There is evidence that he also selectively chose his subjects to confound the results, misrepresented the time course of their symptoms related to the vaccinations, misrepresented their diagnosis of autism, and altered the reports of their bowel tests[8, 9].

For the record, this isn’t a comment on the science of Wakefield’s rise and fall, but the history. I am not suggesting that the proposed autism/vaccination link should be discounted solely on the basis of Wakefield’s scientific fraud. Rigorous science has already done that. The science for and against the proposed link between autism and vaccinations deserves special attention, and will be discussed in a future post. Rather, lessons need to be learned from what is one of the most destructive cons in the recent history of medicine.

The losers of this hoax are twofold. Thousands of children have unnecessarily suffered from preventable infectious diseases because of a fear of vaccines that has turned out to be unfounded, and those who actually have autism have missed out on funding, because it was syphoned off into Wakefield’s pockets and into research disproving his rancid theory. As the editorial in the BMJ stated, “But perhaps as important as the scare’s effect on infectious disease is the energy, emotion, and money that have been diverted away from efforts to understand the real causes of autism and how to help children and families who live with it.”[6]

As with all good history, there are lessons for the future. Autism is still largely misunderstood. The vacuum of definitive scientific knowledge is slowly being filled, gradually empowering people with autism, and the people who interact with them, to truly understand and communicate. Each breakthrough and revision of the diagnosis has led to more sophisticated and more humane ways of living with autism. But there is still a need for caution – people will use the gaps in knowledge, and the pervasive distress that can come from the diagnosis, to manipulate and exploit for their own ends.

I’ll continue with the series in the next week or so, looking at the modern “epidemic” of autism.