Thursday, September 29, 2011

That song really puts a spring in my step. Rather like a strong cup of coffee. Full disclosure - more of a tea person, but I've been known to drink a cup of joe every now and again. Maybe I ought to drink a bit more… the evidence is mixed, frankly. And certainly I can't tell you how many times I've had patients complain of insomnia, only to find out they are drinking 6 large iced coffees a day, or 12 Mountain Dews (no matter how much I exercise, I don't seem to be able to take this weight off, doctor…).

Some facts from the article - 80% of the caffeine in the world is consumed as coffee. Interesting. Prospective studies of men and caffeine use showed a strong inverse association between coffee drinking and depression, with no association for tea or cola. Three cohort studies showed an inverse relationship between coffee consumption and suicide (though in a Finnish study, there was a J-shaped curve, with both very high (>7 cups of coffee daily) and low consumption of coffee seemingly less protective than moderate amounts).

So, in the Nurses' Health Study (following 121,700 American female nurses starting in 1976), women filled out questionnaires every two years. 97,000 filled out questionnaires in 1996, 1998, or 2000, and those with no history of depression at that time (50,739 women; those with unknown history were excluded) were followed over the next decade.

Regular coffee drinkers in this cohort were more likely to be smokers, drinkers, and not go to church! They also tended to have lower rates of diabetes and obesity. Average consumption for the whole group was about one and a half cups of coffee a day.

Among the 51,000 women, about 2600 developed clinical depression in the 10 year period. There was a dose dependent, inverse relationship between the amount of coffee consumed and the risk of developing depression over the years. When covariates (such as age, health, smoking, divorces, etc. etc.) were all adjusted for, the inverse relationship became even stronger! No associations were found between tea consumption, chocolate consumption, decaf coffee, or soda consumption and depression.

Well, maybe. This is no randomized controlled trial, so causation cannot be determined, but caffeine (1,3,7-trimethylxanthine) antagonizes the adenosine A2A receptor. This is thought to have pro-dopamine effects. By taking out adenosine, we might also be affecting the transmission of norepinephrine and serotonin, both known targets of antidepressant medicines.

Since coffee is known to cause insomnia and anxiety, both features of depression, a weakness of the study is that women prone to insomnia and anxiety might limit their intake of coffee, thus biasing the results so that women who can tolerate a truckload of coffee also happen to be the ones less prone to depression.

But… all told, it seems that this study is another notch in coffee's bedpost. Though less than 8 cups a day seems prudent. And I really can't recommend Mountain Dew :)

Saturday, September 24, 2011

My buddy Jamie Scott is a research machine. It's all I can do to keep up with the interesting papers and links he emails my direction. Today's article is yet another one we owe to his sharp eye. He also has brand new digs at a wordpress blog (*brief moment of jealousy*) - so edit/add him to your blogroll and check it out:

Some music - I rather adore the Yeah Yeah Yeahs. Here's an oldie but a goodie: Gold Lion (right click to open in new tab). Favorite comment on youtube: "i think I just got whiplash rocking out to this song" [sic].

It's kinda cool. Involves humans, which is always a plus. It is one of those "view angry faces whilst in a functional MRI machine" which has some limitations, but it is pretty much the only way to see what's going on in real time in the old noggin, seeing as how it's rather awkward to test gene expression and neurotransmitter levels other ways without decapitation (not likely to pass the institutional review board any time soon, unless you were unfortunate enough to be born as a research rodent). (Random aside - Andrew tweeted this REAL MIND READING finding yesterday. Wow.)

How many segues is that? Welcome to my left-handed, small child-raising brain. As we know, depletions in serotonin, especially in a particular communication circuit between the frontal lobes (the policeman) and the amygdala (the emotional/rage center of the brain), lead to anger and aggressive behaviors. Now, there are some people who are just aggressive altogether - I'm thinking Drew Barrymore's boyfriend in one of the Charlie's Angels movies. We're not talking about that. We're talking about impulsive aggression. All of a sudden, you just want to jump out of your car and strangle the other driver who cut you off (please don't do this). Impulsive aggression can be unexpected and very scary, and can certainly ruin lives.

So what if it happens just because you forgot to eat your banana this morning??? Oh, don't worry, we are likely more resilient than all that… but in an experimental setting, one can pretty much abolish serotonin via a weird laboratory tryptophan-depleting drink. Then you get into an MRI machine. Then you look at pictures of angry faces (if I were running this experiment, I would pipe in some hard core metal, and not one of Chopin's Nocturnes). Of course, I read A Clockwork Orange in high school. The tryptophan-depleting drink significantly reduced both plasma tryptophan levels (remember, tryptophan is the precursor to serotonin) and the ratio of tryptophan to other large neutral amino acids (remember, tryptophan competes with these other amino acids for entrance into the brain).

In the end, the reactions of the tryptophan-depleted individuals to the angry faces, compared with controls, were statistically significant. Tryptophan-depleted folks had a higher response to the angry faces within the amygdala (the rage/anger part of the brain) compared to controls, and compared to the response to neutral faces. These findings would suggest that, as suspected, serotonin helps you chill out and assess the situation when faced with an angry horde.

Between the mind reading and the availability of a rapid acting tryptophan-depleting anger drink that will affect our aggressive reactions, I'm a little worried about the future of our free will. But I'll try to eat some protein, micronutrients, a banana, and put my trust in the incompetence of bureaucracy in order to be less paranoid.

Hmm. The text is very large and doesn't seem amenable to editing. One more reason to move over to wordpress…

So y'all have heard of the GAPS diet, right? Natasha Campbell-McBride is a doctor who had a kid with autism. I haven't read her book yet, but the general theory is that folks with certain issues with gut microbiota and carbohydrate malabsorption will end up with psychological/psychiatric symptoms, including autism. Dr. Campbell-McBride had great success with this approach, as, apparently, do many others.

In this PLoS ONE paper, patients with both autism and GI disturbances were examined for carbohydrate malabsorption.

Kids with autism often have gastrointestinal problems (survey studies report comorbidities of 9-91%, which isn't all that useful a percentage spread, but certainly given clinical experience and thinking about autistic kids I know in the community, higher seems more likely than lower). Pathologic findings of gut issues in autistic kids include gastritis, esophagitis, inflammatory markers at the gut lining, gut lymphatic system hyperplasia, increased intestinal permeability, abnormal gut microbiota findings, increased enzyme secretion, and carbohydrate malabsorption. Indeed, autistic children with severe gastrointestinal symptoms are more likely to have severe autistic symptoms (1).

So what happens if you don't have efficient digestion of disaccharides, for example, for whatever reason (damage to the gut, unlucky genes, other illness)? Well, any carbohydrate that goes undigested will float down and feed the hungry masses of gut bacteria. This feeding can result in bloating, discomfort, diarrhea, and proliferation of pathogenic bacteria, which can presumably affect both inflammation and behavior.

The researchers from the latest studies biopsied the intestines of autistic children with gastrointestinal symptoms (AUT-GI), finding the following (among other things):

Pyrosequencing analysis of mucoepithelial bacteria revealed significant multicomponent dysbiosis in AUT-GI children, including decreased levels of Bacteroidetes, an increase in the Firmicute/Bacteroidete ratio, increased cumulative levels of Firmicutes and Proteobacteria, and increased levels of bacteria in the class Betaproteobacteria...

Metabolic interactions between intestinal microflora and their hosts are only beginning to be understood. Nonetheless, there is already abundant evidence that microflora can have system-wide effects and influence immune responses, brain development and behavior.

When there is a devastating illness with only supportive treatment, a harmless intervention such as adjusting the types of carbohydrates in the diet seems to be an approach that ought to be supported and attempted. (This will not be a completely "paleo" intervention - sweet potatoes, for example, are off limits on GAPS, I believe - I will post some more when I get the book.) Sure, it might not help everyone, but what is there to lose?

Saturday, September 17, 2011

First off, everyone take a couple of hours and hop on over to Robb Wolf's blog and listen to his podcast with Dr. Kurt Harris. As usual, Kurt pulls it all together with fun and flair and a hefty serving of common sense. He gives me and my blog a few mentions, which is very much appreciated, as always :)

I've been excitedly hunting down a couple of papers that came out in the last couple of weeks - the first one: An Update on Hospitalizations for Eating Disorders, 1999-2009. As expected from a statistical brief, there is little there besides the numbers - so it is not all that exciting. Overall, eating disorders as a primary or secondary diagnosis have increased 24% in that period, the cost of hospitalizations has increased 29%, and hospitalizations have increased 72% for children under 12, 88% for people aged 45-65, and 53% for men. Weirdly, hospitalizations for pica (compulsively eating non-food items, such as dirt or soap or whatever) have increased 93% but are still rather unusual. If you look at eating disorders as a "principal" diagnosis only, the number has actually fallen 1.8%, and I've seen some funny headlines as a result - "eating disorder hospitalizations fall, but pica hospitalizations double."

An important caveat is that these numbers are generated from billing codes. If someone comes to see me at the office, I am obligated (if I want to be paid by the insurance company) to generate a code based on a DSMIV diagnosis that I put on a billing form. The same is true for inpatient hospitalizations. And in the past 13 years, a number of states and the federal government have issued rules to prevent insurance companies from refusing to pay for psychiatric diagnoses that are so-called "biologic." This change is a part of the mental health parity act. "Biologic" diagnoses vary from state to state depending upon the laws, and even depending upon the insurance company, but generally include major depressive disorder, bipolar disorder, schizophrenia, etc. Sometimes anxiety disorders are not included. Addiction used to be excluded but now is covered, I believe, and often autism and eating disorders are not included. Therefore, if I am a doctor who would like to get paid and not have patients stuck with bills when they pay their insurance premiums, and someone meets criteria for major depressive disorder (MDD) AND an eating disorder (which in the inpatient world will very often be the case), the MDD will always be the "principal" diagnosis to avoid issues down the line. I know that anorexia is often more likely to be covered for inpatient care than bulimia, depending upon the medical status of the patient… in short, I find the overall trend in primary and secondary diagnoses, and the increases among men, children, and older people, more interesting than the drop in "principal" diagnoses.

It is actually rather difficult, and getting more difficult all the time, to be hospitalized for psychiatric disorders in general. For the most part you must be an obvious imminent risk to self or others, or completely unable to care for yourself, in order to get a bed, and beds are in scarce supply. In 1999 it was easier to get the slightly less ill hospitalized. So with this background, I find it rather remarkable that eating disorder hospitalizations have increased to such a degree. Binge eating rarely results in psychiatric hospitalization, and outpatient rates of binge eating and bulimia are rising also (though inpatient bulimia hospitalization dropped - the severe cases are often readily managed in intensive outpatient day programs nowadays). As obesity has also increased over the same period of time, I can't help but suspect the two trends are related, but I can't prove it.

Why intranasal? None of the subjects had diabetes, and obviously systemic insulin could cause dangerous hypoglycemia. The intranasal dose goes pretty much straight to the central nervous system via the olfactory and trigeminal nerve perivascular channels, and none of the subjects had hypoglycemia during the trial.

Why insulin? Well, as I've discussed at great length (I really ought to repost some of those dementia articles up on Psychology Today…), there are very clear issues with the ability of a dementing brain to metabolize glucose (the example in that article is Parkinson's disease, but the principle is very similar for Alzheimer's). This problem results in inefficient use of energy, free radical generation, and neuronal toxicity and death. There are several ways to (theoretically) improve this issue - one of them is to use a therapeutic ketogenic diet. Another is to jack up insulin in the central nervous system to improve the ability of the cells to pull in and utilize glucose, theoretically. In addition, insulin seems to have an effect on amyloid-beta peptides that may protect the neurons, and insulin and insulin activity are generally low in the CNS of folks with dementia (though hyperinsulinemia with insulin resistance seems to be a long-term risk factor for eventually developing Alzheimer's dementia).

My question is - and this is highly speculative - without improving the energetics, does jacking up the insulin help in the short term but hasten the problems in the long term? No long term studies have been done. In the absence of insulin resistance and with insulin in the CNS low already, perhaps not? I'll have to think a little more on that one.

Thursday, September 15, 2011

I've started teaching my small section of the introduction to psychiatry class for the medical students again, which has added a measure of increased chaos to the week. Not always a bad thing. However, blogging frequency may diminish for the fall (but who knows - it depends upon what I see that interests me, and the class time for the lectures I don't teach does give me time to catch up on some journals; since I'm not taking a test at the end of the semester, I don't always need to pay attention…)

A few weeks ago I recorded a podcast with Superhuman Armi Legge and Bulletproof Exec Dave Asprey. Here is the podcast, so enjoy! I'm not entirely certain I am a paleo "brain hacker" - I'm more into emulating the evolutionary milieu(™)* than throwing MCT oil and butter into coffee for a kickin' breakfast, but that could be my likely dairy intolerance talking. We all share enthusiasm and interest in human health - the search for optimization of human health and performance is preliminary but intriguing. Thanks for the opportunity, Armi and Dave! Very happy to be on your podcast.

Sunday, September 11, 2011

A few weeks ago in Do Carbs Keep You Sane, I reported from a couple papers that disagreed with the textbook theory that a high carb, low protein and low fat diet would increase tryptophan in the brain. The Wurtmans from MIT have designed a whole pharmacologic diet around this theory, so it was interesting to read the rebuttal, especially since the rebuttal included data from Dr. Judith Wurtman's own papers.

In short, the theory goes that carbohydrate ingestion stimulates insulin production, which in turn causes protein to be driven out of the bloodstream and into the cells. Tryptophan, the rarest amino acid in the diet and the precursor for serotonin, is mostly bound in the blood to another protein called albumin, which makes it immune to insulin's effects. Therefore a carb bolus will increase the ratio of tryptophan to the other amino acids competing for the same transporter, tryptophan shoots into the brain, and you get a nice hit of satiating, serenity-making serotonin.

If we follow the lines of this theory, a high protein diet will increase the amount of other amino acids and increase the competition for the transporter, leaving tryptophan a loser and the brain relatively "low" in serotonin. Fat in the diet will also delay gastric emptying and lower the overall glycemic index, lowering the insulin response and therefore reducing the insulin mechanism for driving tryptophan into the brain. Pretty simple.

Except in nutrition, nothing is ever simple. Turns out this mechanism works a bit differently in rodents than in humans or other primates, and any natural food and even flour and potatoes should have too much protein for it to work in humans. You can get this effect after a night's fast by eating or drinking something that is pure carbohydrate - such as marshmallows or lemonade. Not exactly an evolutionary model. In fact, in the primate models, the amount of tryptophan that made it into the brain depended on a higher amount of protein, not a lower amount with higher carbs.

But Mr. Jamie Scott sent me the Pubmed link for this paper a few weeks ago, and I certainly don't like to ignore papers, even if they tell a different story than the majority of the papers I had seen thus far:

Interesting study. 10 healthy male university student volunteers were given several diets of varying macronutrient composition on different days. The first meal was a high GI meal consisting of 768 calories of jasmine rice and a tomato puree - this meal was 1.6% fat, 8% protein, and 90.4% carbohydrate. The glycemic index of the meal was 117 and the glycemic load 200. That's 171 grams of carbohydrate, in case you were wondering. The other two "mixed macronutrient" meals served were lower in calories (about 457 each) and consisted of either a lower glycemic rice or a high glycemic rice, each with a Lean Cuisine chicken dish. (I kid you not.) The latter two meals were about 16% fat, 18% protein, and 66% carbohydrate, give or take a rounding up or down, and each had 75 grams of carbohydrate.
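Those calorie and carbohydrate figures can be sanity-checked with the standard 4 kcal per gram of carbohydrate. A quick back-of-the-envelope sketch (the function and arithmetic here are mine, not the paper's):

```python
# Back-of-the-envelope check of the carbohydrate grams quoted above, using
# the standard 4 kcal per gram of carbohydrate. My arithmetic, not the paper's.

def carb_grams(total_kcal, carb_fraction, kcal_per_gram=4.0):
    """Grams of carbohydrate in a meal, from total calories and the
    fraction of calories supplied by carbohydrate."""
    return total_kcal * carb_fraction / kcal_per_gram

print(round(carb_grams(768, 0.904), 1))  # 173.6 - close to the cited 171.4 g
                                         # (the stated percentages are rounded)
print(round(carb_grams(457, 0.66), 1))   # 75.4 - matches the cited 75 g
```

The small mismatch on the first meal is what you would expect from the rounded macronutrient percentages reported in the paper.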

(Okay, another hysterical sentence in this paper - each volunteer was tested with a standardized glucose drink to calibrate glucose and insulin responses to the different meals - the standard was a 75 gram glucose bolus, and the figures were extrapolated to estimate the response to the 171 grams of carbohydrate meal because "it was considered unethical to give a glucose reference drink of 171.4g CHO.")

The results? The young men found the two mixed macronutrient meals palatable, whereas the (double calorie) high carb, high GI meal was more satiating, but less palatable. Sleepiness did not differ when measured immediately after the meal.

Only seven of the participants completed all the blood draws, so only seven data sets were used for comparison of the ratio of tryptophan (TRP) to the other "large neutral" amino acids (LNAA) in the study. At baseline (which was fasting), the ratio did not differ between the subjects. After the high carbohydrate, high GI meal, TRP:LNAA ratios increased by 23% and remained high for the next 8 hours. The Lean Cuisine folks with the low GI rice had an increase in TRP:LNAA of about 8%, and the high GI Lean Cuisine folks had an increase in TRP:LNAA of 17%.
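For the curious, here is roughly how a tryptophan-to-LNAA ratio shift like the 23% above gets computed. The plasma concentrations in this sketch are invented for illustration; only the resulting percent change mirrors the study's figure:

```python
# How a TRP:LNAA ratio change is computed. The plasma concentrations below
# are invented for illustration; only the resulting ~23% shift mirrors the
# study's reported figure for the high carb, high GI meal.

def trp_lnaa_ratio(trp, lnaa_sum):
    """Ratio of plasma tryptophan to the summed competing
    large neutral amino acids."""
    return trp / lnaa_sum

def percent_change(baseline, post):
    return (post - baseline) / baseline * 100

# Hypothetical fasting values (micromol/L)
fasting = trp_lnaa_ratio(50.0, 500.0)    # ratio 0.10
# After a high carb, high GI meal, insulin drives the competing amino acids
# into muscle while albumin-bound tryptophan mostly stays in the blood,
# so the ratio rises even though tryptophan itself barely moves.
post_meal = trp_lnaa_ratio(48.0, 390.0)  # ratio ~0.123

print(round(percent_change(fasting, post_meal)))  # 23
```

Note that the ratio can climb substantially even while tryptophan itself drops a little, which is the whole point of the mechanism.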

The high carbohydrate bolus in several studies elicits robust insulin responses - increases of several hundred percent over baseline (in this study the increase in insulin was 650% over baseline). In another study, plasma platelet serotonin levels were increased 3.5 fold after a similar high carb meal. Of course, high amounts of serotonin floating around in your periphery may not be exactly good for you - it is thought that the high levels of serotonin caused by the diet drug combination fen-phen were behind the heart valve damage from those drugs and possibly the risk of increased pulmonary hypertension.
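Since "percent over baseline" figures are easy to misread, a one-liner for the conversion: a 650% increase over baseline means insulin reached 7.5 times its starting value.

```python
# Convert "percent increase over baseline" into a fold-change multiple.
def percent_increase_to_fold(pct_increase):
    return 1 + pct_increase / 100

print(percent_increase_to_fold(650))  # 7.5, i.e. insulin at 7.5x baseline
```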

In other studies, there are interesting implications. Recall that serotonin is the precursor for melatonin - in this rather recent study, a high carbohydrate, high GI meal (high GI rice, very very low fat) four hours before bedtime shortened sleep onset latency by about 50% compared to a low GI rice meal (also very low fat).

So it looks as though you can increase tryptophan ratios in the periphery with high glycemic index meals, even with a more realistic macronutrient mix than jasmine rice and tomato puree. Presumably this may increase tryptophan uptake into the brain. Tryptophan has several fates in the brain, serotonin being one, kynurenine another, depending upon your state of inflammation and what drugs you may be on. Once again we have one fragment of a hugely complex picture.

And, since this mechanism depends on insulin, if one is contending that a high carb "serotonin cure" diet is helpful for depression, one must take into account that people with severe hyperinsulinemia are more likely to be depressed than people without, not less (though there are other confounding factors - since inflammation is one, the high carb diet in an (inflamed) type II diabetic might lead to increases in kynurenine rather than serotonin, explaining the difference… but you see there is way too much unknown to make any general prescription for high carb diets in this context).

Friday, September 9, 2011

Happy Friday! I actually have a whole stack of interesting papers from email and links (as usual) - Paleo Wunderkind Jamie Scott sent me an email about homocysteine and anger earlier this week, and the papers are pretty cool.

All right. Homocysteine. If you recall, in protein we eat an amino acid named methionine. Methionine, plus various derivatives of the B vitamins (including folic acid, vitamin B6, and vitamin B12), helps us make all sorts of stuff: other proteins, DNA, neurotransmitters, etc. Lots of important stuff. Homocysteine is an intermediate in the pathway that is supposed to be recycled back into methionine (see this diagram) so that the cycle can begin again. Older people and men are likely to have higher homocysteine. Older folks also tend to eat fewer B vitamins. And folks with hyperinsulinemia are also more likely to have high homocysteine (if you really want your mind blown, check out this anonymous commenter, who sounds an awful lot like Dr. K, on my latest Psychology Today post linking insulin, homocysteine, selenium, B vitamins, choline, NAC, and basically all the pathology of disease in Western Civilization). Those who are obese are also more likely to have high homocysteine (in some studies but not in others), even with normal serum B6, B12, and folate levels (3).

If you don't have all the B vitamins in the right amounts, or if you are on medications that change the effectiveness of the enzymes in this pathway, or if you are one of the 10% of folks genetically deficient in the MTHFR enzyme, you will end up with extra homocysteine hanging about. And that, my friends, is not good. It's a bit murky, but homocysteine is thought to do all sorts of bad things, like stiffen arteries and increase the proliferation of smooth muscle cells, leading to high blood pressure and increased risk of stroke. Homocysteine is also thought to be associated with joint and cartilage stiffness and weak bones, and is probably directly neurotoxic. High homocysteine is associated with increased risk of heart attacks both in baseline healthy folks and in people with previous heart disease; it is thought to directly damage the blood vessel endothelium and is also probably prothrombotic (2). High homocysteine (indicative of an inefficient folate cycle; the actual level considered high is greater than or equal to 11.3 micromol/L, in case you were wondering) means you may be low in SAMe. SAMe (as we discussed earlier) is needed in the brain to make many neurotransmitters.

Over the years, high homocysteine has also been associated with anger (1). In fact, each 10 point increase in the Hostility and Hostility Direction Questionnaire is associated with a 2.9 micromol/L increase in homocysteine. Women under psychologic stress have higher levels of homocysteine also. Homocysteine has been investigated a number of times with respect to major depressive disorder, and it was found that only those with the disorder who also have anger attacks (approximately 40%) had significantly higher levels of homocysteine.
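To make that quoted association concrete, here is the arithmetic as a tiny sketch. It assumes, purely for illustration, that the relationship is linear across the score range, which the study does not necessarily claim:

```python
# Homocysteine rise implied by the association quoted above: each 10-point
# increase on the Hostility and Hostility Direction Questionnaire was
# associated with a 2.9 micromol/L increase in homocysteine.
# Linearity outside the studied range is my assumption, not the paper's.

def homocysteine_rise(hostility_score_increase, slope_per_10_points=2.9):
    """Predicted homocysteine increase (micromol/L) for a given
    increase in hostility score."""
    return hostility_score_increase / 10 * slope_per_10_points

print(round(homocysteine_rise(10), 2))  # 2.9
print(round(homocysteine_rise(25), 2))  # 7.25
```

And remember, this is an association, not a dose-response experiment.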

Anger on its own is also highly correlated with risk of heart attack. In one study of Koreans getting treatment for blocked coronary arteries, 60% of the patients met criteria for significant hostility on standard scales. This is in contrast to a much lower hostility score in healthy Koreans or Americans using the same scale. Both hostility and homocysteine level correlated with earlier return to the hospital with a new coronary event when the Korean patients were followed over time.

All right. So that is just a whole truckload of correlations, without a lot of explanation. And in the papers, there are some interesting suggestions: that the stress hormones deplete the B vitamins, thus raising homocysteine; that homocysteine is directly neurotoxic, causing anger; and that homocysteine is associated with higher levels of pro-oxidants and represents an inflammatory state, which is also neurotoxic.

In the end we have the same prescription to address all the correlations and genetic variations - eat a healthy, nutrient rich diet. Avoid obesity and stress, or engage in stress reduction. Keep your folate cycle humming, and a lot of good things fall into place. Once it is out of whack, a cornucopia of bad juju starts to happen.

Thursday, September 8, 2011

Long, long, long ago, I wrote several posts that were later updated for Psychology Today, mostly because they form the basis for my understanding of the pathology of depression with regard to that Big Bad of Diseases of Civilization, inflammation. A nice tie-in to all those posts is here:

It is not, perhaps, the friendliest of posts, with all that biochemistry and all. But rest assured it is of vital importance. Especially this diagram (right click to open in new page and take a look). And this post will not be particularly friendly either. Mostly because I have caught up on a recent skirmish in the war between medicine and psychiatry, begun by the book reviews by the former editor-in-chief of The New England Journal of Medicine, Dr. Marcia Angell, and answered by big names in psychiatry, specifically in an article in Psychiatric Times by Ronald Pies, M.D., "Misunderstanding Psychiatry (and Philosophy) at the Highest Level."

In case you do not happen to be a clinical psychiatrist and do not care to dive into the debate, let me paraphrase (and allow me to take extreme liberty with my own interpretation of the stance of the two sides): Dr. Angell: "Psychiatrists are witch doctors." Psychiatrists: "You are ignorant and misinformed."

It is hard to be misunderstood. Rest assured that I do not rely on incantations to treat my patients, but I do dislike equating psychiatry with the DSMIV. The DSMIV, the cookbook describing all the diagnoses for research and insurance billing purposes, is not psychiatry. A good psychiatrist listens and measures and watches for neurologic disorders, medical symptoms, experience, emotions, emotional expression, tremor, eye contact, muscle tone, gait… most of these are not ever mentioned in the DSMIV. I consider the DSMIV a necessary evil, for now. A very clever former teacher of mine once said, "If all the copies of the DSMIV dropped to the bottom of the ocean, all the better for us, and all the worse for the fishes." He asked that I not repeat that to anyone. I won't attach his name, and details are changed to protect the innocent, as always...

So how do I cope with being a well-meaning witch doctor? I write this blog. I tear apart the pathologies of the DSMIV in the context of biology, biochemistry, nutrition, lifestyle and evolution. For me, it is a more sensible and tenable approach than the random crapshoot of modern medicine epidemiology and the biased minefield that is psychopharmacology research. And in my own little corner of the blogosphere, I feel all is safe and honest and going the right direction. Most of the time.

Back to depression crashing the party. I've talked quite a bit about serotonin, a term, I think, with which everyone is familiar. Here is a nice article about serotonin in case you missed it.

But serotonin is only a small piece of the whole story. Our friendly neighborhood amino acid tryptophan can become all sorts of things - happy satiating serotonin, or enervating irritating kynurenic. Many, like the pioneering researcher Dr. Maes (who has hopped from Case Western Reserve (very respectable) to Antwerp (I'm sure, very respectable) to Thailand (well, let's reserve judgment until we know the whys and wherefores; though Thailand is a lovely place, it is not a hotspot of respectable biomedical research!)), have been talking about inflammation and kynurenic for a decade or more. And, finally, other researchers have been looking into it. They call it kynurenine, but I'm not going to quibble.

The new generation of researchers working out of the very respectable New York State Psychiatric Institute measured kynurenine levels in healthy controls, patients with major depressive disorders, and patients with major depressive disorders who have had suicide attempts in the past (all controls and only three of the depressed patients in this study were medication-free).

And, lo and behold, it was found that those with a previous suicide attempt were significantly more likely to have higher levels of serum kynurenine! Let's back up - activation of the inflammatory cascade (theoretically via autoimmune or other mechanisms, like, say, to go out on a paleo limb, wheat or omega 6 fatty acids) increases the activity of an enzyme called IDO (indoleamine 2,3-dioxygenase), which will change the amino acid tryptophan into kynurenine rather than fat-n-happy serotonin.

Serotonin levels actually have closer (negative) correlation with violence and suicide than depressed mood - and this study of kynurenine is no different - suicide attempters had the higher levels, and depressed patients without attempts had similar levels to healthy controls. Interestingly, kynurenine levels did correlate with BMI and tryptophan levels, and more robustly in males than in females (males have a higher risk of suicide completion than females, though females have more suicide attempts).

In previous studies, autopsies of suicide victims and CNS samples of suicide survivors have shown increased levels of kynurenine in both.

In mouse studies, increased kynurenine has been associated with activation of glutamate (neurotoxic in excess) and even dopamine (which increases motivation and drive). Stress seems to increase the activity of IDO (leading to increased conversion of tryptophan to kynurenine) and general suicide badness.

In the end, I have to say that all that is psychologic is biologic. And psychiatrists must keep an eye out for signs and symptoms, and while the DSMIV (and psychiatry's skeptics) ignores signs, we do not. Otherwise we remain guilty of the criticisms that the likes of Dr. Briffa will levy - criticisms which, as I note in my comment (number 17), show that he has very little understanding of the job of a psychiatrist. (Normally I like Dr. Briffa's ideas, but clearly he needs to consult with more expert psychiatrists if he is writing such posts!)

But ultimately I am not surprised. Mental illness is not understood, and psychiatrists hold some of the keys to the temple. Sometimes it is easier to eject what is misunderstood rather than to absorb and understand it, regardless of biology or morality.

****
A little editorial addition on 9/9/11

I thought about not writing about the ongoing controversies in psychiatry because 1) it may not be of interest to many readers (though I do have several clinical psychiatrists who follow the blog, and hey, it's my blog and I will write about what interests me), and 2) it opens me up (fairly enough) to being cast as a defender of modern clinical psychiatry, since I am critiquing the critic. I'm not interested in the role of defender, as certainly some aspects of modern psychiatry (and indeed modern medicine) are indefensible, and the role is ably taken up by Drs. Altschuler, Nierenberg, Friedman, and Pies (among others), who have written responses to Dr. Angell's reviews.

However, I do think that critiques of psychiatry are important and, again, not entirely unreasonable. There is a risk that an atheoretical document like the DSM-IV, with diagnosis based on a list of symptoms divorced (on purpose!) from pathology, combined with a profit-driven pharmaceutical research community, will create more and more diagnoses and pills to fit the diagnoses. The ultimate argument of this critique is that a lot of mental illness is essentially made up. My main objection to Dr. Angell's stance is to this sentence of hers (quoted from Dr. Pies' article linked above), where she starts by saying that psychiatry is different from other medical specialties: "First, mental illness is diagnosed on the basis of symptoms (medically defined as subjective manifestations of disease, such as pain) and behaviors, not signs (defined as objective manifestations, such as swelling of a joint.)"

As I mentioned above, mental illness presents with many objective physical signs that have a known neuropathologic basis, and these signs are used all the time in clinical psychiatry. That Dr. Angell would not know this fact betrays a rather shocking ignorance. In addition, there are biomarkers for mental illness. Zinc is one, kynurenine now likely another, various cytokines… in fact, biomarker tests are now being marketed to psychiatrists for the diagnosis of depression, but it is hard to convince a psychiatrist to jab someone with a needle and spend money on the test when you can merely ask the person about the symptoms of depression and get the same answer. I suppose it might eventually be useful in cases where people are feigning depression for monetary gain (such as a faked disability case).

And I will suggest that just because there is an objective "sign" and "known pathology" doesn't make pharmacology less of a Faustian bargain in other more "objective" medical specialties. Sure, for reflux you can send a scope down someone's esophagus and measure pH, and the medicine used to treat it will indeed change the pH by blocking the proton pump, but is it helping the overall pathology of acid reflux in the long run? Statins will, indeed, lower cholesterol through a known mechanism, but cholesterol is less important than the statin commercials will tell you - and despite the standard line that doctors have no idea of this, every well-trained and intellectually curious primary care doctor I speak with on a regular basis knows that statins work via their somewhat mysterious anti-inflammatory effect, not their cholesterol-lowering effect. And what about sulfonylurea drugs used to boost insulin production in type II diabetes? Sure, you improve the situation in the short term (and could possibly avert some long term hyperglycemia damage) - but you are making the patient more hyperinsulinemic in the process, risking worsening diabetes in the longer term, and, depending on the medication, there seem to be increased risks of pancreatic cancer and heart disease.

I would suggest that "knowing" the (almost invariably incomplete) pathology and having lab tests to check gives modern medical doctors a false sense of security in many cases. I'm not saying we should throw the baby out with the bathwater, but we can't scapegoat psychiatry without holding other specialties of modern medicine accountable in our critique as well. Pharmacology, whether with psych drugs, medical drugs, or the pharmacologic use of supplements, will always have unknown risks along with any benefits.

Add the risks of not using pharmacology (whether medical or psychiatric) and you have a complicated picture of risks and outcomes - one that takes good training, a bit of humility, honesty, and time to figure out.

So the truth of the matter is that people suffering from depression have lower overall cholesterol than average. I know, crazy, right? Here's where I really blow your mind - people with major depressive disorder, despite the lower cholesterol, have higher rates of death from heart disease, whether the heart disease was diagnosed before or after the depression.

But anyone in the primal/paleo community will know that total cholesterol doesn't mean much - we want to know about HDL and the subfractions of LDL - the big fluffy new fresh LDL are not associated with increased risk, whereas the old, rancid, small dense LDL are certainly associated with increased risk of heart disease. Here is a quote from the article (though I'm not entirely sure I agree with the pathology description):

"small dense" LDL particles, resulting from packing of LDL particles with higher amounts of triglycerides, have a higher propensity to be oxidized, to be trapped in the subendothelial space, and, subsequently, to form the seed of an atherosclerotic plaque."

In favor of further lumpage, it is known that depression is associated with insulin resistance, which is associated with more small, dense LDL, and thus higher triglycerides and higher apolipoprotein B. Successful antidepressant treatment will just so happen to improve insulin sensitivity and all those bad metabolic markers.

In the paper, the lipoprotein composition of various folks with major depressive disorder was measured and compared to that of healthy controls. Then the depressed subjects were randomized to treatment with either mirtazapine (an effective antidepressant known to cause weight gain) or venlafaxine (another antidepressant less likely to do the same).

Results: not surprisingly, total cholesterol was lower in depressed patients compared to healthy controls (surely the total cholesterol amount and how important robust cholesterol levels are to the brain has nothing to do with that. I'm sure it is just a confounder.) Both HDL and LDL were significantly lower. There was also a higher ratio of nasty, small, dense LDL particles to HDL particles in depressed patients compared to controls. Depressed people are much more likely to have rancid LDL lingering in their bloodstream. To me, not surprising.

Let's add the psychotropics - predictably, folks taking mirtazapine gained some weight, and folks taking venlafaxine lost some weight (serotonergic antidepressants such as venlafaxine will tend to cause short-term weight loss, likely through increasing satiety signals to the hypothalamus; of course, some folks consider the hypothalamus to be of secondary importance with respect to weight gain or loss).

So what happened with the interesting subfractions of lipoproteins among the antidepressant-treated groups? With fat-inducing mirtazapine (which causes weight gain through a central histamine or anticholinergic mechanism, most likely), total cholesterol and triglycerides increased. Under venlafaxine treatment, total cholesterol remained stable and triglycerides decreased. In both groups (and these are both powerful antidepressants that tend to actually work and help people feel a bit better - the study showed about a 60-65% response, which is typical for these agents), HDL levels improved, and the LDL to HDL ratio and apolipoprotein B (a measure of old, dense, rancid cholesterol) decreased. The mirtazapine group gained weight (as expected), the venlafaxine group did not. In both groups, responders had a slight increase in total cholesterol.

What do we learn? Depressed folks who responded to antidepressant treatment tend to increase their HDL levels and had favorable changes decreasing old small dense LDL. Oddly enough, those on the fattening mirtazapine had similar good lipid changes (if the medicine worked) compared to the short-term slimming venlafaxine. (Editorial note: inflammation is inflammation, my friends, and an anticholinergic or histamine mechanism will make you fat regardless of how inflamed you are - and while most of this work has been done with antidepressant medication, I have linked studies in the past showing that therapy has also been shown to be anti-inflammatory, though to be honest it has not been looked at robustly).

Wildly enough, this study says "to the best of our knowledge, the composition of LDL particles has not been studied previously in depressed patients." WHAT? Small dense LDL vs large fluffy LDL was only discovered, like, 20 years ago, right? Mental health research is always lagging! And antidepressants might actually be anti-inflammatory as suspected… dare I say like statins? Both classes of drugs have evidence for some modest benefit in certain situations, and major drawbacks.

A few caveats - it's a mouse study, and even worse, part of it is just a mouse cell study. But it has some interesting bits and is worth a close look.

The authors begin by reviewing the current thinking about the cause of Alzheimer's (amyloid beta plaques, neuronal death, slow progression, etc. etc.) and the current treatments (several FDA-approved acetylcholinesterase inhibitors, which, oh by the way, don't work very well and are not a cure). Then they go on to talk about the real cause of Alzheimer's (inflammation and crappy energetics - leading to cell death, slow progression, etc. etc.) and mention a funny little compound that might help with that, phytic acid.

Whoa. All the paleo peeps just squealed. PHYTIC ACID??? FROM WHOLE GRAINS??? Yes, that phytic acid. The one that binds all your minerals and leaves you anxious, insulin resistant, and fat. One of the reasons you don't eat oatmeal anymore. (via Dr. BG) Yes, phytic acid, the reason Stephan Guyenet has mysterious rice soaking in his fridge. (Honestly, I think traditionally prepared grains could be healthy enough if you are desperate, but they don't taste as good as meat and it takes so much effort… and I do worry about folks who are relatively sedentary or immobile and need the most nutrient-rich calories available.)

I actually wrote a bit about therapeutic uses of phytic acid before, but I was pretty sneaky, in my post on inositol.

So here's the scoop about phytic acid. Turns out that many nutrients and even anti-nutrients have multiple effects. I know. Shocking. I will quote the paper as they do a good job of writing here:

In this study, we investigated a novel protective treatment for AD pathology with phytic acid (PA, inositol hexakisphosphate). PA is structurally a myo-inositol sugar ring attached to 6 phosphate molecules. It is found naturally and ubiquitously, as a phosphate-storage phytochemical in unprocessed whole food grains, vegetables, and fruits, and as a key signaling molecule in mammalian cells. The Ca/Mg form of PA found in most plants is known as "Phytin" with its salt form known as "Phytate." Although PA is often described as a metal chelator, growing literature indicates that PA influences multiple processes, including antioxidant functions, anti-apoptotic effects, clathrin-coated endocytosis, DNA repair, and mRNA export from the nucleus. Phytic acid also lowers serum cholesterol and triglycerides. These studies suggest that PA possesses much broader functions than simply the originally-presumed metal binding properties.

So what the authors suggest is that phytic acid is one of the many things (such as ketosis, calorie restriction, etc.) that can promote clean, happy, humming mitochondria. Happy mitochondria are one of my favorite things here at Evolutionary Psychiatry.

To get more detailed, there are many things that are postulated to increase SIRT1 (a class III histone deacetylase with epigenetic effects). High SIRT1 levels are found in long-lived yeast, flies, worms, and mice, and levels of SIRT1 are lower in folks with dementia. Calorie restriction and Red Wine Magic Food Resveratrol increase SIRT1 in animal models. The end result of increased SIRT1 seems to be increased autophagy - an awesome, dedicated clean-up crew keeping your neurons sparkly, as it were. Will phytic acid do the same for the brain as calorie restriction or resveratrol?

Well, in mouse cells, application of phytic acid seemed to do all the right things, increasing the expression of SIRT1 and other autophagy-promoting biochemical stuff. And, in Alzheimer's model mice given phytic acid-laced drinking water for six months, levels of SIRT1 increased, amyloid beta accumulation decreased, and other indicators of decreased oxidative stress and more efficient mitochondrial function were evident compared to control mice. AND brain levels of copper, zinc, and iron were no different between the control and phytic-acid mice.

(I'm wondering if any of that had to do with increasing the efficiency of second messengers, as inositol is thought to do, as I described in my previous post linked above).

So, the take home message is to eat whole grains to prevent Alzheimer's. Haha. Just kidding. Actually, the take home message is that nutrition and biochemistry are really complex. A molecule that has some downsides may also have some positives. Phytic acid seemed like the least noxious part of the chemical warfare that grains (and nuts and fruits and other plants) wage. Since I don't pretend to have complete knowledge of the chemistry of the human brain, I like the evolutionary fallback position of an ancestral health lifestyle. I'll forget novel industrial foods created by a chemical company, and I won't be turning to cute supplements (outside of the basic minerals and sunshine) unless I'm desperate.

Have a great Labor Day Weekend, everyone!

***

Just want to add on a bit to address Rudolf's very astute comment (the eighth comment, below): Isn't there another confounder, namely that rats, mice etc express phytase, and we don't?

Rudolf is saying that mice are better adapted to eating grains and have the enzyme phytase in their guts, which will cut up phytate into its components (inositol and a bunch of phosphate molecules). The authors do address this issue (sort of, though they don't make it clear that humans do not metabolize phytic acid) - here's another paragraph from the paper. I'm a little irritated they didn't measure the phytic acid levels in the mouse brains when they had them, though:

It is important to emphasize that the effects of PA demonstrated here are distinct from those that have been attributed to its “backbone” and metabolite, inositol. Stereoisomers of myo-inositol (the non-phosphorylated backbone of PA) inhibit Aβ fibril assembly and protect neurons from Aβ-induced cytotoxicity in vitro. The stereoisomer scyllo-inositol inhibits Aβ aggregation, reduces soluble and insoluble Aβ, reduces plaque size and inhibits cognitive defects in a transgenic model of AD. Scyllo-inositol (ELND005) has been tested in animals and has entered phase 3 clinical trials for AD by Elan Corporation, plc. Recently, however, it had been shown to cause severe adverse drug effects in humans in two arms at doses 1 and 2 gram twice a day leading to death of 9 patients and discontinuation of both arms of this clinical trial. Interestingly, 6–7.4 g/day of phytic acid administered to patients did not have any adverse drug effects for up to 24 months. Our study indicates that phytic acid works by independent mechanisms. Additional support for the hypothesis that phytic acid may have effects beyond those of inositol come from studies showing elevated brain levels of phytic acid in rats fed a high phytate diet, indicating that some unmetabolized phytic acid is delivered to the brain, in addition to other species of phosphorylated inositols.

So - it's not all the inositol or other metabolized products from a mouse consuming phytic acid (which is synonymous with phytate, by the way). Phytate itself gets through in the mice, and, it seems, the question in humans (who shouldn't absorb much phytate at all - the phytate gets pooped out with the minerals, remember) is whether administration of phytate orally would make a difference in the brain. One interesting note in the paper is that the regular mouse diet had NO phytate, the experimental diet was 2% phytate, and a diet rich in legumes, grains, and seeds would be about 5-6% phytate. The authors postulate that we humans in the developed world would be fine (hahaha, quite an assumption given what is known about mineral consumption in epidemiologic studies in the developed world!) but that those in third world countries with marginal food consumption ought not to eat so much phytate…

It is something to consider studying as an injectable, if one has Alzheimer's. (Do not try this at home - in the quote above you will note that 9 people died taking scyllo-inositol in the clinical trials of that drug for dementia.)

A "paleo" diet in humans will have its share of phytic acid (likely not the 5-6% of calories, though, I would imagine), especially if you don't soak all your nuts (I don't - I don't eat that many nuts and don't have the time or the space to mess with them - just like grains). See Melissa McEwen's latest post for more information.

About Me

Emily Deans, M.D.: I'm a psychiatrist in Massachusetts searching for evolutionary solutions to the general and mental health problems of the 21st century. Disclaimer: This information is for educational purposes only, and is in no way intended to be personal medical advice. Please ask your physician about any health guidelines seen in this blog, as everyone is different in his or her medical needs.