MethodLogical

MethodLogical is now at methodlogical.wordpress.com

Due to some persistent technical issues we've been having with Blogger, we're now posting at methodlogical.wordpress.com. Please update your RSS feeds, etc. For the time being, our new posts will automatically be mirrored here, but you'll have to visit the new site to comment.

Also on the brand-new blogroll list is Naman Shah, who provided an expert counterpoint to my arguments against the use of mefloquine as a malaria prophylaxis for travelers. He runs the Malaria blog, which has great links and commentary about the disease I increasingly believe may be the most important challenge in development.

If you have your own development or global health-related blog, shoot us an email and we’ll add it to the list unless you’ve already achieved major internet fame: methodlogical at gmail dot com.

Interesting article on Marginal Revolution suggesting that by making it legal to give a bribe (but illegal to accept one), countries could reduce corruption. Basically, by decriminalizing bribe-giving, people who were forced to bribe officials could report it without legal repercussions. This asymmetry would significantly discourage officials from demanding bribes, since their risk of penalty would increase. Full text here.
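The asymmetry can be sketched as a toy expected-penalty calculation. All of the probabilities and fine amounts below are made-up numbers for illustration, not figures from the article:

```python
# Toy model of the proposal: compare an official's expected penalty for
# demanding a bribe before and after bribe-giving is decriminalized.
# All numbers are illustrative assumptions, not empirical estimates.

def expected_penalty(report_prob, conviction_prob, fine):
    """Expected cost to the official of demanding a bribe."""
    return report_prob * conviction_prob * fine

# Symmetric law: the bribe-giver is also guilty, so few victims report.
before = expected_penalty(report_prob=0.05, conviction_prob=0.5, fine=10_000)

# Asymmetric law: the giver faces no charges, so reporting is far likelier.
after = expected_penalty(report_prob=0.60, conviction_prob=0.5, fine=10_000)

print(before, after)
```

Under these made-up numbers the official's expected penalty rises twelvefold, which is the whole force of the asymmetry.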

In the 1960s, Stanford University psychologist Walter Mischel conducted the famous ‘marshmallow study’, demonstrating a correlation between a child’s ability to delay immediate gratification and that same child’s educational and health outcomes later in life. Is this study truly a ground-breaking line of research on self-control, or is it all just fluff?

In the study, four-year-olds were given one marshmallow and told they could either eat it now, or wait until the experimenter returned from an errand and have two marshmallows. About one-third of the children successfully waited and were rewarded with a second marshmallow. Decades later, Mischel found astounding results. The child who could wait fifteen minutes had an S.A.T. score that was, on average, 210 points higher than that of the kid who could wait only thirty seconds. Similarly, the children who waited had, on average, lower body mass indexes and were more self-motivated. Decades later still, the children who waited had higher incomes, greater career satisfaction, better health, and a higher rate of successful marriages.

Here is a cute video of the study being replicated:

In the New Yorker piece on the marshmallow study, Mischel defined the key skill of the four-year-olds who waited as "strategic allocation of attention." Instead of obsessing over the marshmallow—the "hot stimulus"—the patient children distracted themselves by covering their eyes, pretending to play hide-and-seek underneath the desk, or singing songs from "Sesame Street." Their desire wasn't defeated—it was merely forgotten. The same skill matters later in life: if you can overcome the "hot stimulus," you can study for the S.A.T. instead of watching television, or save more money for retirement instead of spending it immediately.

Mischel’s latest follow-up with the marshmallow kids has been inviting them back to Stanford for fMRI scans, in order to identify the crucial neural circuits that distinguish the delayers from the non-delayers. Identifying these underlying neurological mechanisms could help design more effective treatments for attention deficit disorder, ADHD, and OCD.

My question to you is — Should we be optimistic about the marshmallow study’s potential to address critical health problems related to attention and self-control? Or should we be critical of its methodology and validity?

Furthermore, African-trained members of the American Medical Association send home to Africa, on average, over $6,000 per year, even 20 years after arriving in the United States—an average that includes those who send no money at all. Far from being "scant compensation", this means that the typical African-trained doctor who comes to the United States has sent back much more than the cost of training another doctor in his or her home country.
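The back-of-the-envelope arithmetic behind that claim looks like this; the $6,000/year figure is from the survey cited above, while the training-cost figure is a hypothetical placeholder (actual costs vary widely by country):

```python
# Rough arithmetic behind the remittance claim.
annual_remittance = 6_000   # average per year, per the AMA survey cited above
years_in_us = 20
total_sent = annual_remittance * years_in_us   # $120,000 over two decades

# Hypothetical cost of training one doctor, for illustration only.
training_cost = 50_000

print(total_sent, total_sent / training_cost)
```

Even with a generous training-cost assumption, the cumulative remittances comfortably exceed it.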

Such large remittances make my whole discussion of public vs. private funding of medical education in the developing world moot.

Nemana is right that One Day Without Shoes makes the mistake of taking a solution and looking for a problem to solve. But the campaign fails on a more basic level: did they talk to anyone who ran around barefoot as a kid, to ask why and how they felt about it? With that in mind I watched the video and had the following reactions:

As a child of the tropics, I basically never wore shoes until my teachers started forcing me to in junior high school. When I did, they were almost uniformly slippers (what the uninitiated call flip-flops) rather than covered shoes. Going barefoot near the equator has very different effects from what you’d experience doing it in, say, Ann Arbor.

They highlight podoconiosis as a major issue for people without shoes in the developing world. It’s surprisingly hard to find information about it, particularly its prevalence – this article says that 5% of people have it in Wolaitta, Southern Ethiopia. In this interview Blake Mycoskie asserts that we know podoconiosis is preventable because it doesn’t occur in Hawaii, where the volcanic soil contains the same silica as it does in Ethiopia. But kids in Hawaii spend tons of time running around barefoot in that volcanic soil, to the point that the calluses on the soles of our feet get stained red by it. Whatever the reason for the lack of podoconiosis in Hawaii, it’s certainly not that kids aren’t exposed to the dirt.

All of the dirt in Hawaii looks like this

The video claims that many children cannot attend school without shoes. I’ve never seen this rule enforced in any tropical area. The Tanzanian kids I taught HIV prevention to had to have uniforms but not shoes. Even my rich private school in Hawaii didn’t make us wear anything at all on our feet until the seventh grade. I don’t know how to assess how common shoe requirements are but I’d guess that in warm places the answer is “not very”.

Finally, I like going barefoot when possible, and most kids in Hawaii will fight like the dickens to avoid wearing shoes. Even if shoes are a good idea, I suspect you’ll have a hell of a time getting your target population to actually use them.

Naman Shah of the Malaria blog and I didn't really agree with Jason's last post on the risks of mefloquine (brand name Lariam), so in the grand tradition of MethodLogical (almost 6 months old!), we took to the comments section and responded. The following are excerpts from and additions to those comments. Most of the credit here goes to Naman (seriously, check out his blog).

So why is mefloquine on the market despite possibly severe side effects?

1) Because it works. The current FDA-approved dosing regimen for prophylaxis may be suboptimal, but mefloquine is a very effective antimalarial, with no reported drug resistance in most parts of the world.

2) As with all drugs there are contraindications and side-effects but mefloquine is part of the first-line treatment for P. falciparum (the most deadly of the malarial parasites) in several countries (Bolivia, Peru, Venezuela, Thailand, Cambodia) and is used safely, at much higher doses, to cure thousands of patients every year.

The FDA has to be risk-averse and highlight even tenuous links. Notice their statement is careful to say “associated,” not “caused,” and only specifies suicides which occurred in traumatized war veterans. That said, yes, our clinical experience indicates there are certainly neuropsychiatric side effects – a term that sounds misleadingly severe, which is why grading the adverse event matters: common "vivid" dreams are one thing; amnesia or seizures are another.

On the matter of frequency, the 25% rate of central nervous system side effects mentioned was self-reported in a small, non-representative convenience sample. Note that most of those were headaches, vivid dreams, and insomnia – a far cry from schizophrenia. We really don't know how frequent mefloquine's side effects are, especially by severity. One review found that "despite a negative media perception, large pharmaco-epidemiological studies have shown that serious adverse events are rare." Another review found that mefloquine does have more adverse events, but the absolute rate is still low (<5%) and they are mostly mild. In the real world, the absolute rate is much more significant than the relative rate.

Is it possible this entire blog post is a mefloquine-induced hallucination?

3) The other drugs are far from perfect.

Chloroquine is useless in much of the world due to widespread resistance, especially where P. falciparum is endemic.

Doxycycline dramatically increases photosensitivity, which can be problematic for pale people in already hot places, and it can cause some unpleasant GI symptoms. These symptoms are probably preferable to schizophrenia, but they are more common AND a huge reason people stop taking doxy. If patients stop taking it, they are not protected, and they can get malaria. Further, doxy may interfere with oral contraceptives (a.k.a. the pill). Getting malaria is worse than getting pregnant, but if alternatives exist, they are worth considering for a woman of child-bearing age. Also, doxy is teratogenic (harmful to fetuses), so if you get pregnant while on it you may have problems.

While it is harder for some people to remember the weekly dosing regimen of mefloquine, the problem with the daily dosing of doxy is that if you aren't consistent with taking it at the same time, you run the risk of having low levels of the drug in your system, increasing susceptibility to malaria. It may be easy to remember to take a drug every day, but taking it at the same time every day (with food to avoid those nasty GI side effects!) is another thing. Is there one time every day where you are always 100% sure you will have access to food, water and your pills? Further, taking doxy for a month after leaving a malaria-endemic area is tough. People are not good at taking medication when they are sick, and they are even worse at taking it when they are healthy. It's hard enough to convince a healthy person in a malarial area to take prophylaxis, convincing them to take it for 4 weeks afterward is even harder.
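The inconsistent-dosing point can be illustrated with a toy pharmacokinetic sketch: a one-compartment model with first-order elimination and one missed dose. The half-life and dose units below are made-up numbers for illustration, not real doxycycline parameters:

```python
import math

# Toy one-compartment model: each daily dose adds 100 units, and the drug
# decays with first-order elimination. The 16-hour half-life is an
# illustrative assumption, not a real doxycycline parameter.
HALF_LIFE_H = 16.0
DECAY = math.log(2) / HALF_LIFE_H

def trough_levels(doses_taken, dose=100.0, interval_h=24.0):
    """Drug level just before each next scheduled dose.

    doses_taken is one boolean per scheduled daily dose.
    """
    level, troughs = 0.0, []
    for taken in doses_taken:
        if taken:
            level += dose
        level *= math.exp(-DECAY * interval_h)  # decay until the next dose
        troughs.append(level)
    return troughs

perfect = trough_levels([True] * 7)                                  # no misses
one_missed = trough_levels([True, True, True, False, True, True, True])

print(round(perfect[3], 1), round(one_missed[3], 1))
```

Under these assumptions, skipping a single day drops the trough level to roughly a third of its steady-state value before it climbs back – exactly the window of increased susceptibility described above.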

Doxy is also the weakest antimalarial of the three leading prophylactic regimens, leading to many breakthrough infections (we believe).

Atovaquone-proguanil (brand name Malarone) has the shortest pre- and post-travel regimen and great activity, but is crazy expensive: $6 to $8 a pill. That may be fine if your insurance covers it (though that really just masks the cost to the health care system – a different issue entirely) or if you are going somewhere for a week, but after a while you start to wonder whether a little malarial fever may be worth $45 a week. Also, all the points about daily dosing from doxy apply here. However, if this drug goes generic before resistance becomes widespread, mefloquine's days are numbered.
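To make the price point concrete, here is a toy per-trip cost comparison. Malarone's per-pill price comes from the discussion above; the doxy price, the exact lead-in/tail day counts, and the 28-day trip length are illustrative assumptions:

```python
# Rough per-trip cost comparison for daily prophylaxis regimens.
def regimen_cost(price_per_pill, trip_days, days_before, days_after):
    """Total drug cost: pre-travel lead-in + trip + post-travel tail."""
    return price_per_pill * (days_before + trip_days + days_after)

# Malarone: pricey pills, but only a short (7-day) tail after returning home.
malarone = regimen_cost(7.00, trip_days=28, days_before=2, days_after=7)

# Doxy: cheap pills, but a 28-day tail that people often fail to finish.
doxy = regimen_cost(0.10, trip_days=28, days_before=2, days_after=28)

print(malarone, doxy)
```

The gap in drug cost is enormous, which is why the adherence and side-effect trade-offs above matter so much for the choice.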

Artemisinins are not used for prophylaxis due to their short half-lives and the need for multiple daily doses to ensure therapeutic levels.

There isn’t much comparative effectiveness research comparing atovaquone-proguanil, mefloquine, and doxy. Some sources assert Malarone’s potential superiority, but there’s nothing rock solid. In the end, such studies are hard to execute, because you need to account for both efficacy (how a drug works when taken perfectly) and effectiveness (how a drug works when taken under real-world conditions – i.e., with missed doses).

Lastly, Jason made an excellent point about assessing the true risk of mefloquine in patients predisposed to psychiatric illness. Such patients are (rightfully) protected by institutional review boards (ethics boards that approve all academic research) because of their reduced capacity, increased vulnerability, and potential inability to consent for themselves. However, if we overprotect this population, we do them a disservice, as we never identify the best ways to manage their care.