Effective information extraction

Month: August 2015

Tl;dr: Everyone is pretty overconfident about their pet nutrition theory (even you). We also trust announced panaceas too much, whether it’s a new diet today or bloodletting 2000 years ago.

I recently read The China Study, a book on nutrition. Or rather, a bit over half of it. I stopped after a few hours because the author was starting to repeat the same narrative over and over, and I wanted to look at rebuttals before putting in the time to finish the book. I should clarify: at this point, the book sounded pretty convincing.

The go-to rebuttal was written by Denise Minger; it took her a month to write and almost that long for me to read (over an hour). After reading it, it too sounded pretty convincing.

I aired my thoughts to a family friend, and his defense of The China Study sounded pretty convincing too.

This is not a healthy pattern. Luckily, at each point I realized what was going on. I had gone into this endeavor with a highly, almost radically, skeptical mindset, and it paid off. But many times, others and I aren’t so lucky.

One of my housemates sometimes complains that he believes too many things he reads. I completely empathize; humans evolved such big brains largely to make very compelling arguments. Nearly everyone disseminating information to us has an incentive to exaggerate as much as they can get away with, be it reporters, peddlers, researchers, or Donald Trump. Even researchers, whom we usually think of as trustworthy in cases other than climate change, are still motivated to exaggerate: if they can sway their field or the public into thinking their work is more important than it is, that means more citations or media attention for them. Much of this is because good hypotheses are hard to come by; if a compelling one presents itself, it may be your only chance at fame, and you certainly don’t want to pass that up just because the data is unlucky. From HPMOR, ch. 78: