Tag Archives: RCTs

I have lots of internal debates about the recent trend of blogs covering formal publications and vice versa. As an example, over the winter break I happened to pick up a copy of the Economist – an obviously great publication – to read a summary of the recent debates on evaluating Jeffrey Sachs’ Millennium Villages Project. The debate itself did not play out at a big conference – well, it probably has – but the article was tracking the discussion on the World Bank’s impact evaluation blog, written in part by IPA affiliate David McKenzie. In this one case I happened to have read the full debate beforehand – perhaps the only moment in my life when I’ve been a half step ahead of such a publication. It just struck me as odd, since it implies that for nearly every article in the entire publication there are more than a few people thinking, “wow, the Economist is a few months behind on this one.” The debate itself is worth reading, if you’re into that sort of thing.

All that is to preface saying that this is a post about an article from another famed publication, The Atlantic. I’ve recently been mentioning this article to so many people that I figured I might as well post it here, even if it’s two months old. It’s an article on meta-research – perhaps the driest realm imaginable to some – but a great introduction to the inquiries of how we know what we know, how science and academia and business are all so closely intertwined, and how minor adjustments to small underlying assumptions can turn great research findings on their head. Dr. Ioannidis set up a lab in Greece to crunch big numbers and keep the clinical trials industry honest – and the scientific community seems to have embraced his oversight. The best news coming out of his work is that we should trust very little of the “evidence” produced by all these nutritional / health trials; common sense reigns again!