The need for scientific celebrity seems to have spread like wildfire in recent years, and it’s making a mockery of real scientific progress.

Whether it is calling all supplements dangerous after 'confirming' that omega-3 causes a 71% increase in prostate cancer risk, or releasing pictures of cancer-riddled mice to the world's media on the back of a questionable study on GM corn, it seems that poor evidence dressed up as a major finding has been the dish du jour for some time.

But the growth of 'scare stories' and news of miracle cures in recent years suggests to me that it's getting worse.

Whether you are a journalist, scientist, communications manager or journal editor, the unfortunate fact is that we all want that next blockbuster study - the one that everybody is talking about, the one that helps you win that grant or improves your journal's impact factor.

Unfortunately, in the search for those elusive blockbusters, we play a role in creating a trail of false hopes, unfounded health scares and bad science.

Correlation not causation

By far the biggest scientific crime I witness is the large number of studies that spot a statistical correlation in a raft of data and use this as a basis to publish a highly spun 'research paper' that claims (often conclusively) that A causes B or that X cures Y.

Take the famous finding that national chocolate consumption correlates with Nobel Prize success. Does eating chocolate make a nation smarter? No, it's much more complex than that. The link is down to a combination of socio-economic factors (supposedly accounted for in the analysis) that mean that well-off countries, with people who are well educated and have reasonable incomes, consume more chocolate and also win more Nobel Prizes.

With that in mind, take a look at some of the other correlation studies I have written about in the past few weeks. The study linking omega-3 to prostate cancer comes to mind immediately, not least because its publication generated so much interest and feedback. But what about the one that links walnuts to a reduced risk of diabetes, or the one that links junk food and soft drinks to increased cancer risks?

They all, to some degree, share similar flaws. Correlation must not be confused with causation; on its own, it in no way implies it.

Correlation and causation are two very different things. (Image: xkcd.com)

Considering the evidence

Many scientists seem to have a pre-formed negativity when it comes to food, nutrition and, especially, supplementation.

To use the omega-3-cancer study again, the lead author Dr Alan Kristal commented at the time: "We've shown once again that use of nutritional supplements may be harmful."

Think about that for a second. Have they? They found an unexplained association between a nutrient and cancer risk. There was no suggestion or data to support the idea that this was due to supplements (omega-3 also comes from fish). Even if there had been evidence that omega-3 supplements were causing such an effect, to extrapolate that and say they had shown that 'the use of nutritional supplements may be harmful' is a major fallacy.

I think in this case, as in many others, the scientists put emphasis on parts of their study - or extrapolated findings from it - that are not necessarily supported by the data they have.

In many cases it is simply impossible to extrapolate your findings to a wider or larger group, no matter how hard you try. (Image: xkcd.com)

Is good science dead?

Alongside these wild claims, some very important advances are being published on a daily basis.

The trouble is that no matter how important it might be for you or your business, nobody really wants to read a 3,000-word technical report (or even a short news summary of one) on acrylamide formation or microencapsulation when somebody is claiming that the product you're making could cause or cure cancer - sometimes both!

Is this what caused the problem? Partly. But on the other side, industry will always question the scientific merit of studies that question its products - whether it's omega-3, resveratrol or multivitamins. Whoever makes the product will naturally pick as many holes in the study as possible. Many of those criticisms are completely justified; some are simply not.

A single study will almost never tell us anything definitive; that's why we need a mixture of evidence - preferably including mechanistic data that shows biological plausibility, epidemiological evidence from population data (where relevant) and a systematic analysis of the evidence from trials, such as those done by the Cochrane Collaboration.

Nathan Gray is the science reporter for FoodNavigator.com and NutraIngredients.com. He has written on key areas of food science and nutrition impacting the global food and nutritional supplements industry - including flavour formulation, sodium reduction, gut health, and the links between nutrition and disease states. Nathan has a degree in Human Biosciences, specialising in nutrition. You can tweet him @nathanrgray.
