On the 26th of November, New Scientist (NS) published an article focusing on new chemicals. “Great,” I thought, “I love chemistry.” It was that issue’s cover story, but it had the worrying blurb: “We’ve made 150,000 new chemicals. We touch them, we wear them, we eat them. But which ones should we worry about?” The article’s tagline read “Our food, furniture and frankly everything else is contaminated with industrial compounds – but how harmful are they? New Scientist investigates.”

You’d be forgiven for feeling concerned, because it’s New Scientist. They’re a powerhouse of knowledge and (usually) bring good, solid science to anyone who’s interested. So, as opposed to just sighing and muttering something about chemophobia, I read on. However, before we go any further, I want to take a second to appreciate what this cover and tagline set up in the reader’s mind; to state outright what your average non-chemist may think.

On the cover we see the words touch, eat and wear, potentially leading the reader to think that they’re covered in, and ingesting, chemicals. The less informed may worry. The more informed probably aren’t terribly bothered, but this is New Scientist, not known for exaggerating things, so readers are probably taking serious note of what’s to come. Next, we see “which ones should we worry about?” OK, our more informed reader may hold off from full-blown panic, but the blurb is literally suggesting that we may need to worry about something.

The tagline reads, “frankly everything is contaminated with industrial compounds”. The language is loaded, isn’t it? I mean, they’re being frank! And it’s not just that manufacturers have used some chemicals to make things (not that even that would be bad). No, no, no. Chemicals are contaminating things. No, not even that. Everything is contaminated. Contaminated with industrial compounds. Our only reason to feel like it’s safe to shake quietly is that in asking the question “how harmful are they?” we’re led to believe that perhaps not all 150,000 are going to kill us. However, the tone so far is causing us to consider the possibility that frankly we might just be screwed here. Again, this is New Scientist; people respect their journalism – I respect their journalism. NS has a stellar reputation. Of course, I may be jumping to conclusions here. It’s not like the article was outright using words like toxic and saying that we ought to be concerned – shocked, even. So perhaps your average reader wasn’t in full-blown panic mode. Concerned, sure, but maybe not panicked. Not shocked! No cause for that. OK, on to the article and, the natural beginning, the title… Toxic Shockers… Oh. Oh dear.

So here we are, in an NS cover article about toxic chemicals, worried about what we’re going to see. We’re now looking out for new chemicals that we need to add to our list of things to avoid. Avoid them like you’d avoid sitting next to “that guy” at the Christmas dinner. The article is set up to make you feel anxious and draw you in. So what monstrosities await? (Deep breaths, everyone.)

We’re greeted with three paragraphs telling us that apples may contain traces of “pollutants” – pollutants such as fungicides, insecticides and herbicides. To the uninitiated this might well be a worrying few lines of text. It takes until the fourth paragraph for the authors to say that most of them are either within safe levels or have zero effect upon the human body. The intro is horrible, and if your guard wasn’t up you’d be thinking that nothing (not even maw and paw’s good ol’ fashioned wholesome apple) was safe. The thing is, these pesticides are designed for use in agriculture and they’re present to prevent crop failure; to prevent pests from ruining the food which you’re relying on to not die. They’re not there to kill you and (as we eventually find out) they’re often present well below harmful levels or are about as dangerous as a tickle fight with a kitten. Reading on, we hope they’re going to address the “problem” of the remaining 149,999 new ways to die.

“Over the next five pages, we explain what we know about nine of the most frequently suspected substances.”

Nine… NINE! I mean, I wasn’t expecting 150,000, but nine (where did 150,000 come from anyway?). OK, it’s surely going to be nine brand new super-effective ways to die. It’s probably the equivalent of juggling revving chainsaws whilst telling Vladimir Putin how much you admire Stephen Fry. I mean, it’s had some serious build-up. It has got to be the nine worst offenders. Right?

Painfully, as you read through the article, you realise that only five of the nine classes of compounds actually appear to have a discernible toxicity profile. In what seems like a joke at first (but mercilessly isn’t), the authors talk about burnt toast. They mention acrylamide as a potential carcinogen and go on to establish the link between certain food types, cooking methods, and how one may go on to produce the other. This leads the less critical reader to assume that burning their toast causes cancer. However, the authors (eventually) go on to state that there isn’t a shred of evidence to prove that acrylamide is a human carcinogen. Why is this here? Their big example of compounds that they deem “guilty as charged” is lead and mercury. Ignoring the headache-inducing cliché and focusing on the big “offenders”, this is not breaking news! Lead is bad for you… No, no, please, I’ve got this. 1990s, meet New Scientist. New Scientist, meet the 1990s. You guys have a lot to talk about.

Sadly, after reading the article, the real take-home point here is not that you should burn your apples and not burn your toast. It’s that New Scientist spun out an article which was a Daily Mail-styled piece of clickbait! Something which, in essence, said “Hey, you’re all going to die. Wanna find out how? Click here!” New Scientist put out an article designed to make you think that we were (to quote the authors) “up to our necks in chemicals” and then muddied the waters by talking about a select few, not all of which were harmful. You walk away from the piece scratching your head as to why any of this was newsworthy and quite frustrated that it not only made it into New Scientist, but was the cover article.

I despair at the shape of things to come if this is the future of science communication. Yes, the Daily Mail makes stacks of cash churning out this sort of shit, but I beg you, NS, don’t sell your soul. Once your reputation’s gone, there’s no going back. We shouldn’t stand for this. Read the article (sorry, paywall) and if it pisses you off as much as it does me, tweet them or do the Facebook thing. Hell, if that doesn’t work, let’s stand outside their offices and shout until the Daily Mail takes over. I’m going back to radio silence.

A while ago I wrote a piece for GUM (Glasgow University Magazine) as part of a collaborative effort between them and The GIST. Due to the word count limitation, I had to cut the article down to 800 words (feel free to read it; pages 34-35). There was originally a plan to publish the longer article (which I like a little more) on The GIST’s website, but as I’ve handed off most of my responsibilities to a new generation of GISTers, I suspect that it would fall to the bottom of their to-do list, so I’ve decided to post the full article here. I hope you enjoy it.

Dying For Clarity

How do you begin to respond to an emotional story of loss and tragedy when you think that the methods used to tell the story are wrong? Not only wrong, but an affront to science? Couple this with the fact that the tragic story is conveyed via a heartfelt documentary told by a grieving father and you have a delicate situation on your hands. Essentially, arguing against this story’s conclusions will make you out to be a heartless monster who wilfully ignores the plight of others to focus on “mere” facts. None of this, however, changes the point that the methods used are wrong (or at least inappropriate for this context). Being part of a growing number of people who harp on about badly communicated science at every chance they get, I must say something.

In November 2012 I sat down to watch a documentary on BBC Three called Dying for Clear Skin. The show focused on the use of the drug Roaccutane as a treatment for acne. Many readers, I’m sure, will be familiar with this drug and maybe even the documentary itself (if not, it’s available on YouTube). Very quickly the show centred on the more serious potential side-effects of the medication and told a heart-wrenching story of a suicide that could, perhaps, have been triggered by taking this drug. The BBC being an institution desperate to provide balance on all its topics, I had suspected that this emotional rollercoaster would be countered at some stage by a good scientific argument providing evidence for or against the banning of this drug, which was ultimately the point of the show. Sadly, however, I was disappointed. Instead, the show relied almost exclusively on personal accounts to paint a picture that didn’t accurately reflect the whole story. By cherry-picking the personal accounts of a few, it told an inherently biased story and, although I suspect that this approach made for excellent television, it effectively removed the science from what was a valid scientific argument.

The personal accounts used to tell the story here were touching and at some points wholly concerning. But it’s important not to get drawn into these stories without further checking the facts (something I suspect most viewers wouldn’t do). When I checked some of the sources I was further disappointed. One of the people interviewed in the documentary, a 22-year-old man named Stefan Lay, told his tale of how Roaccutane led to feelings of depression and sexual dysfunction. He appeared as an intelligent young man with an honest story to tell. However, when I later found his YouTube channel, I quickly realised that he is a person who not only hates Roaccutane but seemingly all prescription drugs. His YouTube channel is called FireYourDoctor, where he tries to discourage people from taking any medication whatsoever. With the dawning of this fact I began to worry about this person’s motives and I can’t help but question his inclusion in the documentary. The makers and presenter of the documentary found him by searching online and selected him on the basis of his YouTube videos. Videos where he expresses views like this:

But I do believe there is a cause. Like, people don’t get cancer for no reason. I don’t think so anyway. I think they’ve got to be a bit run down, or their body has been playing up and they’ve not really noticed the signs of that and they’re not taking their health seriously or they’re not eating well enough… I don’t think the cancer just pops out from nowhere.[1]

Oh dear. Undeterred, the film-makers interviewed him and used the footage where he stated that Roaccutane had caused his inability to have sex and caused him to “feel dead inside”. Yet in his own review of the show, posted on his channel, he stated that these side-effects weren’t as bad as reported and that, in fact, he was still able to have sex.[2] I suspect that he was used as an interviewee because he was young, photogenic and told a story that suited the film-makers (perhaps after some editing or careful scene selection).

After a few more personal accounts the documentary finally focuses on the evidence that is available in an attempt to link Roaccutane to depression or suicidal tendencies. In the 2 minutes (out of 57) dedicated to this, the facts were glossed over as irrelevant (presumably as they didn’t agree with the points being made) and fobbed off as incomplete. Sadly the film missed its chance to have a serious debate about the drug by ignoring the science. The fact is this. Out of half a million people who have been prescribed Roaccutane worldwide, reports of nine people committing suicide whilst taking the drug were made to the drug’s manufacturer. To add some context, that works out at 1.8 per 100,000 – roughly 89% lower than the UK average (17 per 100,000 population[3]). Not only this, but the film didn’t properly explain to its audience that these suicides could have been caused by any number of different reasons. You’ll hear it time and time again: correlation does not equal causation.
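For anyone who wants to check the arithmetic behind that comparison, here’s a quick sketch. The figures are the ones quoted above; note that it naively treats the nine cumulative reports as if they were a simple rate, just as the comparison in the text does:

```python
# Back-of-envelope check of the suicide-rate comparison,
# taking the quoted figures at face value.
reported_suicides = 9          # reports made to the manufacturer
users = 500_000                # people prescribed the drug worldwide
uk_rate_per_100k = 17          # quoted UK average per 100,000

drug_rate_per_100k = reported_suicides / users * 100_000
percent_lower = (uk_rate_per_100k - drug_rate_per_100k) / uk_rate_per_100k * 100

print(f"{drug_rate_per_100k:.1f} per 100,000, about {percent_lower:.0f}% lower")
```

which prints “1.8 per 100,000, about 89% lower”.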

What is very disheartening, though, is that I think there is a serious debate to be had here. Roaccutane usage is surprisingly common. The drug is not without its (proven) side-effects,[4] the mechanism of action is not fully understood and, worryingly, it appears that this “last option” treatment is sometimes used earlier than need be.[5] Yet all these points were forgotten in favour of the emotionally manipulative story of an unproven link to suicide, made through the use of questionable and unscientific sources. The calls that came from the makers of this documentary (and some viewers) to ban Roaccutane were reactionary and misinformed. This sort of knee-jerk reaction to an emotional story is unsurprising, but the fact that it was the aim of the documentary is discouraging to say the least. This should have been a scientific argument. Scientific evidence is how we decide if drugs get their licence. Shouldn’t the story have followed the narrative that Roaccutane usage is on the increase, and that GPs and dermatologists need a reminder that they are prescribing an incredibly potent drug to potentially vulnerable users? Maybe a call to monitor side-effects much more closely would have been sensible. Perhaps tackling a blasé attitude to the drug would have been much more productive. If nothing else, a call for more studies and more information would have been the logical thing to do. However, that’s not what BBC Three thinks its viewers want to see, and instead it peddled a heartfelt but irrelevant story to an audience it clearly doesn’t respect enough. Not everybody tunes into the BBC to watch ‘Snog, Marry, Avoid?’. Come on BBC, you can do better.

Today I read an opinion piece in Nature (by Colin Macilwain) which states that programmes that bolster STEM (science, technology, engineering and mathematics) education either distort the labour market (in favour of the employer) or are ineffective and, therefore, a complete waste of money. When you get into the (US-based) statistics of the piece you can’t help but think that there is a point, at least where wasting money is concerned: there do seem to be too many overlapping programmes that share the same goals yet try to achieve them in perfect isolation. But the whole piece is a bit ‘glass half-empty’ for my liking.

Macilwain goes on to let the reader know that the Obama administration is suggesting these programmes be consolidated and strengthened (through increased funding). “Good,” I thought, “stop wasting money and make the most of what’s available” – a pretty smart move in a time of austerity. I might yield and say that increasing the funding may be unnecessarily generous. However, it seems that Macilwain wants to take things in completely the opposite direction and believes that these types of programmes are of no use at all:

“What no one asked was whether these many activities actually benefit science and engineering, or society as a whole. My answer to both questions is an emphatic ‘no’.”[1] – Colin Macilwain

What he seems to be suggesting is that these programmes benefit solely the industry itself: by flooding the job market with new graduates, they allow the best to be cherry-picked and cordially paid a pittance for their effort, while the rest are abandoned and ultimately failed by these schemes. He might not be far off the mark here, but I do think he’s missing a wider point (more on that later).

Avoiding the beauty of learning and further education because of a lack of inspiration is a travesty. Image credit: Lricewiki via Wikimedia Commons

Luckily, Macilwain understands that these programmes were set up to involve under-represented minorities to a greater extent in STEM subjects. Furthermore, he suggests that employers who want the best should invest more in their own employees’ education and training, and then they’ll have the best. A very fair point, indeed!

Here’s the rub, though: Macilwain seems to be saying that all this money ($3 billion in the US) should be spent elsewhere, where it’s really needed, and that governments should tell kids to just do what they want as a career. In the best case, I think that he’s missing a bigger point. In the worst case, he might just be belittling the impact STEM ambassadors have on children’s lives – glibly calling the schemes “cuddly and wonderful”. The bigger point is this: does he not remember what it was like to be young and impressionable? Not every child knows what they want to do with the rest of their lives. The process by which they choose their career path is rather more complex than:

“I just wanted to be an immunologist” (quote attributed to NOBODY).

Children often take a liking to a subject because they happen to have a particularly charismatic or very passionate teacher (I know that’s what got me hooked on chemistry in the first place). What happens to the child who might be sitting on the fence between a career in, say, plumbing or physics? Perhaps they had the potential to do well at both but needed the right education to know what was best for them. Perhaps they just didn’t ‘click’ with their physics teacher and decided, by the proverbial ‘flip of a coin’, to become a plumber (no bad trade by the way – my Dad is a plumber – but that’s beside the point). That’s one fewer person who pursued a career in the sciences through a lack of education; a missed opportunity due to a lack of inspiration (or, for example, resources within a school).

STEM ambassadors (and people who work through similar programmes) can go into schools, or host events at universities and inspire young minds! This isn’t just warm and cuddly, it’s life-changing. They can be the spark to a child’s imagination that sets them down the path of science or any other STEM career for that matter. I don’t think that we should be cutting investment in STEM education programmes – though I’ll concede that maybe savings could be made – I do think that we should be looking at ways to attract more jobs. It’s not “second-guessing” the job-market, it’s actively going out and promoting it. It’s saying, we have a country that’s full of bright, well-educated minds who can work well for your company and you should set up shop here.

Some might read this and say I’m being naïve, but I’d say they’re being overly pessimistic. I’d say that this particular Nature article is guilty of that too. The penultimate point being made is that the consolidation of these disparate STEM programmes is ultimately futile – futile because (in the US at least) government ineptitude and bureaucracy will prevent resources from being transferred from one particular site to another. Come on! Let’s not just criticise a system (as valid as that criticism might be) and then just ‘walk away’ without proposing a solution to the problem. Moreover, don’t suggest that axing funding altogether is the way forward just because you happen to lack the imagination to fix the problem. The phrase “where there’s a will, there’s a way” springs to mind. What this article smacks of is a lack of willingness more than the lack of a “way”. There could be many ways in which this problem could be fixed. Why even centralise the resources at all? Surely local resources could just be managed from a central location that’s in charge of the funds?

Returning to my main point, it’s a sad day when it’s suggested that we lessen education opportunities that genuinely help people, as opposed to proposing a solution to an addressable problem. I’d like to see this become the start of some productive dialogue and not the death of a valuable resource.

A while back I wrote a book review. I had originally intended it to be published in The GIST but after a lot of thought I decided that it just didn’t feel like it belonged there. It was too mental and, frankly, a bit neurotic. I was trying too hard to be funny when I should’ve gone for informative (which is what The GIST is for). So I sat on it until I could decide what to do with it. With things on here being very quiet, I decided that this would be a fitting home for the piece (if for no other reason than this blog is pure, unadulterated me). So here it is. Enjoy! And don’t judge it too harshly; you’ve got to experiment with a few writing styles before you find out what works and what doesn’t.

The Epigenetics Revolution. A book by Nessa Carey.

My mind has just melted! Just this very moment I have closed a book, put it in my bag, sat back in a chair and – whilst looking at my coffee cup – breathed a heavy sigh; a sigh of astonishment, if you will. This sigh is the only way my body knows how to process what’s just happened. I’m flabbergasted, inspired, a little jealous but mostly awe-struck. Awe-struck by the information that I’ve been feverishly cramming into my eyes for the last few weeks. Information cramming on a scale akin to a bear cramming salmon down its throat before hibernation. The cause of this smörgåsbord of emotions is, of course, the marvellous book that I’ve just plonked back into my bag: The Epigenetics Revolution by Nessa Carey.

Front cover of your next book.

Anyone who has spent more than a passing amount of time chatting with me about most science-based subjects will realise that I quite like epigenetics. Actually, I more than simply like epigenetics. I adore epigenetics. Unfortunately for me, I adore it from a distance. I’m like the bloke in a pub who doesn’t quite know how to break the ice with a pretty girl. I think epigenetics and I could be quite good together, but I just need an ‘in’. Well, needed an ‘in’. I say needed because I’ve just finished this book. Reading it was like skipping past the awkward chat-up lines and nervous jokes and getting straight to the first all-night conversation where you get to really know that pretty girl. Essentially, I’ve just become comfortable reading about a complex area of biology. I’ve realised that epigenetics is more than just a brief whirlwind obsession; it’s a truly fascinating area of research. This is more than just a chemist ‘flirting’ with a bit of biology. It’s actually like the first all-consuming phase of any love story. Epi (we’re close now, I can call her that) is all I can think about. I’m finding epigenetic explanations for things everywhere I look. It’s changed my perspective on life. If I were still in high school I’d be declaring my undying love for this personified beauty and defacing my pencil case. But I’m not in high school, I’m a grown man, and it’s high time to stop anthropomorphising this book – and my extreme joy at reading it – and time to try and convince you to read it too.

Hopefully you’re as swept up by epigenetics as I am. However I suspect that you’re left with the distinct disadvantage of not really knowing what Epi actually is. Fear not. The box below shows the most abridged cliff notes that I could bring myself to make:

Got it? Good. It’s so much more than this but I won’t go into much more detail, primarily because Carey has done it so bloody well. What I will do however is to leave you with the information that has me hooked on this subject and then leave you to go and read this wonderful book for yourselves.

Epigenetics is the future of modern medicine. As a chemist, I (and others) look at Epi and see the potential for a new way to treat diseases. Every living cell has DNA, and it relies on that DNA to remain alive and to function properly. If cells aren’t functioning properly and, for example, turn cancerous, epigenetic therapies could quietly shut them off by altering the ways in which these cells make use of the DNA. Teams of chemists and biologists all over the world are trying, as you read this, to utilise epigenetics to treat or cure any number of diseases from diabetes through cancer to schizophrenia. To overly simplify matters, epigenetics could give us a way to obtain the effect of gene therapy without actually having to alter the genes themselves. Don’t you want to know how it actually works?

This week saw the approval of a new drug called Glybera, the first in a long-awaited class of gene therapy compounds. It’s officially classed as an adeno-associated viral vector expressing lipoprotein lipase; a non-biologist like me can only assume that the virus is responsible for locating the appropriate site within the body and then “inserting” the DNA into the appropriate cell’s DNA. Then, when the newly adjusted DNA comes to be transcribed, the appropriate enzyme will be the result (in this case lipoprotein lipase). Now, I’m not going to get into the specifics of this… well, because I’m not an expert and frankly I have a lot of reading to do before I’d feel comfortable enough to really go at it. But I will say that, in my opinion, this has been a long time coming! I thought that with the advent of the Human Genome Project these kinds of drugs would be all over the place! Sadly, this appears not to be the case.

There isn’t a crystal structure for LPL, but this is a phospholipase and I think it’s quite pretty. They’re from the same gene family (lipases), so it’s like putting in a picture of a good-looking cousin and claiming it’s you. It’ll do for now but it’s not optimal!

The one big thing I will talk about, though, concerns the recent interaction I’ve had with other bloggers on here. Since deciding to make more of an effort with my blog, I decided to have a look at other writers out there in the big bad blogosphere. The first, second and third pieces that I came across all appeared to be engaging in some serious scare-mongering (although, curiously enough, each one said the same thing almost verbatim – that in itself is something I’m sure I could talk about). Does the community at large (and I ask this especially of non-scientists) think that we’re all going to turn into genetically modified super-humans? Or, more to the point, mutants? Do people really think that when a drug hits the market, scientists (and their marketers alike) have some magical power to make them take this drug? What is there to be afraid of? If you don’t trust something, don’t take it!

Let’s remember that this drug has just been approved, and if anything untoward does happen, you can bet your ass that it’ll be whipped off the market at break-neck speed. This leads me to my final point. This drug has only been approved for people whose genes don’t produce lipoprotein lipase naturally. So what this drug is doing is allowing their bodies to produce a naturally occurring enzyme that will prevent them from developing a potentially fatal disease. This is not the beginning of the real X-Men!

Fritz Haber was a monster. Yet he’s, arguably, one of the most successful chemists of all time. In 1918 he was given one of the highest accolades that a chemist can receive: the Nobel Prize. Jump forward in time and we see how significant his work is even today, as high school pupils routinely learn about the Haber process (pictured below). In fact, a third of the world’s population owes their lives to the fertilisers made from the products of this very reaction. I myself use ammonia and nitric acid frequently – something that wouldn’t be possible without this particular monster’s chemistry. Any scientist today would give their left arm to have such a legacy. However, I think you’d be hard pressed to find anyone who admires the man, and this brings me to my point. Knowing that science is often used for questionable purposes, are we as scientists responsible for the unforeseen outcomes of our work? Just because it could be used for sinister purposes, should we suppress the knowledge gained from original yet perhaps risky research?

The Haber Process…pretty, ain’t it?
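For reference, the reaction those high school pupils learn is the reversible, exothermic combination of nitrogen and hydrogen over an iron catalyst:

```latex
\mathrm{N_2(g) + 3\,H_2(g) \;\rightleftharpoons\; 2\,NH_3(g)}
\qquad \Delta H \approx -92\ \mathrm{kJ\,mol^{-1}}
```

High pressure pushes the equilibrium towards ammonia, which can then be oxidised to give the nitric acid and nitrate fertilisers mentioned above.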

These are questions that are still unanswered (and perhaps unanswerable) even today. Genetically modified foods (and their potential risks) are ubiquitous in the news. Large agrochemical companies, even as they try to fight famine, produce ever more sophisticated pesticides that may actually drive the evolution of resistance in pests, much in the way that antibiotics have driven resistance in bacteria. Petrochemical companies, whilst striving to meet the world’s insatiable oil demands, are, at least in part, contributors to global warming. This question of ethics has dogged me for years. After a heated “debate” with a friend, I thought I had clearly shown that science is solely knowledge and that the problem rests with the people who abuse that knowledge, not the scientist who slaved away to uncover its mystery. Of course, the original motive behind this argument was to justify my place in the world and redeem myself as a proud scientist. But the more I think about Fritz Haber, the more I question that stubborn stance. In fact, chemists looking to answer this question themselves probably need look no further than the aforementioned monster.

Haber was described as having “…lived for science, both for its own sake and also for the influence it has in moulding human life and human culture…”[1] He was a man who undertook his science for the greater good – a most honourable prospect in anyone’s eyes and, eerily, something I aspire to emulate. However, Haber is also the very man who abused his knowledge of science to initiate the gas attacks of World War I. The Haber process was used to make key components of the explosives used in that horrendous war. He also led the team that developed the precursor to Zyklon B, the gas which the Nazis used in concentration camps.

So what are we to learn from the life of Haber? Are we really responsible for every repercussion of our work and, if so, do we censor our discoveries? I’m coming around to the idea of fuller ownership of my work, but I’m still not convinced that I’m responsible for every repercussion. I’m certainly not responsible in the way that Haber was. After all, he was the one who corrupted his own science, and I certainly don’t plan on doing that. I am, however, starting to realise that chemistry – and science as a whole – is corruptible, and that I should be more aware of the seemingly hidden repercussions of my work. Maybe I should devote some time to limiting the damage it may do. I’m an aspiring medicinal chemist; maybe my compounds could have unintended side-effects, and perhaps I am responsible for those. As this unending argument rages on in my head I’m left with another question. At a time when open access journals are becoming ever more popular and the scientific community at large fights against all forms of censorship, how could I even begin to go about limiting the negative outcomes of my own work? I for one hate the idea that scientific discoveries could be locked away by a government agency under the guise of “the greater good”, only to be dished out to people it deems appropriate. In this open access age, how could I even begin to stop my (at present hypothetical) scientific discoveries from being hijacked by others with ill intentions? Sadly, I don’t have the answers to all these questions, and I definitely don’t think they’re going to be easy to come up with. What I can be sure of, though, is that at least I’m not going to betray my science and go the way of Fritz Haber.

A while back I wrote an article for a science communication competition. Nothing came of it and, to be honest, I had completely forgotten about it. Recently, however, I rediscovered it and thought that this might be a fairly decent place to put it. Let me know what you think.

When’s a bull not a bull?

Science (noun, pronounced: /ˈsʌɪəns/), from the Latin word scientia, meaning knowledge. Science is something that the majority of us read about daily, even if we don’t notice it. Most of us take for granted that what we are reading is well researched and truthfully conveyed. I used to believe this was the case until, quite recently, I turned my hand to a bit of light science communication myself. When I began the process of research I noticed that mainstream science articles are plagued with varying levels of… something… a word I probably shouldn’t use in print. (A word defined as vulgar slang meaning “stupid or untrue; nonsense”.) Instead of using this slang, I will refer to it by another word. Maybe we should paint a picture of it: a feisty animal, let’s say a bull. Science communication in the modern age is, unfortunately, riddled with varying levels of bull. (It has a nice ring to it, wouldn’t you say?)

"I've told you before. Stay out of that shop and away from science!"

Here’s a light example. This story is based on fact and is not entirely incorrect. What you see is a slight twisting of the facts, but that distortion is about as welcome as our bull running free in a china shop named Truth.

The piece centred on a story reported by many newspapers as “scientists keep beer fresher for longer”. Many papers stated that if you drank beer, you could now take your time over it, savour it, enjoy it, safe in the knowledge that science would keep it from going stale. All the scientists had to do was make it less acidic and keep it cool. The problem with this piece was its over-sensationalised nature; by anyone’s standards it was probably a ‘non-story’. It suggested that once your pint was poured you had hours to drink it. This inaccuracy lets our bull charge through the entire article, leaving the misinterpretation of poorly conveyed facts in its wake. The scientific paper was actually concerned with the analytical profiling of the compounds that make beer taste stale. Its authors did suggest that making the beer more acidic would make it go off faster. However, the converse wasn’t mentioned, making the newspapers’ hypothesis speculative at best. The paper was really providing scientists with a method to measure and follow the ‘freshness’ of beer while it was being stored; it was an analytical tool.

Keeping it cold keeps it fresh... who knew?

You may be thinking that there is no harm in a mild misrepresentation of the facts because it was just a harmless little article about beer. However, that misses the entire point of science and its communication; this is an insidious problem. Our bull has a temper, and he snaps in the blink of an eye. There are people who don’t question everything they read, and to them the written word is sacrosanct. If the media lie, or even accidentally misrepresent the truth, it won’t be long before these people stop believing anything scientists and communicators have to say. Remember the words “MMR vaccine“?

How do we deal with our bull? First of all, we need to identify the cause of the problem. Communicators are too concerned with writing stories that sell papers. Perhaps those who are genuinely concerned with relating the truth are just too far removed from modern science to properly grasp some ideas – even former scientists themselves. Understanding science is like any skill set: if you don’t use it, you lose it. Scientists are also at fault; all too often they only have to inform their funding bodies that their science is being communicated (thus, hopefully, securing more funding), regardless of how truthfully it was conveyed. We rarely take the time to report outside of peer-reviewed journals or make any effort to engage the masses. When we see our work mildly misrepresented, too few of us actually do something about it. We just frown at our ill-informed press office.

It appears that the attitudes of a few key people need to change. Scientists need to adopt a longer-term view and vigorously defend the truth of their research, even if it limits how far it reaches, while science communicators need to recognise their own shortcomings and admit that, sometimes, they need the help of the scientific community to fully understand what’s going on. Perhaps both should train the next generation of budding researchers to engage the public effectively? Both need to talk to their press office. Above all, both parties need to ensure that what’s being communicated is 100% accurate. We need to cut out this nonsense, we need to eradicate public doubt, we need to focus on the truth… our bull needs the snip!