Woohoo! I no longer need to feel guilty about all the single-use packaging my groceries are in!

Any images of plastic melting into water and carbon dioxide were swiftly dispelled, however, when I read the article itself:

The new research was spurred by the discovery in 2016 of the first bacterium that had naturally evolved to eat plastic, at a waste dump in Japan. Scientists have now revealed the detailed structure of the crucial enzyme produced by the bug. The international team then tweaked the enzyme to see how it had evolved, but tests showed they had inadvertently made the molecule even better at breaking down the PET (polyethylene terephthalate) plastic used for soft drink bottles…

So the enzyme that eats plastic bottles already existed, and was discovered in 2016. What the scientists accidentally did was make it better at eating plastic bottles.

How much better is the new mutant enzyme at breaking down PET (polyethylene terephthalate)?

"It is a modest improvement – 20% better – but that is not the point," said McGeehan. "It's incredible because it tells us that the enzyme is not yet optimised."

See, here’s the problem I have. The developments in the article are genuinely exciting, not for their immediate uses but because of what they suggest is possible — that we may eventually find and/or develop enzymes that can break down all plastics (not just PET) in a matter of, say, hours or days — as opposed to the centuries plastics currently take to degrade.

But none of that possibility is conveyed in a sensationalist headline that focuses on the idea of eating plastic bottles.

In fact, the Reddit post on r/science highlighting this article got flagged for a sensationalistic headline.

Screenshot in case the post disappears.

And the mod response right at the bottom.

Confusogenic Cancer Communications

You might remember this one. When the World Health Organisation (WHO)’s International Agency for Research on Cancer (IARC) released its report on the link between processed meats and colon cancer in Lancet Oncology, the news made headlines all over the world. The Guardian’s headline was especially egregious, for reasons I’ll point out in a second, but many major news outlets responded with similar headlines:

All of these headlines say that eating meat causes (or "is linked to") cancer, but the Guardian's headline claims one thing that the others do not: that processed meat is somehow as carcinogenic as smoking.

Let’s take a look at the openings of each of these articles, too. The Guardian:

Bacon, ham and sausages rank alongside cigarettes as a major cause of cancer, the World Health Organisation has said, placing cured and processed meats in the same category as asbestos, alcohol, arsenic and tobacco.

The New York Times:

An international panel of experts convened by the World Health Organization concluded Monday that eating processed meat like hot dogs, ham and bacon raises the risk of colon cancer and that consuming other red meats “probably” raises the risk as well. But the increase in risk is so slight that experts said most people should not be overly worried about it.

The Washington Post:

A research division of the World Health Organization announced Monday that bacon, sausage and other processed meats cause cancer and that red meat probably does, too.

The BBC:

Processed meats - such as bacon, sausages and ham - do cause cancer, according to the World Health Organization (WHO). Its report said 50g of processed meat a day - less than two slices of bacon - increased the chance of developing colorectal cancer by 18%. Meanwhile, it said red meats were “probably carcinogenic” but there was limited evidence.

If you look carefully, you'll notice that the New York Times, the Washington Post and the BBC talk specifically about the findings in the Lancet Oncology paper itself, which is a meta-analysis of existing studies and pretty readable even for someone without any college-level medical or biology knowledge:

A meta-analysis of colorectal cancer in ten cohort studies reported a statistically significant dose–response relationship, with a 17% increased risk (95% CI 1·05–1·31) per 100 g per day of red meat and an 18% increase (95% CI 1·10–1·28) per 50 g per day of processed meat.

The Guardian, instead, opted to talk about the WHO’s categorisation of processed meat as a class 1 carcinogen:

Overall, the Working Group classified consumption of processed meat as “carcinogenic to humans” (Group 1) on the basis of sufficient evidence for colorectal cancer.

[T]hese classifications are not meant to convey how dangerous something is, just how certain we are that something is dangerous.

Scientific Literacy, Or Lack Thereof

While it’s true that the WHO’s communications on the carcinogenicity of processed meat were pretty dang bad, I also think the overall level of scientific literacy among non-scientists is pretty poor.

This comment isn’t about how much science non-scientists know — it’s about whether non-scientists know how to read science at all. And to be fair, this isn’t something I knew to seek out for myself either — it was something that I accidentally thrust on myself.

My final semester at NYU, I took a class called Learning To Speak: First and Second Language Acquisition Of Sound. I asked the professor beforehand what the class was like, and she said it was “lots of reading”.

Easy peasy, I thought. I’m a Spanish major, I can do lots of reading. In my imagination there was some kind of textbook of language acquisition, and we’d read a chapter or two every week.

Of course that's not what happened. Every week, we read two to three papers on how people learn to speak and understand spoken language, and then we critiqued them. We discussed how well or how poorly the experiments were designed, how the subject pools may have affected the outcome, alternative interpretations of the results, and so on. (While discussing one paper with a particularly baffling choice of subjects, our professor said, "You're all young and like, 'for the science!' but maybe he had a publishing deadline and decided the data was good enough.")

Honestly, I don’t remember half of the conclusions from the papers we read, but what I took away from the class was much more valuable. I learnt how to read a scientific paper, how to look for and poke at chinks in the armour, the linguistic and statistical sleights of hand that researchers might use to shore up data that is in reality not very conclusive. It was the first time I’d been forced to pull back the curtain and actually look at how scientific knowledge is created — and therefore how solid or shaky that knowledge might be.

This is very different from how science tends to be taught up to high school. At that level, education focuses not on the experiment but on the result. High school science is about showing a grasp of principles that are already well-established, without necessarily exploring how those principles got established in the first place.

It’s no surprise, then, that when we’re watching science being written, we have no idea what to do with it. It’s not that we don’t want to see the sausage being made, necessarily — we’re not even taught what that looks like. Right through high school we’re only shown complete sausages, and given the vague impression that they come straight from the animal like this. (Okay, that metaphor died fast.)

Except maybe don’t eat sausages, because, you know, they raise your risk of colorectal cancer from 4.3% to just over 5%.
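(That jump, by the way, is just the difference between relative and absolute risk. The Lancet paper reports an 18% *relative* increase per 50 g of processed meat per day; applied to a baseline lifetime colorectal cancer risk of roughly 4.3% — the figure I'm using above — the *absolute* increase is well under one percentage point. A quick sanity check of the arithmetic, with both numbers treated as illustrative:)

```python
# Relative vs absolute risk: the arithmetic behind "4.3% to just over 5%".
# Both inputs are illustrative: 4.3% is the baseline lifetime risk figure
# used in this post; 18% is the relative increase from the Lancet meta-analysis.
baseline_risk = 0.043        # baseline lifetime risk of colorectal cancer
relative_increase = 0.18     # per 50 g/day of processed meat

absolute_risk = baseline_risk * (1 + relative_increase)
added_risk = absolute_risk - baseline_risk

print(f"lifetime risk: {baseline_risk:.1%} -> {absolute_risk:.2%}")
print(f"absolute increase: {added_risk:.2%} (under one percentage point)")
```

An "18% increased risk" headline and a "0.8 percentage point increased risk" headline describe the same finding; only one of them sells papers.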

Alternatives to the Current Model of Science Education

There's no need to throw the baby out with the bathwater here — we don't have to overhaul pre-university science education wholesale, but we probably should change the experimental part of it.

Imagine a virtual reality sandbox with its own rules of physics. Imagine the challenge of using whatever you find in this virtual reality to try and determine acceleration due to gravity, or the refraction index of a particular gemstone, or the chemical composition of an unknown liquid. You could generate a complex system with some element of randomness for teaching students about designing and conducting experiments on a population, and the different ways of massaging data to fit a desired result. There’s so much more we could do in this arena.

Right now, science journalists have the job of communicating to us what these science papers say. They don’t always do a very good job of it, and neither do the scientists.