This is Steven Salzberg's blog on genomics, pseudoscience, medical breakthroughs, higher education, and other topics, including skepticism about unscientific medical practices. Here's where I can say what I really think about abuses and distortions of science, wherever I see them.

The new Star Trek: Discovery series is based on a massive scientific error. Can it survive?

It didn't have to be this way. Those of us who have followed Star Trek through its many TV series and movies, including the excellent trio of recent films (2016's Star Trek Beyond is the latest), were eager to jump on board the newest show, Star Trek: Discovery.

Despite some rather uneven acting in the pilot, I was willing to give it a chance. So were millions of other Star Trek fans.

But alas, the writers have stumbled into a scientific error so egregious, and so entangled in the entire plot line, that I fear the new Star Trek cannot recover. (Note: a few mild spoilers ahead.)

Episodes 4 and 5, released on October 8 and 15, revealed that the USS Discovery, the ship that the series revolves around, has an advanced form of transport that allows it to travel anywhere in the universe instantaneously. Unlike all previous Star Trek transport tech, this one uses a biological mechanism, based on mushrooms.

Yes, you read that right. The DASH (Displacement Activated Spore Hub) drive uses mushroom spores as its power source. They've discovered a special fungus whose root system extends "throughout subspace" all over the galaxy. Using spores from this fungus, the ship can jump into subspace (or something like that) and jump out somewhere else in real space, light years away, in a matter of seconds. As bizarre as this sounds, the worst is yet to come.

To power its DASH drive, the Discovery maintains a large greenhouse full of spore-producing mushrooms. (Mycologists might love this, but how big a fan base can they be?) The problem for the Discovery, in the first few episodes, is that the experimental drive will only let them jump short distances.

Then they discover the tardigrade. Tardigrades are a real thing: they are microscopic animals, only about 0.5 millimeters long, that live all over the planet.

They are also surprisingly cute for a microscopic animal, and they are colloquially known as water bears, moss piglets, or space bears. "Space bears" comes from their ability to survive in extreme environments, possibly including interplanetary space.

Star Trek Discovery's tardigrade is, shall we say, rather different. It resembles its microscopic cousins, but it's the size of a large grizzly bear, incredibly strong, and extremely fierce. On the show they call it a "giant space tardigrade."

But that's not all. Thanks to a unique biological property that the show's writers apparently misunderstood, the space tardigrade can access the mushroom network to travel throughout the universe, wherever and whenever it chooses. Here's the explanation we're given:

"Like its microscopic cousins on Earth, the tardigrade is able to incorporate foreign DNA into its own genome via horizontal gene transfer. When Ripper [the space tardigrade] borrows DNA from the mycelium [the mushroom], he's granted an all-access travel pass."

And just like that, not only the tardigrade but the entire spaceship can jump across the galaxy. Is this sounding a bit crazy? It should.

Horizontal gene transfer (HGT) is a real thing: a process through which bacteria sometimes take up DNA from the environment and integrate it into their own genomes. Animals can't do HGT, but rather infamously, a paper published in December 2015 made the bold claim that tardigrades had a unique ability to absorb all kinds of foreign DNA. That paper was instantly controversial in the scientific community, and not surprisingly its findings were being disputed in the Twittersphere within days of its appearance.

To its credit, the same journal (PNAS) that published the bogus HGT claim published a second paper just a few months later showing that tardigrades do not absorb foreign DNA into their genomes. That paper, plus a third, showed that the original study had mistakenly identified contaminating DNA as part of the tardigrade's own genome. This rapid correction of the record was a win for science; I've used the episode in my undergraduate class to demonstrate how sloppy science (the first paper) can lead one astray.

So: a minor scientific controversy, quickly debunked.

Until, that is, one of the Star Trek writers got their hands on it. Apparently someone on the writing staff heard the tardigrade story, perhaps someone who'd had a bit of biology in college (I'm guessing here), and got so excited that they turned it into a wildly implausible premise for an intergalactic space drive.

The idea of using horizontally transferred DNA for space travel is so nutty, so bad, that it's not even wrong. Even if tardigrades could absorb foreign DNA (they can't), how the heck is this supposed to give them the ability to tap into the (wildly implausible) intergalactic spore network? DNA that's been taken up through HGT isn't connected to the source any longer. This is no more plausible than asserting that people could connect to the mushroom network by eating a plate of mushrooms. And how would the space-traveling tardigrade take the entire ship with it? Are we supposed to assume it's creating some kind of mushroom-DNA field?

And now the entire series seems to be based on a combination of magic (an intergalactic mushroom network in subspace) and scientific errors (horizontal gene transfer by tardigrades).

I can't watch this nonsense. I'm willing to suspend disbelief for the sake of a good story (warp drive!), but I can't accept obviously bogus claims. I don't know how the Star Trek writers can get themselves out of this one, but if they don't, then Star Trek Discovery is finished. If they're reading this, here's my plea: ditch the DASH drive and find something to replace it–and for god's sake, hire a competent science consultant.

Should you be on statins? New guidelines and an online calculator may allow you to answer this question yourself.

Back in 2011, I asked whether we should all be on statins. At the time, it was clear that statins offered benefits for people who had already suffered heart attacks or other serious cardiovascular problems. But for the rest of us, it wasn't clear at all. A number of studies had been published suggesting that millions more people (in the U.S. alone) might benefit from statin therapy, but most of those studies were funded by the drug companies that made statins. As I wrote at the time, "we need more data from completely unbiased studies."

So has anything changed? Actually, it has. Last year, the U.S. Preventive Services Task Force (USPSTF) reviewed all of the evidence and updated its 2008 recommendations. The evidence now suggests that some people–even those who have never suffered a heart attack–would benefit from statins. Specifically, the USPSTF says you might benefit if:

you are between 40 and 75 years old, AND

you have one or more "risk factors" for cardiovascular disease (more about this below), AND

you have a 10-year risk of cardiovascular disease (CVD) of 7.5% or greater, using a "risk calculator" that I'll link to below. (In the 7.5%-10% range, the expected benefit is small.)

Now let's look at those risk factors for CVD. There are four of these, and any one of them puts you in the category of people who might benefit from statins: diabetes, high blood pressure (hypertension), smoking, or dyslipidemia.

Most people already know their status for the first 3, but "dyslipidemia" needs a bit more explanation. This is simply an unhealthy level of blood cholesterol, defined by USPSTF as either "an LDL-C level greater than 130 mg/dL or a high-density lipoprotein cholesterol (HDL-C) level less than 40 mg/dL." You can ask your doctor about these numbers, or just look at your cholesterol tests yourself, where they should be clearly marked.

For that last item, how do you calculate your 10-year risk of CVD? Most people should ask their doctor, but if you want to see how it's done, the calculator is at the American College of Cardiology site here. It's quite simple, and you can fill it in yourself to see your risk.
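The screening checklist above is simple enough to sketch in a few lines of code. This is purely illustrative, not medical software: the function names are my own, the 10-year risk number has to come from the ACC calculator itself, and the 40-75 age range is the population the USPSTF guideline covers.

```python
# Illustrative sketch of the USPSTF screening checklist described above.
# Not medical advice; the 10-year risk must come from the ACC risk calculator.

def has_dyslipidemia(ldl_mg_dl, hdl_mg_dl):
    """USPSTF definition quoted above: LDL-C > 130 mg/dL or HDL-C < 40 mg/dL."""
    return ldl_mg_dl > 130 or hdl_mg_dl < 40

def may_benefit_from_statins(age, ten_year_cvd_risk, diabetes, hypertension,
                             smoker, ldl_mg_dl, hdl_mg_dl):
    """All three criteria must hold: age 40-75 (the population the guideline
    covers), at least one risk factor, and a calculated 10-year risk >= 7.5%."""
    risk_factors = [diabetes, hypertension, smoker,
                    has_dyslipidemia(ldl_mg_dl, hdl_mg_dl)]
    return (40 <= age <= 75
            and any(risk_factors)
            and ten_year_cvd_risk >= 0.075)

# Example: a 55-year-old with hypertension and a calculated 9% 10-year risk
print(may_benefit_from_statins(55, 0.09, diabetes=False, hypertension=True,
                               smoker=False, ldl_mg_dl=120, hdl_mg_dl=50))
```

Note that this just encodes the checklist; the hard part, estimating the 10-year risk itself, is what the ACC calculator does.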

A big caveat here, as the USPSTF explains, is that the "risk calculator has been the source of some controversy, as several investigators not involved with its development have found that it overestimates risk when applied to more contemporary US cohorts."

Another problem that I noticed with the risk calculator is that using it for the statin recommendation involves some serious double counting. That's because the risk calculator relies in part on your cholesterol levels and blood pressure, but those same measurements are considered to be separate risk factors for CVD. This puts a lot of weight on cholesterol levels–but on the other hand, statins' biggest effect is to reduce those levels.

The USPSTF is a much more honest broker of statin recommendations than industry-funded drug studies, so we can probably trust these new guidelines. Note that if the risk calculator puts you in the 7.5%-10% range, you will only get a very small benefit from statins–as the USPSTF puts it, "Fewer persons in this population will benefit from the intervention."

As one critic of mass statin prescribing put it:

"If I was taking a tablet every day for the rest of my life, I would want to know how long I would have extra to live. If you take statins for five years and you are at higher risk, then you reduce the risk of a heart attack by 36%. But if you rephrase the data, this means on average you will have an extra 4.1 days of life."

So no, we shouldn't all be on statins. But until something better comes along (and I hope it will), they are worth considering for anyone who is in a higher-risk group for cardiovascular disease.

Have you heard that serving your food on smaller plates will make you eat less? I know I have. I even bought smaller plates for our kitchen when I first heard about that study, which was published in 2011.

And did you know that men eat more when other people are watching? Women, though, behave exactly the opposite: they eat about 1/3 less when spectators are present. Perhaps guys should eat alone if they're trying to lose weight.

Or how about this nifty idea: kids will eat more fruits and vegetables at school if the cafeteria labels them with cool-sounding names, like "x-ray vision carrots." Sounds like a great way to get kids to eat healthier foods.

Hang on a minute. All of the tips I just described might be wrong. The studies that support these clever-sounding food hacks all come from Cornell scientist Brian Wansink, whose research has come under withering criticism over the past year.

Last week, Stephanie Lee at Buzzfeed wrote a lengthy exposé of Wansink's work, based on published critiques as well as internal emails that Buzzfeed obtained through a FOIA request. She called his work "bogus food science" and pointed out that

"a $22 million federally funded program that pushes healthy-eating strategies in almost 30,000 schools, is partly based on studies that contained flawed — or even missing — data."

Let's look at some of the clever food hacks I described at the top of this article. That study about labeling food with attractive names like "x-ray vision carrots"? Just last week, it was retracted and replaced by JAMA Pediatrics because of multiple serious problems with the data reporting and the statistical analysis.

The replacement supposedly fixes the problems. But wait a second: just a few days after it appeared, scientist Nick Brown went through it and found even more problems, including data that doesn't match what the (revised) methods describe, as well as duplicated data.

How about the studies that showed people eat more food when others are watching? One of them, which found that men ate more pizza when women were watching, came under scrutiny after Wansink himself wrote a blog post describing his methods. Basically, when the data didn't support his initial hypothesis, he told his student to go back and try another idea, and then another, and another–until something came up positive.

This is a classic example of p-hacking, or HARKing (hypothesizing after results are known), and it's a big no-no. Statistician Andrew Gelman took notice of this, and after looking at four of Wansink's papers, concluded:

"Brian Wansink refuses to let failure be an option. If he has cool data, he keeps going at it until he finds something, then he publishes, publishes, publishes."

Ouch. That is not a compliment.
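To see why "keep testing until something comes up positive" is such a problem, here's a quick back-of-the-envelope simulation (my own illustration, nothing from Wansink's papers): when every hypothesis you test is actually false, p-values are uniformly distributed between 0 and 1, so fishing through 20 of them at the 0.05 significance level will still turn up at least one "significant" result most of the time.

```python
# Simulate a researcher testing 20 unrelated null hypotheses per study.
# Under the null, each p-value is uniform on [0, 1], so the chance that
# at least one falls below 0.05 is 1 - 0.95**20, about 64%.
import random

random.seed(42)

def at_least_one_false_positive(n_tests=20, alpha=0.05):
    """True if any of n_tests null p-values happens to fall below alpha."""
    return any(random.random() < alpha for _ in range(n_tests))

trials = 10_000
hits = sum(at_least_one_false_positive() for _ in range(trials))
print(hits / trials)  # expect roughly 0.64
```

In other words, a lab that runs hypothesis after hypothesis on the same data will "discover" something in nearly two studies out of three even when there is nothing to find.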

Soon after Gelman's piece, scientists Jordan Anaya, Tim van der Zee, and Nick Brown examined four of Wansink's papers and found 150 inconsistencies, which they published in July in a paper titled "Statistical Heartburn: An attempt to digest four pizza publications from the Cornell Food and Brand Lab." Anaya subsequently found errors in six more of Wansink's papers.

It doesn't stop there. In a new preprint called "Statistical infarction," Anaya, van der Zee and Brown say they've now found problems with 45 papers from Wansink's lab. Their preprint gives all the details.

"Until Wansink can explain exactly what happened, no one should trust anything that comes out of his lab."

In response to these and other stories, Cornell University issued a statement in April about Wansink's work, saying they had investigated and concluded this was "not scientific misconduct," but that Cornell had "established a process in which Professor Wansink would engage external statistical experts" to review many of the papers that appeared to have flaws.

And there's more. Retraction Watch lists 14 papers of Wansink's that were either retracted or had other notices of concern. Most scientists spend their entire careers without a single retraction. One retraction can be explained, and maybe two or even three, but 14? That's a huge credibility problem: I wouldn't trust any paper coming out of a lab with a record like that.

But how about those clever-seeming food ideas I listed at the top of this article? They all sound plausible–and they might all be true. The problem is that the science supporting them is deeply flawed, so we just don't know.

Finally, an important note: Brian Wansink is a Professor of Marketing (not science) in Cornell's College of Business. He is not associated with Cornell's outstanding Food Science Department, and I don't think his sloppy methods should reflect upon their work. I can only imagine what the faculty in that department think about all this.