It’s easy to believe something when a scientific study suggests it’s true. But what we often don’t know is where that study came from — and that’s a problem, experts say.

“Predatory journals” are fake academic journals on the internet, and they’re a serious issue for both scientists and everyday people. These journals publish shoddy research online, often without any peer review. In some cases, they also publish legitimate research from scientists who were duped into thinking the journals were the real deal — but unlike real journals, predatory ones don’t vet, review or edit work before publishing it.

“I would not trust a study in a predatory journal. We want to know that researchers are published in a journal of high quality,” Jason Schmitt, an associate professor of communication and media at Clarkson University, said by phone. “In a world of fake news, this problem deserves to have some light shed on it.”

Tricks and treats

About 420,000 research papers were published across roughly 8,000 predatory journals in 2014. That’s up from roughly 53,000 papers published in 2010, according to a study in BMC Medicine. Schmitt says the problem really got worse in the internet age, starting around 1999 to 2001, and it’s a symptom of the highly competitive world of academia. Typically, researchers have to compete for coveted slots in well-known journals, while associate professors who want tenure need to publish a lot — and fast.

“In my view, it’s the tenure process that’s pushing academics and scientists toward predatory publishing,” Schmitt said. “They’re hoping they can pass their papers onto their tenure committee, and they will trade that $3,000 or $5,000 fee [paid to the predatory journal] for publishing to a $150,000 tenure job for life.”

Professors gunning for tenure need a large body of published work, and landing that work in prestigious journals certainly increases their chances. Schmitt said that such high-impact journals “turn away an extremely large amount of really good research,” and that’s often after researchers wait six to eight months just to hear back from them. Considering that research has a shelf life (and tenure has deadlines), predatory journals are — as the New York Times put it — a “classic case of supply meeting demand.”

But even though publishing in a predatory journal can certainly be intentional, some scientists are simply lured into the trap.

Predatory journals often adopt names similar to those of famous outlets, a practice the academic community calls “hijacking.” For instance, Open Access Emergency Medicine — a legitimate journal from Dovepress — was hijacked by a fake journal called OA Emergency Medicine. These predatory journals are also known to contact researchers directly with offers to publish their work, and their websites sometimes list real people’s names lifted from the editorial boards of legitimate journals.

“Certainly, not everybody who publishes in a predatory journal has been duped, but the evidence would strongly suggest that some people are,” David Moher, a research scientist at the Ottawa Hospital Research Institute, said by phone. “Post-docs to very senior investigators have been duped. It clearly goes on.”

Early-career researchers and those from developing countries could be particularly vulnerable, Schmitt says — which makes sense given language barriers, a lack of education on the subject and the similar-sounding journal names.

But Schmitt is skeptical of most claims of being tricked these days, especially since there are several well-known lists of suspicious journals.

“I don’t think that most scholars can just say that they were duped into the predatory publishing system in 2017,” he said. “The same people have this massive track record of being analytical, organized thinkers as an academic and researcher.”

The depth of the problem

The bar is low for predatory journals. Researchers pay a fee and are sometimes published soon after — so quickly that there was likely no review process at all.

But the scientific community is still trying to define what exactly constitutes a predatory journal and how bad the problem really is. As it stands, there is no singular, official list of predatory journals to be wary of.

In 2013, journalist John Bohannon decided to see how bad things were. He sent a fake scientific paper authored by a nonexistent biologist at the nonexistent Wassee Institute of Medicine to 304 open-access journals. More than half of the journals accepted it, despite its “fatal flaws.”

“Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately,” Bohannon wrote in Science. “Its experiments are so hopelessly flawed that the results are meaningless.”

Then, in 2015, researchers created a fake scientist named Anna O. Szust (“Oszust” is Polish for “a fraud”) and sent applications on her behalf to 360 journals. Their hope was to see which journals recruited her onto their editorial boards.

Szust’s resume included fake degrees, fake book chapters and a link to a faculty page at the Adam Mickiewicz University’s Institute of Philosophy. Many of the journals responded positively to her application, and 48 of them even accepted it. Eight of these journals were part of the Directory of Open Access Journals and 40 of them were on Beall’s list of “potential, possible or probable” predatory journals.

In many cases, predatory journals asked Szust to pay or donate money in order to accept her position on the board. Others asked her to help lure in research for the journals to publish, sometimes in return for a portion of the profits.

The only good news was that none of the 120 recipient journals listed on Journal Citation Reports — an authoritative tool that measures the impact of research — accepted Szust’s application.

These experiments certainly show the depth of the problem, but there’s still a lot of research needed on unreliable journals and what to do about them. Moher, who recently published research identifying the typical characteristics of predatory journals, is calling for more funding to fight the problem.

“Although this is not yet a massive problem, it’s a growing problem,” Moher said. “Funders need to put money on the table to enable people like me and other groups around the world to investigate this.”

In the meantime, the societal harm is obvious: Bad science will continue to circulate online, and the academic community may end up with scientists claiming credit they don’t deserve — or researchers cheated out of their perfectly good work.