Sting Operation Exposes the Dark Side of Science

Doctors, patients, policymakers and the general public are questioning the integrity of science as fewer people trust journalism and science journals. One 2014 poll found that trust in media dropped 5 percentage points to 52%. The trend holds globally: trust in media declined in approximately 80% of the countries surveyed.1

A related issue is open-access journals. While many are legitimate, others are under fire for conducting little, if any, peer review of the scientific research they accept for publication.

Trust in Scientists, Science Journalists Lacking

The public is losing faith in scientists and science journalists, according to a new online survey conducted Dec. 6-7, 2013 among 1,000 U.S. adults. Only 36% of respondents said they had “a lot” of trust that information from scientists is accurate and reliable. Most people (51%) said they trusted the information “only a little.” Six percent had no trust at all.

Only 12% said they had “a lot” of trust in science journalists to get the facts right. In fact, 26% of adults said they have no trust in the accuracy of scientific studies reported by journalists. Fifty-seven percent said they had “a little bit of trust.”

Overall, 82% said they believe scientific findings are often or sometimes influenced by corporate sponsors.2

Fake Science Accepted by Open-Access Journals

Waning trust should come as no surprise as publishers struggle to uphold credibility. In 2013, Science contributing correspondent John Bohannon conducted a sting operation to test how many online open-access journals would accept a bogus scientific paper through their peer review systems. More than half did.

Between January and June 2013, he submitted a deliberately faked research article to 304 open-access journals whose scope matched the paper’s subject.

The paper was riddled with errors in the methods, data and conclusions that should have been noticed and rejected.

“Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately,” Bohannon wrote.

The fake research claimed that a new cancer drug containing a chemical extracted from lichen was effective in killing cancer cells. However, it was not tested in healthy cells, so there was no way to know whether the drug was simply toxic to all cells. Another red flag was that the paper included a graph of data that contradicted what was stated in the text. Lastly, the paper was written as if by someone whose first language is not English and carried fictitious African-sounding author names from invented institutes. Bohannon chose this framing because the idea for the sting stemmed from a complaint by an African biologist who felt she had been scammed by a fee-charging open-access publisher; he wanted to replicate her experience.

The paper was accepted by 157 online journals, including the Journal of Experimental & Clinical Assisted Reproduction, the Journal of International Medical Research, the Journal of Natural Pharmaceuticals and Drug Intervention Today.

It was rejected by 98, including PLOS One. Only 36 of the rejections addressed the paper’s scientific flaws. Acceptance took an average of 40 days, while rejection took an average of 24 days. The other 49 journals in the experiment were either defunct or still reviewing the paper at the time of publication.

How Do Online Journals Differ from Subscription-Based Journals?

Bohannon recognizes that open-access scientific journals are a growing sector, making science more accessible to readers than ever. Online journals charge authors to publish their research, building profitability on volume, but they have long been criticized for poor quality control in peer review.

Among Bohannon’s inquiries, the online open-access journals invoiced the fictitious authors for up to $3,100.

In contrast, print journals limit how many articles they can publish and therefore screen for the highest-quality research. Moreover, these journals rely on fees from traditional subscriptions, so the number of papers they accept does not determine profitability.

Bohannon noted a geographic pattern in the substandard peer review he found among the open-access journals. These journals often obscure the identities and locations of their editors, publishers and bank accounts, but he was able to trace their roots during the sting operation.

“Invoices for publication fees reveal a network of bank accounts based mostly in the developing world,” he said. More than 30% of the online journals targeted in the sting were based in India, the world’s largest base for open-access publishing. Sixty-four of the India-based journals accepted the fake paper, while only 15 rejected it.

The United States was the next largest base for open-access publishing. Twenty-nine of the U.S.-based publishers accepted the fake paper, while 26 rejected it. Even well-known publishers, including Elsevier, Wolters Kluwer and Sage, published the fake paper in the sting.

Critics, Publishers Respond to Sting Evidence

Bohannon’s investigation drew an abundance of responses. Critics called it disappointing because the “study” included no control group of subscription-based journals, and problems with the peer review process can be found throughout the publishing industry.

“The lack of a control means that it is impossible to say that open access journals, as a group, do a worse job vetting the scientific literature than those operating under a subscription-access model,” blogs Phil Davis, an independent researcher and publishing consultant.

Davis said the real test will be whether publishing organizations and directories react to Bohannon’s conclusions and perform some quality control.

“If they are to uphold their credibility, they will need to censure and delist the offenders until they can provide evidence that they are abiding by the guidelines of their organization,” Davis wrote. “This means stripping these publishers of the logos many display proudly on their web pages.”3

A number of publishers and publishing organizations have indeed responded to Bohannon’s research.4,5

One Croatia-based open-access publisher, InTech, announced the cancellation of the International Journal of Integrative Medicine, which was exposed by the investigation.6

The Open Access Scholarly Publishers Association (OASPA) took its own action. OASPA said “although the data undoubtedly support the view that a substantial number of poor-quality journals exist, and some certainly lack sufficient rigor in their peer review processes, no conclusions can be drawn about how open access journals compare with similar subscription journals, or about the overall prevalence of this phenomenon.”7

After working with the few OASPA members who accepted the article, the organization found that “there was a lack of sufficient rigour in editorial processes at all three of the journals in question,” resulting in the termination of two of the memberships and placing the third (Sage’s Journal of International Medical Research) under review.8

Sage also responded. Editor-in-chief Malcolm Lader said, “I take full responsibility for the fact that this spoof paper slipped through the editorial process. The publishers requested payment because the second phase, the technical editing, is detailed and expensive.”9,10

According to Sage, this specific journal operates under a two-stage review process. “First the Editor performs an initial review of a submission to check that it is within scope and to screen for methodological soundness and basic standards of research quality,” Sage said in a statement. The second stage, a detailed technical edit, is “undertaken by at least two experienced medical technical editors.”

“We are extremely concerned that a paper with fundamental errors got through the initial stage but are confident that the technical edit would have revealed the errors and led to its rejection,” the statement said. The publisher also said it plans to strengthen the initial peer review stage for the journal.11

Sage also said it welcomes and will cooperate with OASPA’s six-month review.12

Bohannon said the biggest takeaway is that one cannot conclude from these black sheep that the open-access business model is entirely bad; rather, the model has flaws and needs work.