10/11/2013

If you're at all interested in scientific publishing, you've
probably heard about the open-access "sting operation" published in
Science magazine this past week.
To test the rigor of the peer-review process, reporter
John Bohannon sent out spurious research papers with fabricated lists of
authors to a selection of 304 open-access journals. (Links to the datasets are available here.)

mBio was one of
them.

As Bohannon indicated in the supplemental material, mBio rejected the paper. In fact, mBio summarily
rejected the manuscript without review in a letter sent to the author two days
after submission.

However, 157 of the targeted journals DID accept the paper
for publication, highlighting the ease with which shoddy science can gain the
appearance of legitimacy on the seedy side of open access (for examples, see
Beall’s list of “predatory open access publishers”).

The sting has been the subject of heated debate among
scientists, publishers, librarians, and journalists all week, many of whom have raised
questions about the reliability of peer review, quality control in scientific
journals, and what constitutes a reliable source of scientific information. But
amidst all the hubbub, I wanted to hear from our own Editor in Chief, Arturo
Casadevall. I asked Arturo what he thought of the article, what it means for mBio and other open-access journals, and
what he would say to our readers and authors about the sting.

What do you think
about the piece?

I think that anything
that sheds light on the workings of science is useful and helpful. I do worry, however, that the piece
does not discriminate among open-access journals. Some open-access journals, such as mBio and the PLOS family of journals, practice strict peer review, and in that respect they are no different
from traditional subscription journals.

Did any of the
results surprise you?

Not really. The proliferation of journals that seek to
make a profit by publishing articles has created a "wild west"
situation, such that there is great variability in the rigor of
open-access journals. However, as I
stressed above, there are major differences among open-access journals with
regard to their review policies, and it is important not to paint them all with
the same brush.

What does the work
mean for open-access journals that conduct rigorous peer review, if anything?

I don’t think this
means anything for those journals that carry out rigorous peer review. The fact of the matter is that scientists
know which journals do and which don’t.
Hence, I don’t think that this will have any impact on mBio.
However, I suspect that for those journals that do not carry out
rigorous peer review there could be
consequences, such as further diminished credibility in the scientific community.

Do you think the
"sting" will drive authors or readers away from open access journals?

No. I believe authors are quite savvy, and they
know which open-access journals are legitimate and which are not. Legitimate open-access journals are already established,
and this will have no consequences for them.
However, it is important to make sure that the popular media understand that there are major distinctions among open-access
journals when they cover these types of stories.

What would you say to
open-access readers or authors about the outcome of Science's experiment?

Science’s experiment
is not relevant to mBio. mBio is a
society-run journal with a very distinguished editorial board whose members
work very hard to make sure that only the best science is published. Papers submitted to mBio are subjected to peer review at many stages,
from the Editor in Chief to members of the Editorial Board to peer
reviewers. mBio operates like traditional established journals with regard to peer
review, so the outcome of Science’s
experiment is not relevant to our journal.