Bringing Scholarly Communication into the 21st Century: a Workshop at the 2017 AAAS Meeting

This is a guest post by Future of Research policy activist, Adriana Bankston.

Preprints are on the rise across many scientific fields, including the life sciences. My first extensive exposure to preprints was while co-moderating the ASAPbio subgroup session at the 2016 ASCB meeting, co-organized by Prachee Avasthi, Assistant Professor at the University of Kansas Medical Center, and Jessica Polka, Director of Accelerating Science and Publication in Biology (ASAPbio). ASAPbio is a scientist-driven initiative to promote the productive use of preprints in the life sciences, and its discussions cover topics such as preprint citation and recognition of preprints for grant funding and promotion purposes. Several funding agencies, including the NIH, now encourage interim research products, highlighting the value of preprints in accelerating the distribution of scientific information.

At the 2017 AAAS meeting, preprints and open access were discussed in a session entitled “Bringing Scholarly Communication into the 21st Century.” The main themes of the session were the virtues of preprints and the problems that might arise around them, set against the broader backdrop of the digital publication age and open access publishing practices. Speakers in this session were Wendy Hall (University of Southampton), Neal Young (National Heart, Lung, and Blood Institute) and Jessica Polka (Harvard Medical School). Also participating in the session were Stuart Taylor, The Royal Society (organizer), Philip Campbell, Nature (moderator) and Michael Taylor, University of Bristol (discussant).

Wendy Hall: The Why of Open Access

Wendy Hall discussed the transition from the print to the digital age of publication and the benefits of publishing in an open access manner. She noted that the Royal Society was originally formed to let people present and discuss their work (previously done by writing letters) and to verify the facts determined by experiments. When the digital world of publication emerged, the question became how it would affect publishers, who had long made a great deal of money from scientists’ work. Nevertheless, she agreed that we need to publish in a digital format, move towards open repositories (on a global level), and utilize open peer review. For a bit of history, she recounted that eprints were the beginnings of preprints, taking the form of a full text deposited into an eprint server. As she put it in her concluding remarks, we must now embrace the digital age of publication, “get away from current metrics and also promote public peer review.”

Neal Young: Why Current Publication Practices May Distort Science

Another important point related to publication practices is how we brand individual scientists, which was brought up by Neal Young. The idea of the “winner’s curse” is that the small proportion of results that get published is unrepresentative of the actual data generated by scientists; publications therefore create a distorted view of this reality. In addition, we are now paying to publish fewer papers, and by some estimates 18–88% of what is published in science is wrong. We must therefore think about how scientific data are being judged and disseminated, and whether this accurately represents the work being done.

Jessica Polka: Promoting a Culture of Preprints

Following Wendy Hall’s remarks on open access publishing, Jessica Polka offered some thoughts on how we can move into the “beautiful world of open access” and still survive in this profession. In this context, Polka asked why preprints, the digital equivalent of publications, are still slow to gain traction in biology. Some of the reasons are below:

Lack of incentives for posting preprints. The problem of incentives for preprints is still a major one in the scientific community. It also brings up the idea of finding suitable matches of preprint editors, and thinking of ways in which we can build a new marketplace for this practice.

Trust in work quality. The issue of trust in the work being published in preprints exists because of the preconception of low quality work compared to a high impact, peer-reviewed publication. In this case we should lower the barrier and recognize that this work is not necessarily of low quality merely due to the absence of peer review, and that the work itself should be evaluated.

Fear of scooping. The fear of scooping is another major issue perceived with preprints. In this case we should increase the visibility of preprints in various ways, and also “increase the availability of information that is available for feedback” in terms of promoting open access for online publication and review. More generally, we also need to enhance the “transfer of trust between scientists” and preprints can bridge this gap.

Polka also brought up other important topics related to preprints that we should consider:

Who pays for preprint servers? One of the concerns with preprints is the cost of the servers needed to maintain them. Maintaining arXiv costs $1 million/year, an average of about $10 per new manuscript. Business models also differ: arXiv is paid for by member institutions, while bioRxiv has approached various funders, such as the Chan Zuckerberg Initiative.
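The per-manuscript figure can be checked with back-of-the-envelope arithmetic. Note this is only a sketch using the two numbers quoted in the talk ($1 million/year total cost, roughly $10 per new manuscript), not official arXiv accounting:

```python
# Back-of-the-envelope check of the arXiv cost figures cited above.
# Both inputs are the figures quoted in the session, not official data.
annual_cost_usd = 1_000_000      # stated annual cost of running arXiv
cost_per_manuscript_usd = 10     # stated average cost per new manuscript

# Implied submission volume consistent with those two figures
implied_submissions_per_year = annual_cost_usd / cost_per_manuscript_usd
print(f"Implied new manuscripts per year: {implied_submissions_per_year:,.0f}")
# → Implied new manuscripts per year: 100,000
```

The implied volume of roughly 100,000 new manuscripts per year is what makes the two quoted figures mutually consistent.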

If we have preprints, do we still need peer review? The value of peer review is a debated topic in the publication world. According to Polka, some scientists are of the opinion that “preprints are all we need” and peer review is not needed. While preprints ultimately fulfill the need of advancing science, Polka argues that peer review is also needed and has value for the publication process.

Can preprints and scientific journals coexist? Allowing both preprints and journals to coexist may be possible by first publishing a preprint, with the option of subsequently submitting the publication to a peer-reviewed journal. In fact, journals and preprints have coexisted for 25 years in physics, with each serving a distinct function (quick communication vs. evaluation by the community). However, there is still the issue that peer review remains a proxy for quality, and impact factors are still widely used.

General ideas and audience questions

This session highlighted several important points about publications, such as better ways to measure publication quality and impact, and what metrics and rewards should be used. One suggested solution is, of course, focusing on the quality of the work itself and whether the authors can discuss it, thereby placing less emphasis on where it is published. However, as was argued in the session, having someone tell you “a narrative about their achievements” is not enough. We need ways to evaluate people’s work other than the impact factor.

In this regard, we should focus on incentives for publication. Impact factors are not the best incentives and do not reflect the sort of work being done by trainees, especially when a high-impact paper takes a long time to publish. Therefore, we need some other way to evaluate the work being done in a manner that is independent of the impact factor. This is where incentives for preprints become relevant. Preprints can speed up publication rates and better reflect a trainee’s productivity over time, and this measure can then be used for subsequent fellowships, grants and promotions. Preprints also allow for multiple rounds of open peer review and continuous improvement of one’s work. This would be a considerable advantage over the current system, where publications are evaluated only once (via peer review) and then published in a journal. Preprints may also open up further discussion of other ways in which we can highlight and reward the work of scientists.

Another important point from these discussions is the need for better access to and availability of online information, thus promoting open access, open peer review and the use of preprint servers. These practices would allow for better information transfer amongst scientists, which should be a continuous process in the life of a publication. Of course, as pointed out in the beginning, the digital publication age in which information transfer is ongoing could potentially negatively affect universities, libraries and professional societies. Therefore, one remaining question is still whether these entities would survive the switch to the digital publication age.

Potential ways to bridge the gap between the digital world and the print publication system include getting away from the PDF format, integrating code into papers, and tracking multiple versions of materials. On the latter point, retraction is currently the only mechanism for correcting the published record. However, as Polka pointed out, if there are issues with particular parts of a paper, simply retracting the paper will not fix them. Thus, we need to “lower the barrier to amend and correct papers rather than just retracting them.” This is something I wholeheartedly agree with, and it was also a very popular tweet from this session. It includes the idea that wrong information should be correctable after publication, and that publications should not be static.

Finally, given that preprints are a fluid form of publication, open to feedback in an open access manner, we must also consider audience questions such as “When do you know to publicize a preprint?” and “What is the final version if a preprint goes through multiple versions?” These are valid questions, given that preprints need at some point to reach a final format that can be cited. The last point made in this session was on media coverage of preprints, which according to Polka should be done “with extreme caution.”

Overall, preprints and open access represent the future of scientific publishing. But how to best engage the scientific community in digital publishing, and allow preprints to coexist with scientific journals, are still questions to ponder in the future.