Wednesday, 4 September 2013

Janne-Tuomas Seppänen: how to improve peer review and three reasons why Peerage of Science was born

As the countdown to the 2013 ALPSP International Conference and Awards continues, we asked Janne-Tuomas Seppänen, Founder of Peerage of Science, winner of the ALPSP Publishing Innovation Award in 2012, to explain how they serve the research community.

"Peerage of Science is a new kind of peer review and submission system launched in October 2011, and it won the ALPSP Award for Publishing Innovation in 2012. The service is currently used by 25 journals, including new ventures like PeerJ (one of this year's finalists for the ALPSP Publishing Innovation Award) as well as established society journals like Ecography and Heredity. The most recent publisher to establish a contract with Peerage of Science for full use, with rights to make direct publishing offers via the system, is BioMed Central, which launches its participation with four journals, including its flagship biology journal BMC Biology.

I was asked to outline why and how the initiative responds to particular needs in scientific publishing. Here are three personal perspectives on why Peerage of Science was born.

1. There was no yardstick, and little reward, for trying to excel in peer reviewing.

Getting my first reviewing requests as a fresh PhD felt like a coming-of-age moment for a young scientist. Being trusted to do a peer review is a privilege your knowledge and analytical skills should measure up to, so naturally I wanted to get feedback and compare my arguments to those of other reviewers. But not all journals provided the opportunity to see other reviews, and some did not even notify reviewers of the final decision.

2. The variance of quality of peer reviews was astonishing

Of course I had seen plenty of peer reviews written by others – in response to my own submissions and those of colleagues. Sometimes peer reviews were thorough, incisive, even brilliant, giving valuable guidance for the research. But all too often they were flippant and dismissive, without analytical justification, let alone support from references to the literature. The variance in quality was astonishing, given that they are all supposedly written by the most learned and brightest people on the planet.

3. The reality of funding and hiring practices makes "aiming high" in submissions an imperative for scientists

I know I do not necessarily need an article in Nature or Science to maintain a career in science, but the prevailing attitude is that if you have never been rejected from those journals, you are not trying hard enough. This often results in a publishing process that is a slide down the journal prestige ladder until the article is finally accepted somewhere. That process takes on average over a year, with each iteration requiring time and effort from authors, editors and peer reviewers, and consuming publishers' resources.

In addition, editors I knew were telling me that it is getting harder and harder to find (good and punctual) reviewers. Even top-notch journals may have to send requests to ten or more people to secure two reviewers. Given that good journals reject a large majority of submissions, the effort and money publishers invest in just managing peer review, per article actually published, must be substantial. Wouldn't editor and editorial office resources be used more effectively in growing and maintaining journal quality if they could focus on material a journal wants to publish, rather than on material it rejects?

Instead of resigning myself to coffee-break rants about a Churchillian "worst-system-except-for-the-alternatives", together with colleagues Mikko Mönkkönen and Janne Kotiaho I set out to build a better solution.

Just two things are needed to remedy the situation. First, peer review quality needs to be measured in a way that creates academic recognition for high-quality reviewing, and social pressure to avoid flippant, hasty evaluations. Second, suitable journals need to consider whether or not to accept an article concurrently, during peer review, instead of wasting everyone's time and resources on multiple rounds of iteration. Peerage of Science has many other features too, but these two are at the centre of the concept.

It was an immense honour, and a huge boost for developer spirits, when ALPSP gave the Award for Publishing Innovation 2012 to Peerage of Science last year. Being officially recognized by a leading trade organisation in academic publishing gave Peerage of Science a legitimacy among prospective customers that is otherwise an uphill battle for small start-ups to establish. But perhaps even more importantly, ALPSP in effect also gave the Award to the scientific community. The scientists doing peer reviews in Peerage of Science, and the authors bold enough to be early adopters of a new system, were recognized and rewarded too, and encouraged to continue making the initiative possible."