Peer Review in Computer Science Conferences Published by Springer

Mario Malički,1 Martin Mihajlov,2 Aliaksandr Birukou,3 Volha Bryl4

Objective To describe the types of peer review, number of reviewers, use of external reviewers, acceptance rates, and submission systems associated with computer science conference proceedings published by Springer.

Design We used the Springer online delivery platform to identify computer science conference proceedings published by Springer between 1973 and 2017. Proceedings prefaces list the published articles and describe the process used for their selection; we batch-downloaded all prefaces with JDownloader, converted them to text with UNIpdf or Adobe Acrobat Pro, and extracted peer review information through a combination of regular expression matching in Perl and manual data curation.
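The extraction step above can be sketched as pattern matching over preface text. The patterns below are purely illustrative assumptions (the authors' actual Perl expressions are not given in the abstract), shown here in Python for readability:

```python
import re

# Illustrative patterns -- assumptions for demonstration, not the
# authors' actual Perl expressions.
PATTERNS = {
    "submissions": re.compile(r"(\d+)\s+submissions", re.IGNORECASE),
    "accepted": re.compile(r"accepted\s+(\d+)\s+(?:full\s+)?papers", re.IGNORECASE),
    "review_type": re.compile(r"(single|double)[-\s]blind", re.IGNORECASE),
    "reviewers": re.compile(r"(\w+)\s+reviewers", re.IGNORECASE),
}

def extract_review_info(preface_text):
    """Return whichever peer-review fields the patterns match in a preface.

    Fields that do not match are simply absent, mirroring the partial
    coverage reported in the Results (e.g., review type found in only 7%
    of prefaces).
    """
    info = {}
    for field, pattern in PATTERNS.items():
        match = pattern.search(preface_text)
        if match:
            info[field] = match.group(1)
    return info
```

In practice such automated matches would still require the manual curation the authors describe, since preface wording varies widely across conferences.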

Results Of 7710 processed conference prefaces, we identified 1657 unique conferences that had been held a median of 3 (range, 1-39) times. Prefaces contained information on number of submissions sent for review (n=5021 [65%]), accepted full articles (n=4890 [63%]), short articles (n=940 [12%]), posters (n=711 [9%]), peer review type (n=561 [7%]), number of reviewers (n=2962 [38%]), use of external reviewers (n=2716 [35%]), and submission systems (n=1392 [18%]). Acceptance rates for full articles ranged from 3% to 93%, with a time-weighted median of 37% (95% CI, 36%-39%). Acceptance rates tended to decrease with the number of times a conference had been held (Figure). Conferences used a median of 3 reviewers per article (range, 2-11) and most commonly used double-blind peer review (315/561 [56%]), although a small proportion (32/237 [14%]) changed their type of peer review over time (eg, from single to double blind). The most common submission systems used were EasyChair, CyberChair, and iChair.
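The time-weighted median reported above can be illustrated with a generic weighted-median computation. The abstract does not specify how the time weights were derived, so the weighting scheme here is a hypothetical sketch only:

```python
def weighted_median(values, weights):
    """Weighted median: the smallest value at which the cumulative weight
    reaches half of the total weight.

    Generic sketch; the actual weights used in the study (the "time"
    weighting) are not described in the abstract and are assumed here.
    """
    pairs = sorted(zip(values, weights))
    total = sum(weights)
    cumulative = 0.0
    for value, weight in pairs:
        cumulative += weight
        if cumulative >= total / 2:
            return value
```

With equal weights this reduces to the ordinary median; unequal weights shift the result toward the more heavily weighted observations.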

Figure. Time Series of Acceptance Rates of Computer Science Conferences Published by Springer (n=985)

Conclusions Computer science conferences published by Springer tend to decrease their acceptance rates over time. The reasons were not explored in this study but may include growing conference reputation and increasing numbers of submissions.

Funding/Support: This research was funded by COST Action TD1306, New Frontiers of Peer Review (PEERE). PEERE awarded Mario Malički and Martin Mihajlov a Short-Term Scientific Mission to extract the peer review and citation data and will cover the publication costs of the manuscript. The initial prototype of the Springer Linked Open Data pilot for computer science conferences was developed by Net Wise in cooperation with the University of Mannheim and was partially supported by the LOD 2 EU FP7 project.