RSNA Journals Use Software to Fight Plagiarism

March 01, 2013

Plagiarism-detecting software is helping editors of RSNA’s scientific journals stem the rising incidence of plagiarism and falsification of data.

Editors of prominent science journals, including RSNA’s Radiology and RadioGraphics, are relying on plagiarism-detecting software and increased awareness to stem the rising incidence of misconduct such as plagiarism and falsification of data.

RadioGraphics Editor Jeffrey S. Klein, M.D., has seen the problem firsthand since taking the helm of RSNA’s peer-reviewed bimonthly education journal in January 2012.

“In the year I’ve served as editor, I’ve dealt with various aspects of misconduct or plagiarism on six or seven occasions, which is more than I had anticipated,” said Dr. Klein, the A. Bradley Soule and John P. Tampas Green and Gold Professor of Radiology at the University of Vermont (UVM) College of Medicine and Fletcher Allen Health Care in Burlington, Vt.

Dr. Klein recounted one case in which a RadioGraphics reviewer noticed a familiar aspect to one of the submissions. “The reviewer recognized the images from an article that had been published 11 years earlier,” Dr. Klein said. “The images had been repurposed without attribution for the new research paper.”

Journal Retractions Steadily Rising

Dr. Klein’s experience is not unusual. According to an Oct. 24, 2012, study published in PLoS ONE, the number of articles retracted per year in major scholarly fields increased by a factor of more than 19 between the years 2001 and 2010. Almost half of the retractions were due to alleged publishing misconduct, defined by the National Institutes of Health (NIH) as fabrication, falsification or plagiarism in research.

A study published in the Oct. 16, 2012, issue of the Proceedings of the National Academy of Sciences (PNAS) cited misconduct as the reason for three-quarters of the 2,047 retracted papers in the biomedical and life sciences dating back to 1977.

“The literature suggests that the number of retractions has been going up exponentially,” said Herbert Y. Kressel, M.D., the Miriam H. Stoneman Professor of Radiology at Harvard Medical School in Boston, who has served as editor of Radiology, RSNA’s peer-reviewed science journal, since 2007 and on the journal’s editorial board from 1985 to 1991.

Misconduct appears to be less common in radiology publications than in those focusing on other specialties. The PNAS study did not include any radiology journals among the top 10 journals with the most retractions, and a search of the misconduct-tracking blog Retraction Watch (retractionwatch.wordpress.com) yields just one radiology case: a 2011 study in the journal Acta Radiologica that was retracted for likely plagiarism.

Still, misconduct remains a concern for editors like Drs. Klein and Kressel. Plagiarism is easier than ever in the cut-and-paste world of the Internet, where time constraints and the pressure to publish can lead authors to succumb to the temptation to self-plagiarize.

“The pressure to publish is such that authors will tend to stretch what they’ve already produced and present it as original material,” Dr. Klein said. “Some misconduct stems from the pressure to get published in scientific journals, while other instances involve junior scientists who don’t have the best level of supervision.”

Editors rely heavily on expert reviewers to detect plagiarism and fraud before an issue goes to press. At Radiology, Dr. Kressel has added another safeguard: sending a query asking authors if any of the patients, research subjects or test animals described in their research have been reported in other submissions.

“If the person answers ‘yes,’ we ask him or her to submit a PDF of each publication with a letter explaining why the submission is original,” Dr. Kressel said. “Just adding this question and asking authors to justify their papers has made researchers rethink submitting.”

Software Analyzes Text for Plagiarism

Plagiarism-detection software is an additional tool to help editorial staff guard against misconduct. Radiology and RadioGraphics recently began using the plagiarism-detecting software iThenticate to comb through their nearly 3,300 annual submissions. Once a document is uploaded, iThenticate compares it with the millions of scholarly articles in its database and provides a report highlighting any matches in less than one minute.

“You can click on any of the matches and see if it’s something in common usage or something even more concerning,” said Chris Harrick, vice president of marketing at iThenticate’s publisher, Turnitin in Oakland, Calif. “The software enables you to exclude small matches, such as those of less than five words, and matches found in quotations and the bibliography.”

Even with the software, the judgment and experience of reviewers play a central role. “Word matches don’t always equate to misconduct,” Dr. Kressel noted. “For instance, there’s only so many ways to describe a CT protocol. What most journals do is set their own thresholds for duplicate words.”
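iThenticate’s matching algorithm is proprietary, but the threshold idea described above, counting only runs of matching words above a minimum length, can be sketched with a simple word n-gram comparison. The function names and the set-based approach below are illustrative assumptions, not the product’s actual method:

```python
def ngrams(text, n=5):
    """Split text into overlapping word n-grams (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_ratio(submission, source, n=5):
    """Fraction of the submission's n-grams that also appear in the source.

    With n=5, any shared run shorter than five words contributes no
    n-grams, loosely mirroring the small-match exclusion quoted above.
    """
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)
```

A journal could then flag submissions whose ratio against any prior article exceeds its own chosen threshold, while a reviewer still decides whether the overlap is boilerplate (such as a standard CT protocol description) or genuine duplication.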

Misconduct Is Often a Repeat Offense

Despite advances in finding plagiarism, fraud detection remains a significant challenge. Reviewers who suspect fraud—for instance, results that show an unusual level of consistency—may ask authors to provide the original data files.

“A few years ago one of our readers questioned the validity of the data in a paper, and we had the author send us the files containing original data so a statistician could reanalyze it,” Dr. Kressel recalled. “It turned out that it wasn’t fraud; the author had just neglected to include some information that was important to understanding the data.”

If misconduct is suspected, editors like Dr. Kressel inform both the authors and representatives of the institutions associated with the research.

Disciplinary measures vary. In 2000, NIH blocked noted glaucoma researcher Evan Dreyer, M.D., Ph.D., from receiving federal grants for 10 years after he admitted to falsifying data. In another prominent case, Eric Poehlman, Ph.D., a researcher at the University of Vermont in Burlington, was sentenced to one year in jail in 2006 for making a false statement on a federal grant application.

Stiff penalties appear to be the exception, however, as evidenced by the significant number of repeat offenders. The PNAS study found that 38 research groups with five or more retractions accounted for 44 percent of articles linked to fraud or suspected fraud. “Clearly, deterrence is not working, because we’re still having misconduct,” Dr. Kressel said.

With improved software, growing awareness of the problem and the continued diligence of reviewers, editors like Dr. Kressel remain hopeful that they will be able to focus more on science and less on misconduct.

“The whole system is designed to promote scientific integrity and prevent bias,” Dr. Kressel said. “The last thing I want to be is the scientific police.”

To access an abstract of the article, “Misconduct Accounts for the Majority of Retracted Scientific Publications,” in the Oct. 16 edition of the Proceedings of the National Academy of Sciences of the United States of America, go to www.pnas.org/content/109/42/17028.