A journal has removed a paper after realizing it contained a verbatim quote from a patient that could reveal the patient’s identity.

The journal learned of the slip-up after receiving a complaint from a social networking site for patients called PatientsLikeMe, which enables people with similar conditions to connect with each other. The retracted paper — ironically about automatically sanitizing private information on social networking sites — included a brief quote from an HIV-positive user of the site, containing specific dates and infections the patient had experienced.

The corresponding author of the study in Expert Systems with Applications confirmed to us that the letter from PatientsLikeMe about two lines of text in the study triggered its removal.

The journal has republished an updated version of the paper without the problematic text.

Here’s an excerpt from the complaint, sent by Paul Wicks, Principal Scientist and Vice President of Innovation at PatientsLikeMe, to the researchers and the journal in December 2015:

A search within PatientsLikeMe for this string, or fragments of it, would quickly identify this patient. This would appear to be in violation of typical standards of medical research ethics.

He explained that the researchers took the material from the logged-in section of the site, violating the site’s user agreement.

Wicks goes on to say:

Please ensure that any screenshots, quotes, or data that is from the logged-in part of PatientsLikeMe is retracted/deleted from the publication and any copies on repositories available on the Internet as soon as possible. Furthermore please delete any data obtained from the logged-in portion of PatientsLikeMe from your servers as soon as possible.

Wicks told us that PatientsLikeMe found out about the reference to the patient through an automatic alert from Google Scholar. He added:

This was an issue to do with scientific and publishing ethics, so we thought it most appropriate to handle scientist-to-scientist rather than involving legal action.

The removal notice states:

This article has been removed at the request of the Editor and the Publisher, as it contains personal information without the consent of the concerned individuals.

The 2015 paper, “Enforcing Transparent Access to Private Content in Social Networks by Means of Automatic Sanitization,” has yet to be cited, according to Thomson Reuters Web of Science.

Expert Systems with Applications has recently republished the study, but due to delays, the removal notice for the study’s older version was only issued last month, said David Sánchez, the paper’s corresponding author at the Rovira i Virgili University in Tarragona, Spain.

The newer version of the paper, which is due to be published in print in November 2016, is available here.

Sánchez told us that removing the paper’s earlier version was “the only option,” adding:

The problematic text was in Table 1. In the current version, the message you can see there is synthetic, but serves the purpose of illustrative example for the calculations.

He noted that the PDF of the earlier version also appeared on the preprint server arXiv, but has since been taken down.

Paul Ginsparg, founder of arXiv from Cornell University in Ithaca, New York, confirmed to Retraction Watch that

the library made the previous versions inaccessible.

Ginsparg noted that arXiv does not actively monitor retractions on other sites, and relies on authors or other third parties to flag such issues.

This isn’t the first study flagged for containing personal information that could identify patients. Last week, we reported on a case study that surprised family members by revealing a bagpipe musician had died from inhaling mold and fungi. In that case, the hospital apologized and launched an investigation, but the family, who say anyone who knew the musician could easily identify him from the report, did not seek a retraction.