
Call it the publishing version of “repeal and replace”: the Journal of the American Medical Association this week retracted and then republished a study with significant clinical implications — showing in the process how journals and researchers, working in concert, can correct the scientific record quickly and transparently.

The paper in question was published in 2014 by a team of researchers across the Netherlands. It reported, with much fanfare, the findings of a large study looking at interventions to reduce antibiotic resistance in intensive care units.

But two years later, another group of researchers discovered an error while analyzing the data for a different study. One of the 16 ICUs had miscoded its data, turning what was essentially no difference between the two methods into a small but clear advantage for one. The group called Evelien Oostdijk, the study's first author, and told her about their discovery.


The authors reanalyzed the data with the corrected information and concluded that, in fact, the new figures doomed their original results. But they didn't stonewall, and they didn't just retract, either: They asked JAMA to remove the paper and replace it with a corrected version. To its credit, the journal agreed to the request this week.

And that, on the whole, represents a remarkably virtuous series of events: data made available for reanalysis, a journal that responded promptly to the outcome of that reanalysis, and a finding that could save lives.


In a blog post timed to this week’s retraction, Marc Bonten, an infection control specialist at University Medical Center Utrecht who was one of the authors of the JAMA study, walked readers through what went awry. “Each month the ICU had delivered an Excel file containing admission and discharge dates, with hours and minutes in the same cell. To harmonize these data with those from other ICUs, the hours and minutes needed to [be] removed, requiring several copy and paste procedures. And that is where the human error occurred,” Bonten wrote.
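The harmonization step Bonten describes, stripping hours and minutes from a combined date-time cell, is exactly the kind of task that is error-prone when done with repeated copy-and-paste in a spreadsheet but trivial to script. A minimal sketch, using a hypothetical dataset with invented column names (the original files are not public), might look like this in Python with pandas:

```python
import pandas as pd
from io import StringIO

# Hypothetical sample mimicking the described Excel export:
# admission and discharge dates with hours and minutes in the same cell.
raw = StringIO(
    "patient_id,admission,discharge\n"
    "1,2013-04-02 14:35,2013-04-09 09:10\n"
    "2,2013-04-03 08:05,2013-04-05 16:45\n"
)
df = pd.read_csv(raw, parse_dates=["admission", "discharge"])

# Drop the time component in one vectorized step instead of
# manual copy-and-paste: normalize() zeroes the time, keeping the date.
for col in ["admission", "discharge"]:
    df[col] = df[col].dt.normalize()

print(df)
```

A scripted transformation like this is repeatable and auditable, so a slip in one monthly file cannot silently propagate the way a manual edit can.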

The lessons are threefold, Bonten said: “Think of the unthinkable when it comes to checking the quality of your data; If you think that all possible has been checked, check again; Always let others use your original data for new (or just the same) analyses.”

Bonten has high praise for Oostdijk, whom he calls "a shining example of scientific integrity" for her handling of the episode. We agree. After all, while she couldn't very well have ignored the news, she could have delayed acting on it or taken other steps to safeguard the original publication.

And we think JAMA deserves applause, too. Faced with such a devastating error, the journal had to retract the paper. But it could have stopped at that and rejected the resubmission for any number of reasons that would have been acceptable: Given one major error, were others from the group likely to emerge? Wouldn’t it be better to conduct the whole trial from the ground up?

But JAMA put its faith in the integrity and competence of the researchers and in so doing helped to illustrate how science at its best is transparent and self-correcting — as long as the definition of “self” is broadened to include the entire community of scientists.

The paper marks the fourth time JAMA and its sister publications have retracted-and-replaced a paper. Annette Flanagin, executive managing editor of the JAMA Network, said the option is useful for addressing issues of “honest pervasive error (i.e., unintentional human or programmatic errors that result in the need to correct numerous data and text in the abstract, text, tables and figures, such as a coding error) without the current stigma that is associated with retraction.”

The Lancet has done this, too. Self-correction in science: Now that’s cause for celebration.

