As a child, Jana Christopher would spend hours trying to discern patterns in the wallpaper and on the floor of her bedroom. So perhaps it was destiny that as an adult she would spend hours doing much the same thing — only now she stares at scientific manuscripts trying to find evidence of unkosher images.
Christopher, an editorial assistant at FEBS Letters, the journal of the Federation of European Biochemical Societies, in Heidelberg, Germany, is one of a small but growing number of research integrity czars whom publishers are employing to help police their pages. Just last month, Science Advances named Philip Yeagle as scientific integrity officer. Yeagle, a deputy editor of the journal, which is published by the American Association for the Advancement of Science, acts as a liaison between institutions and the editors and ensures "that we are dealing with situations in a consistent fashion over time," he told STAT by email. "Scientific integrity issues can become quite complex."
Scientific journals’ creation of dedicated positions for rooting out misconduct before publication comes amid growing awareness of such issues, and stems from a recognition that spot-checking and other ad hoc arrangements were insufficient.
“We’ve always been seeing issues like plagiarism, like authorship disputes, like data fabrication. But there was a lack of awareness of the scale” of the problems, said Matthew Hodgkinson, who left the Public Library of Science (PLOS) in 2016 to oversee ethics from a newly created perch at Hindawi and its 250-plus scholarly journals. Bringing him in at Hindawi, he told STAT, reflected the “realization that we need to be catching these issues as a core focus. Centralizing it means that we can make sure we have some processes and can deal with them routinely.”
Hodgkinson’s advice for Yeagle or others in similar roles is simple: “If you see a thread, pull it, and see what comes out. If you see one problem with an author, then you check everything that they’ve done.” (Hindawi runs author names against Retraction Watch posts as a kind of background check.)
Christopher has an even more specific brief. Her job at FEBS Press is to identify suspicious images in the manuscripts its four journals consider. Christopher started doing similar work at the European Molecular Biology Organization in 2011, where she quickly learned that such misconduct was “a depressing reality” in science.
A 2016 study found that roughly 4 percent of published papers have evidence of doctored or duplicated images, but researchers have been sounding the alarm since at least 1994, according to Mike Rossner, the founder of Image Data Integrity Inc., which consults with publishers and universities.
Rossner began screening images at the Journal of Cell Biology in 2002, after having a heart-sinking moment when he opened an image file to fix a formatting problem, and noticed subtle evidence that the image had been manipulated. He went on to write a widely cited editorial on the issue in 2004. (Unbeknownst to Rossner at the time, James Hayden, of the Wistar Institute in Philadelphia, had published a piece in 2000 sounding many similar notes.) Around that time, the U.S. Office of Research Integrity and others took notice, and created tools for publishers and universities to use to screen images.
Those tools are useful, research integrity czars agreed. But they’re a double-edged sword. Over time, “I’ve become better than I was at the beginning, but I also think people are getting better at what they’re doing to their images. I will find issues in close to 30 percent of the manuscripts that are accepted,” Christopher said. “Most of them are fixable and it’s fine — it could be tiny things, size bars missing, things like that, with no effect on actual data. But in about 2 percent of the accepted manuscripts, the issues detected clearly point to serious problems and mean that we may rescind the acceptance.”
Christopher is not a scientist; she has a background in fine arts. While she said it helps to know the material she’s reviewing, it’s not necessary to the goal. More important is the ability to recognize patterns and spot differences. “I enjoy it, and I’m really good at this,” she said. “It’s almost like a passion; I’ve had it all my life.”
Kaoru Sakabe, who has been overseeing integrity issues at the American Society for Biochemistry and Molecular Biology (ASBMB) since 2015, has three people helping her scan images for potential problems. Two, like Christopher, come from the world of visual arts; one is a Ph.D. scientist. (The staffer who helped Rossner screen images at the JCB also lacked a scientific background, and he recommends a similar setup, with close interaction with someone with a Ph.D.)
ASBMB and its flagship title, the Journal of Biological Chemistry, announced the hiring of a manager for publication ethics in late 2012. There had been a recent changing of the guard at the top of the journal’s masthead, and the new editor-in-chief was floored by what staff had been seeing. “I think the image screening we started has been really beneficial not just for us, but when we find an issue, we like to share with authors what we found,” Sakabe said. “We have some very upset authors, but some that are very grateful. I don’t feel that all authors are baddies; they just may not be aware of everything that went on during figure preparation.”
That’s why the role takes a lot of tenacity — and a delicate approach — said Renee Hoch, who is one of three research integrity team members at PLOS ONE. “We’re not working in a job where people are generally happy to hear from us,” Hoch said. “You need to be a strong communicator, but also a very sensitive communicator.”
Hoch’s team, which was created in January, sees everything from concerns about data, to failure to disclose important conflicts of interest, to authorship disputes, and more. “If you wrote a list of potential ethical issues, we’ve probably seen everything on it,” she said, noting that the largest slices of the pie are image manipulation and data concerns.
Science Advances has “dealt primarily with issues related to authorship disputes as well as a few related to ownership of data,” Yeagle said, adding that a couple of papers had “plagiarism and figure manipulation issues.”
“Unsurprisingly, authors are not always initially cooperative but once they understand that the Journal does not publish unless all such issues are resolved, they usually find a way to address the problems,” he said.
The work can be draining. “It’s really tiring on the eyes,” said Christopher, who estimated that she can sometimes spend 20 to 30 minutes poring over an individual image. “If you do it eight hours per day all day, it will drive you crazy.”
But it’s worth it. In an editorial earlier this year, Christopher recounted the story of a dozen manuscripts she’d reviewed after they raised red flags, revealing “serious, systematic and large-scale fabrication of research results.”
“Cases like this confirm yet again the importance and obvious benefits of screening articles before publication,” she wrote. “Our constant vigilance is a service to authors and the wider scientific community.”
Correction: This column has been updated to make clear that Renee Hoch is one of three research integrity team members at PLOS ONE.