
How would you feel if you were called a parasite in a respected scientific journal? If you’re Drs. Erick Turner and Kun-Hsing Yu, the answer is: elated.

The two scientists have each received an inaugural Research Parasite award, for which they’re honored in the latest issue of Nature Genetics.

The tongue-in-cheek honor comes from the mind of computational biologist Iddo Friedberg, and was brought to life by Casey Greene, a pharmacologist at the University of Pennsylvania. The idea was a response to a now-infamous editorial in the New England Journal of Medicine that referred to those who picked over the bones of previously published studies as “research parasites.”


The award has the serious aim of fostering “the craft of data reanalysis for novel ends.” And the particular brand of parasitism that Turner and Yu demonstrated shows just how useful the rehashing of others’ findings can be.

Turner, a psychiatrist at Oregon Health and Science University in Portland, explored publication bias in studies of antidepressants and other medications. One of Turner’s findings: 90 percent of published studies of mood drugs report positive results, but only about half of such trials registered with the FDA do so.


For his part, Yu, a postdoctoral fellow in biomedical informatics at Stanford University, led a group that used data from the Cancer Genome Atlas to improve predictions about how likely cancer patients would be to survive their disease. And they did so in a way that would allow other scientists to use their approach — let’s call it meta-parasitism.

Data sharing is not a new idea. Since 2003, the National Institutes of Health has required grantees who receive at least $500,000 per year in funding to share their data. And many journals demand that authors make their data public as a prerequisite of publication, although those policies are often not enforced.

“Research parasites help to maintain the self-correcting nature of science,” Greene and his colleagues wrote in the Nature Genetics article announcing the winners. “Scientists who perform rigorous parasitism put scientific work to the test, and their results may support or challenge what we think we know.”

This year’s contest generated 41 applications and the group is already accepting applications for 2018. Greene says he’s pleased with the results so far, but he fears that some scientists may be turned off by the satire. “I’d like to transition to the positive framing of the award — awards for rigorous secondary data analysis — and to work harder to encourage members of underrepresented groups to apply,” he said.

Turner said science does not value secondary data analysis as much as it should. “My cynical sense is that one’s academic ‘success’ is largely measured in terms of how much grant money you bring in — not only for your work but also for the institution via indirect costs,” he told STAT. “Funding agencies seem enamored with chasing the next big thing, even though the ‘gold in them thar hills’ often turns out to be fool’s gold. The research parasite who reveals that fact may be viewed by many as a party pooper.”

Yu offers these tips to would-be parasites: know the online datasets and data repositories that are available and mine them for “hidden gems.” Don’t be afraid to come up with conclusions that differ from those of the earlier research — but always acknowledge the scientists who worked on the original study and cite them appropriately. Oh, and of course: “Share your findings with the scientific community.”

Still, at the moment, what’s holding back more data reanalysis is the supply of data, not the skills of parasites.

Part of the problem is attitude. At a two-day summit on data sharing hosted by NEJM, Rory Collins, a UK scientist who has spoken out in the past against mandatory data sharing, likened the push to being on the wrong end of a mugging, and expressed the view that sharing for the sake of sharing is like Brexit: “The remedy is worse than the disease.”

We agree that collaboration is better than coercion. But that’s the whole point: We need coercion precisely because so many scientists are loath to collaborate on any terms other than their own, if at all.

In the end, though, you can’t have parasites without hosts. So how about a new, complementary award for researchers who most willingly turn their data over to parasites? According to Greene: stay tuned.

A previous version of this story misstated Dr. Turner’s profession. It has been updated.

  • A published article reporting certain conclusions from a study surely requires the accompanying data, the RAW data. Without that there is zero credibility, zero. Plus anything required to reproduce the results.

    I was shocked when I learned this is not the practice. In the journal CACM, Communications of the ACM, one writer argued that papers involving a computer program should publish the source code along with them.

    The purpose of code review is to let others see what one missed. Anonymous peer review has its own problems, chief among them the enforcement of political orthodoxy. It encourages beat-downs of new ideas, new approaches, or ideas that expose flaws. Hence we now get Lysenkoism Phase 2, manifesting in the practices exposed in Climategate and in the lawsuits against writing about the flaws in the “hockey stick” model.

  • Ana,

    As I penned “hard sciences,” I had in mind physics and chemistry . . . specifically the cold fusion controversy that led to the rejection of the claims of Pons and Fleischmann at the University of Utah after “scores of laboratories in the United States and abroad” could not replicate the experiment. http://partners.nytimes.com/library/national/science/050399sci-cold-fusion.html

    I accept your qualification about the general lack of needed funding and needed expertise everywhere to support replication efforts. I, too, applaud “the craft of data reanalysis for novel ends” that is recognized by the Parasite Award.

    My comment was meant to highlight its importance as “the next best thing” to replication, given the constraints you mention on needed funding and expertise.

    But perhaps equally important is the lack of governmental and professional leadership support for both replication and reanalysis of data in the rush to promote novel findings. From a societal perspective, it is hard to rationalize withholding data for reanalysis in a world of limited resources when the data represent sunk costs that have already been paid for and the novel findings already published by the original authors and recognized as such.

    • Further reading shows that the efforts to replicate the results from Pons and Fleischmann’s experiments did NOT refute those claims, as was so widely misreported. Some labs got positive results, some mixed, others negative. Meaning that to date no one knows the exact conditions needed to replicate it consistently.

      Pons and Fleischmann, in my opinion, did the announcement right, whether intentionally or not. They published all their data, their methods, their conditions, their materials, their years of work on it, all of it. The late (later brutally murdered) Eugene Mallove described all this in the book “Fire From Ice”.

      The handling of the announcement by the press was to be expected. But the aftermath was lazy.

      Eugene Mallove was working at MIT when its physics department announced that it had failed to find any results like those in the mislabeled “cold fusion” announcement. I don’t think Pons and Fleischmann ever used the term.

      After that announcement Eugene Mallove quit MIT in absolute disgust, because he was privy to the truth of those results, and in so many words called the one making the announcement a liar.

      Eugene Mallove went on to write an open letter to the president pleading for more research funding for this promising area of clean energy study. Arthur C. Clarke also wrote to governments in the same spirit.

      MIT and many of those universities, even the ones with labs that got promising reactions, get tens of billions of dollars for HOT fusion research. That is where the real distortions are. Political research funding distorts the “science funding market,” diverting resources into places that reinforce the status quo, bolster conformity enforcement, and quash new and promising fields of study, in this case something that could possibly make clean and bountiful energy ubiquitous.

      Eugene Mallove went on to create Infinite Energy Magazine (http://www.infinite-energy.com) and the New Energy Foundation (NEF, http://www.infinite-energy.com/whoarewe/donate.html), which funds research into these promising areas of new physics and new energy.

    • trutherator,

      My point was that it was possible to try to replicate the Pons and Fleischmann experiments in laboratories around the world . . . not that the results would be unmixed, nor that it would end politics and controversy.

      I do agree with your observation about the role of peer review in reinforcing political (and disciplinary) orthodoxy. The controversy about a possible vaccine-autism link continues to scare researchers away from doing research in that area.

      Not so amusing is the apparent use of Twitter to express dissatisfaction with a publication and maybe (I’m still checking to see) to push it to retraction.

      You might add Tweety Trumpism to Lysenkoism Phase 2 and lawsuits against writing about flaws in the “hockey stick” model.

  • Amazing that reanalysis of data should be such a big thing. I doubt it is an issue in the hard sciences where published findings can be replicated in labs all over the world. Given the lack of replicability of much of the soft sciences, namely, biomedical and behavioral research and the social sciences, reanalysis is the next best thing. The “parasites,” I would hope, will prioritize reanalyzing shared data to determine if the original published findings hold up before seeking additional insights.

    • Could you maybe clarify what you mean by “hard sciences”? I have a hard time imagining nowadays any branch of science in which the published findings can be replicated all over the world. That’s because of the enormous technological progress and the shortage of funding. And in the branches of science where you don’t need that much technology, you depend on people, who are hardly replicable all over the world.
      I think there are two reasons why this particular data reanalysis is a big thing: it’s been done with rigour and it led to new conclusions. It’s in fact a new study; only in the part of the protocol that says “collect the data” do you crawl the Internet instead of going to the lab.
