Science is about discovery. But discovery does little good unless it is shared. The way we have traditionally shared information in biomedicine — submitting articles to peer-reviewed journals — can add a year or more to publication. I believe that preprints can dramatically speed up information-sharing without harming the scientific process.
Preprints are complete manuscripts uploaded to a public server without formal review. Anyone can read a preprint and comment on it. Not only do they make new information immediately available, but they can also enable a kind of crowdsourced peer review that can point out errors or suggest complementary experiments.
The concept isn’t new. A preprint server called arXiv was started in 1991 for physicists. Its founder, Paul Ginsparg, says that researchers in physics, mathematics, and computer science now check the site daily to learn about advances in their fields. Many of these preprints are eventually published in traditional journals.
A similar server, bioRxiv, opened for the life science community in 2013. Although it hosts a steadily growing number of preprints, the community as a whole has approached it cautiously.
To promote wider use of preprints among biomedical researchers, several colleagues and I organized a conference earlier this year called ASAPbio. It brought together individuals representing three major stakeholders — researchers, funders, and journal editors — to explore common ground and identify barriers. Perspectives from each group appear in a commentary in today’s Science magazine.
One reason researchers are hesitant to embrace preprints is uncertainty about whether they will hurt the chances of later publication in a high-impact journal. The so-called Ingelfinger rule, which holds that prior publication disqualifies a paper from consideration by a major journal, still casts a long shadow over biomedical publishing. In a draft statement prepared for the ASAPbio meeting, some publishers supported the use of preprints and said they “will prejudice neither the peer-review process nor publication in our journals or monographs.” It would be wonderful, and a huge relief, if all publishers adopted that approach.
Another concern is that preprints will flood the literature with poor-quality work. This hasn’t happened with the routine use of preprints in physics and mathematics. What’s more, virtually every scientist zealously tries to protect his or her reputation. Few would endanger it by publishing poor-quality work.
Preprints would be good for the entire scientific community. But they would have special benefit for early-career scientists, who need to generate the currency of academia — publications. Waiting a year or more for a paper to appear in a journal can put serious speed bumps in a researcher's career development. I finished graduate school with one first-author publication and another in the works. I was turned down for numerous fellowships. Finally getting the second paper through peer review helped me get my current fellowship. In such a situation, preprints could serve as an interim demonstration of a junior scientist’s productivity.
There’s a moral aspect of preprints we should also keep in mind. Much of biomedical research is supported by taxpayer dollars. Our work is a public good, and we owe it to the people who pay for it to share it with the world as soon as possible.
And there’s no reason we have to stop at preprints. Adding the ability to publish data sets and hypotheses on sites like Zenodo could go even further toward making our data free and accessible.
Jessica Polka, PhD, is a postdoctoral fellow at Harvard Medical School.