If you come around these parts often, you’re likely familiar with the idea of fake data. You might even recall fake peer review. But fake laboratory supplies?
Unfortunately, that’s also a thing. From mislabeled antibodies and watered-down catalysts to the wrong kind of cells entirely, problems with lab supplies have wasted labs’ resources and scuppered their research findings. And the trouble is exacerbated by unscrupulous producers, primarily in China, who tempt scientists with cut-rate versions of essential substances for their experiments, thereby jeopardizing the resulting conclusions.
The fakery recently came to light in an alarming exposé in Nature. “Even university cleaning staff have been implicated in the hidden process that creates counterfeit laboratory products, including basic chemistry reagents, serum for cell culture and standard laboratory test kits,” according to the article. “Although it’s difficult to quantify the effects of this illegal trade, Chinese scientists and some in Europe and North America say that fake products have led them astray, wasting time and materials.”
For researchers, the faulty products can produce substantial headaches. One group in Toronto claims to have spent roughly $500,000 over two years trying to figure out why antibodies it bought from a Chinese manufacturer wouldn’t work as expected. Similar problems bedeviled a group at Massachusetts General Hospital that had purchased the same materials from the company.
But the problem isn’t limited to fakery. Honest mistakes with supplies can have far-reaching consequences. This month, for example, the Journal of Biological Chemistry issued a correction to a 2016 paper after the authors reported finding that, unbeknownst to them, the enzyme they’d been studying lacked a key piece of DNA. The molecule, purchased from an affiliate of GE Healthcare, changed their results.
Then there are the cells with mistaken identities. Simply put, researchers believe they are studying one kind of cell, but, thanks to undetected contamination or another mishap, the cells are, in fact, completely different. Thousands of published papers involving misidentified cell lines are clogging the scientific literature, including hundreds involving a widely studied form of human brain tumor.
But there are ways to curb these problems. In the case of cell lines, for instance, the International Cell Line Authentication Committee, or ICLAC, hosts a database of lines known to be unreliable. ICLAC has made some progress, but it’s a steep climb, as the authors of a proposal for a parallel system wrote a few years ago: “Reporting of contaminated and misidentified cell lines is scattered and often inconsistent, thus continued use of cell lines from dubious origins is still evident in the literature.”
Another encouraging effort is the Research Resource Identifier, or RRID. The RRID is a sort of virtual barcode — like those annoying stickers on supermarket apples, or the barcodes on many drugs and patients in the hospital — that authors stamp on the materials they use to ensure consistency.
As the creators of the idea put it: “Resources (e.g. antibodies, model organisms, and software projects) reported in the biomedical literature often lack sufficient detail to enable reproducibility or reuse. For example, catalog numbers for antibody reagents are infrequently reported, and the version numbers for software programs used for data analysis are often omitted.”
By tagging resources with a unique ID, researchers make it possible for other scientists to retrace their steps along the road toward greater rigor. In fact, the goal of the project is to allow machines to do this legwork, further reducing the odds of human error.
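To make the machine-readable idea concrete: published RRIDs follow a simple prefix-plus-accession pattern (for example, RRID:AB_2313773 is the form used for an antibody). Below is a minimal, hypothetical sketch of how a tool might validate and extract such tags from a methods section. The prefix list and helper names are illustrative assumptions, not part of any official RRID tooling.

```python
import re

# Illustrative (not exhaustive) set of RRID prefixes seen in the
# literature: AB = antibody, SCR = software, CVCL = cell line,
# IMSR/MMRRC = mouse resources. Treat this list as an assumption.
KNOWN_PREFIXES = {"AB", "SCR", "CVCL", "IMSR", "MMRRC"}

# Assumed tag shape: "RRID:<PREFIX>_<accession>"
RRID_PATTERN = re.compile(r"^RRID:([A-Z]+)_([A-Za-z0-9]+)$")

def parse_rrid(tag):
    """Return (prefix, accession) if `tag` looks like an RRID, else None."""
    match = RRID_PATTERN.match(tag.strip())
    if not match:
        return None
    prefix, accession = match.groups()
    if prefix not in KNOWN_PREFIXES:
        return None
    return prefix, accession

def find_rrids(text):
    """Scan free text (e.g. a methods section) for RRID-style tags."""
    return [m.group(0) for m in re.finditer(r"RRID:[A-Z]+_[A-Za-z0-9]+", text)]
```

A pipeline built on something like this could flag papers whose reagents later turn up in a problem database, which is exactly the kind of alerting service the RRID’s creators describe.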
That kind of automation could produce big gains for scientists. Per the creators: “We currently have no way of alerting readers of a paper that a problem has been found with a particular antibody or software tool. The RRID will provide the basis for developing such an alerting service.” One might imagine a system akin to Crossmark, which some publishers use to keep readers updated in more or less real time about issues with papers.
RRID is still just a pilot project, albeit a promising one with backing from the National Institutes of Health. It doesn’t currently cover many categories of lab materials, although it is expanding. Nor is it perfect. For the scheme to work most effectively, companies and researchers have to agree to participate — and at this point, they’ll have to bear the costs themselves.
We’d urge everyone to take part, and funders to back efforts like these to promote more reliable science. After all, “fake it till you make it” is no way to do science.