The years spent as a grad student and postdoc are among the most trying times for any scientist. The pressure to publish is intense, as young researchers vie for the few jobs at the heads of academic labs.

Those high stakes and the pressure-cooker atmosphere make mistakes — and sometimes the willingness to cut corners and commit fraud — more likely.

Unfortunately, both can be career killers, if two recent cases are any indication. And while fraudsters aren’t welcome, losing honest but overwhelmed young researchers takes its own toll on science.


Case one: Sergio Gonzalez, a postdoctoral researcher at the Institute for Neurosciences of Montpellier, France. He was hitting the job market in 2015, and he knew he needed a paper in a top journal to stand out in that market.

So he was relieved — elated, perhaps — when editors at the Journal of Clinical Investigation, one of the world’s most prestigious journals, told him they’d be publishing one of his papers. Having an article accepted there would carry a lot of weight on a job application — and in France’s system, success on that application meant a permanent job.

But the paper was flawed — deeply, it turns out. First, a commenter on PubPeer, an anonymous post-publication peer review site, flagged a suspicious-looking figure. Next came a correction in the journal, more comments on PubPeer, an expression of concern from the editors, an institutional review of Gonzalez’s work, and finally, this month, a retraction. Along the way, Gonzalez lost the opportunity for the job he so wanted.

Around that same time, across the Atlantic, another young researcher was also on the academic job market. Michael LaCour was a promising graduate student in political science at the University of California, Los Angeles, who managed to publish a headline-grabbing paper in Science, one of the world’s top journals, about attitudes toward gay marriage. Soon after that, he landed a job at Princeton. But then his paper — and academic career — unraveled after two other graduate students at different institutions started asking questions that would eventually make it clear LaCour had made up the data. The paper was retracted — and so was his job offer.

LaCour’s story and Gonzalez’s diverge from there. LaCour, it seems, fabricated his data. Gonzalez and his supervisor, Nicolas Tricaud, by contrast, maintained throughout that the postdoc was innocent of misconduct. Any errors, Tricaud insisted, were the result of Gonzalez’s haste and stress over his impending job search and his desire to land a plum spot.

The university seemed to agree that Gonzalez was honest but sloppy. According to the JCI’s retraction notice: “The institutional review found no evidence of intention to falsify results and concluded that errors were made due to negligence during the assembly of figures. The institutional review panel did not question in any way the authenticity of the published results.”

We may feel relieved that LaCour seems unlikely to return to the ivory tower. But the loss of Gonzalez, who did not win a coveted spot in a laboratory, seems by all accounts to have been a blow to science. Tricaud says the budding researcher has dropped out of academia — a shame considering his willingness to “work like hell” on the project.

It’s not just France where postdocs feel this pressure. In the US, only about 15 percent of postdocs can expect to land faculty jobs, according to one estimate. Meanwhile, the rate of unemployment among this group jumped from 4 percent in 2008 to 10 percent in 2012.

Part of the problem is that academic mentors tend to emphasize careers in academia, rather than all of the other doors a PhD can open. So when someone like Gonzalez is shut out of the academy, they feel the failure even more acutely. One solution would be for such mentors to embrace so-called “alternative careers,” whether in industry, public service, or elsewhere — which aren’t really “alternative” anymore, given that most PhDs end up in them.

But if senior faculty and administrators don’t want to drive young scientists from the field — even the honest ones — they’d best figure out a way to let publish or perish itself perish.

Comments

  • It is very tough, unfair and unkind in the field (ex. California’s biotech bay), but granting leniency is the major reason any system of integrity and professionalism breaks down. Fraud is a very big issue in academic and industrial biotech, today more than ever, because someone out there, untalented and/or undeserving, wants a particular career. Should we then punish those who are right for what they strive for? This guy doesn’t deserve a second chance, much less a mention.

    • Well said, these fake scientists, especially in the Global Warming/Climate Change field, do not follow Scientific Method and Procedure, as you well know. The cost to the public was just enormous, with bad politicians seeing a new cash cow and adopting these fake scientists’ “facts”. This was scientifically atrocious and I’m still fighting this particular “clique” in Canada. They’ve introduced the much hated Carbon Tax from adopting these fake scientists’ “facts”. They may postulate a theorem! But they can’t say it’s a fact till all the chips are in!

  • Are you kidding me? Or maybe, this is just a reflection of the great new era we are living in.
    Those who make these mistakes shouldn’t be granted second chances
    A) because if you make them, it means you are either a fraudster or simply not good at your job
    B) because you deserve fewer chances than those who have been either morally sound or skillful. And in a reality where there is a huge overcrowding of PhDs (both in academia and industry), it means they should be at the bottom of any list.
    BTW, the overcrowding is generated by the fact that PhDs/postdocs are cheap labor, and neither the granting agencies nor the research institutions have any interest in correcting that

  • It’s not clear from this article and the various links whether Gonzalez committed fraud, or whether this is just sloppy work. There is not enough here to prove fraud, so the review board was correct in labelling it as just sloppy work.

    I often see medical studies that use very questionable statistical methods, and the conclusions are often not supported by the data presented. The Seven Countries Study is a good example of likely fraud, yet the author is honored by many people. It seems to me that medical journals should make every attempt to improve the quality of research, and an important part of that is having sanctions against researchers who commit fraud or whose work is so sloppy that it is suggestive of fraud.
