
They’re not the kind of gangs that smuggle drugs and murder people. But people looking closely at the scientific literature have discovered that a small number of scientists belong to a different kind of cartel — groups that band together to reference each other’s work, gaming the citation system to make their studies appear more important and worthy of attention.
These so-called citation cartels have been around for decades, as the publishing consultant Phil Davis has pointed out. Thomson Reuters, which until recently owned the Impact Factor for ranking journals, has even sanctioned periodicals for evidence of cartel behavior.
Davis, who clearly has an eye for this kind of thing, unearthed a citation cartel a few years back when he came across a 2010 article in Medical Science Monitor with a glaring feature: Of its 490 references, 445 were to articles in an emerging medical journal called Cell Transplantation. Of the rest, 44 were to papers in … Medical Science Monitor. Davis also noticed this: “Three of the four authors of this paper sit on the editorial board of Cell Transplantation. Two are associate editors, one is the founding editor. The fourth is the CEO of a medical communications company.”
That wasn’t a one-off. Davis found similar cases involving the same authors. In 2012, Thomson Reuters sanctioned three of the four publications involved by denying them Impact Factors. The firm did the same to six business journals in 2014.
For authors, the payoff is clear: The more citations your articles generate, the more influential they appear. And journals have similar incentives: Encourage authors to cite papers that appear in your pages and you’ve created the illusion that your journal is highly influential. Indeed, the controversial Impact Factor ranks scientific periodicals by how frequently their articles earn citations. The lure is so strong that editing services have been found to produce papers — citations included — for a fee.
But sorting out true collusion from innocent network effects has historically been difficult. After all, some routes to generating citations are honest and fair: mentioning the work of frequent collaborators, for example, or working in a small field with few other scientists.
A new paper joins a small band of researchers trying to identify these cartels — before they do too much damage.
The paper, in Frontiers in Physics, is by a group at the University of Maribor in Slovenia. They use existing network-analysis tools and demonstrate that they can pick out cartel behavior in an artificial list of publications. The work is preliminary, and the caution is warranted. “Trying to conclude whether articles have been published with the specific intent to increase the citation statistics of a cited journal, and in particular the journal’s impact factor, is perhaps a slippery slope,” wrote a different group of bibliometricians — yes, this field of study has its own name — in June.
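The article doesn’t spell out the group’s actual method, but the basic idea — treating citations as a directed network and looking for suspiciously dense reciprocal links between journals — can be illustrated with a minimal sketch. Everything below (the journal names, the citation counts, the 50 percent threshold) is invented for illustration, not taken from the paper:

```python
from collections import defaultdict

# Hypothetical citation records: (citing_journal, cited_journal) pairs.
# Journals J_A and J_B cite each other heavily; the rest are background noise.
citations = (
    [("J_A", "J_B")] * 40 + [("J_B", "J_A")] * 35 +   # dense mutual citing
    [("J_A", "J_C")] * 3 + [("J_C", "J_D")] * 2 +
    [("J_D", "J_A")] * 1 + [("J_B", "J_C")] * 2
)

def reciprocal_citation_rates(records):
    """Return each unordered journal pair with its share of all citations."""
    counts = defaultdict(int)
    for citing, cited in records:
        counts[(citing, cited)] += 1
    total = len(records)
    pairs = {}
    for (a, b), n in counts.items():
        key = tuple(sorted((a, b)))          # merge A->B and B->A
        pairs[key] = pairs.get(key, 0) + n
    return {pair: n / total for pair, n in pairs.items()}

def flag_suspect_pairs(records, threshold=0.5):
    """Flag journal pairs whose mutual citations dominate the corpus."""
    return [pair for pair, rate in reciprocal_citation_rates(records).items()
            if rate >= threshold]

print(flag_suspect_pairs(citations))  # prints [('J_A', 'J_B')]
```

A real analysis would of course normalize for field size and collaboration patterns — which is exactly why, as the researchers note below, a statistical flag is only a starting point, not an indictment.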
The authors of the new paper agree. Declaring that two authors have engaged in inappropriate back-scratching “is very dangerous, because we cannot ever be sure that this indictment really holds in the real-world,” they write. “We can only indicate that there is a high probability of citation cartel existence, but this fact needs to be confirmed using a detailed analysis.”
However large the cartel phenomenon, it’s just one among many illnesses afflicting modern science, which tends to reward quantity — more citations, more papers, more grant money — over quality.
As seductive as metrics are, however, they’re often fool’s gold. It’s sort of like cutting that Cali cocaine with baking powder — a subject about which we promise we have no knowledge. It’ll work on the street for a little while. But when you’re found out, it won’t be pretty.