
The community of science watchdogs has lost an important voice with the sudden and mysterious takedown of a website devoted to outing unscrupulous publishers.

The site, Scholarly Open Access, listed more than 1,000 so-called predatory journals and publishers — those that prey on unsuspecting researchers hoping to publish their work, then charge massive fees for doing so.


Jeffrey Beall took down his site early last week, without any explanation. He has so far been silent about the reasons for the sudden closure, declining repeated requests for comment. A backup of the site exists on the Internet Archive, but is of course not being updated.

But the site wasn’t just a list; it was a catalog of remarkably bad behavior, stories so extreme they often verged on the ridiculous — hijackings of journal titles, royalty payments for citations, and more. And in its absence, no one else is keeping track of such sketchiness, leaving scientists without a valuable field guide to “predatory publishers” that would let them avoid such miscreants.

Scholarly Open Access began in 2008 as a hobby for Beall, a research librarian at the University of Colorado, Denver. But it quickly grew into something much more significant, thanks to the massive number of predatory outfits he unearthed.


While plenty of well-respected publishers use the author-pays model, the approach has also attracted those out to make a buck without doing any actual work. What differentiated the two groups, for the most part, was that the publishers on Beall’s list provided little, if any, peer review or even editing of the articles they accepted (often solicited through spam directed at researchers). And their reach was, simply put, minimal at best, meaning scientists who published in these journals could expect few of their colleagues to see their work.

Not surprisingly, Beall faced howls of criticism from the companies he flagged, including repeated threats of lawsuits — one from a group the Federal Trade Commission is now suing for deceptive business practices — and worse.

But his targets were not the only ones who strenuously objected to his site. Beall was a vocal opponent of the author-pays model, and that made for enemies among open-access advocates as well. There were those, like Walt Crawford, who said Beall had tunnel vision, only going after open-access journals even though traditional “closed-access” publishers print a lot of crap, too. (The latter is certainly true, we’d agree.) And then there were some, like Karen Coyle, who argued that Beall was biased against publishers from the developing world.

Some of those critics may have simply wanted Beall’s list to improve. And every venture benefits from constructive criticism like that. But those who wanted to see it go away now have their wish. And regardless of the site’s flaws, that’s a loss. “Who will keep predatory science journals at bay now that Jeffrey Beall’s blog is gone?” asked astronomer Michael Brown.

But others are taking a different approach — considering how a successor initiative could improve on Beall’s foundation.

“Beall’s list was pioneering and very useful but not perfect,” Liz Wager, a publishing consultant in the UK wrote in a comment on Retraction Watch. “While I find its sudden disappearance troubling in terms of freedom of speech, and, like other commenters, wish Jeffrey Beall well (and would like to thank him for his work and dedication to highlighting the problem of predatory publishers), I wonder if we might use the opportunity to create something better.”

Some of that work has already begun. Just as Beall compiled a “blacklist” of what he considered to be the worst offenders in scholarly publishing, at least one group is working on a “white list” of the best ones. And a directory of journals in Latin America, the Caribbean, Spain, and Portugal indicates which of 33 quality criteria each journal meets.

Such rosters will be an invaluable resource for scientists looking for outlets in which to publish their work. Yet they’re only half the picture: we need blacklists as well as white lists to guide the way.

For instance, the “Think. Check. Submit.” campaign, launched by a group including major publishers like Springer Nature as well as the Committee on Publication Ethics, offers researchers simple tips for vetting journals to which they’d like to submit their work.

Ultimately, as others have argued, we need lists that are as inclusive, comprehensive, and transparent as possible. They should not only be about open-access publishers, but about closed-access publishers, too. They should be global, and should clearly delineate which criteria each publisher meets or fails to meet.

“What the world needs now,” to paraphrase Burt Bacharach, is lists, sweet lists.

  • I am sad to know that Jeffrey took down his site. This was the only source I would go to when checking for “predatory journals,” since becoming, sort of, a victim of one such journal myself. In fact, I came to know about his site while researching a journal that was constantly harassing me for payment. I have all the evidence of the predatory practices of this journal, Journal of Blood Disorders, and will be happy to provide it to any agency trying to sue these predators.

  • I sent Jeffrey some names of suspicious publishers and was very happy to find them on his list, as well as to learn a lot from it. I would be happy to continue to get more information from the global fair community. My thanks to Jeffrey — I hope he is ok.

  • Since you mention one of my objections, your readers might also want to be aware that those “horror stories” almost never actually appeared in Beall’s blog: when I researched the matter, I found *any textual mention at all* for only 12.5% of the publishers and journals on the 2016 lists.

    So, basically, all we have in seven out of eight cases is Beall saying “trust me.”

    By the way, your readers also might want to check out my attempt to make brandy out of sour grapes, using Beall’s lists (excluding the tiny number also in DOAJ) as the basis for determining non-DOAJ gold OA production levels. The result, “Gray OA 2012-2016,” is here (and is of course CC BY):
