A Science Journal Sting

Want to get your work published in a scientific journal? No problem if you have a few thousand dollars you are willing to part with. 

These days a number of publications display the trappings of a scholarly journal, promising peer review and other services, but do not deliver. They perform no peer review and provide no services beyond posting papers and cashing checks for the publication fees.

There has been a dramatic recent increase in the number of publishers that appear to be engaged in this practice; their ranks grew by an order of magnitude in 2012 alone. (1)

Network of bank accounts based mostly in the developing world

From humble and idealistic beginnings a decade ago, open-access journals have mushroomed into a global industry, driven by author publication fees rather than traditional subscriptions. Most of the players are murky. The identity and location of the journals’ editors, as well as the financial workings of their publishers, are often purposefully obscured.

Invoices for publication fees reveal a network of bank accounts based mostly in the developing world, reports John Bohannon. (2)

A striking picture emerges from the global distribution of open-access publishers, editors and bank accounts. Most of the publishing operations cloak their true geographic locations. Some examples: The American Journal of Medical and Dental Science is published in Pakistan, while the European Journal of Chemistry is published in Turkey. (2)

Inspired by the experience of a colleague in Nigeria who felt deceived by one such journal (the business model involves charging authors publication fees ranging from $50 to more than $3,000), the above-mentioned John Bohannon, a biologist at Harvard, submitted 304 versions of a bogus wonder-drug paper to open-access journals. More than half of the journals accepted the paper, failing to notice its fatal flaws. (2)

The paper, about a new cancer drug, included nonsensical graphs and showed an utter disregard for the scientific method. It was attributed to fake authors at a fake university in Africa and, as a final flourish, was run through Google Translate into French and back into English. Collaborators at Harvard helped Bohannon make it convincingly boring. (3)

“Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper’s shortcomings immediately. Its experiments are so hopelessly flawed that the results are meaningless,” Bohannon wrote in the journal Science. And yet his informal sting operation revealed that 156 publishers completely missed the hints. (2)

Whether fee-charging open-access journals were actually keeping their promise to do peer review

Bohannon wanted to find out whether fee-charging open-access journals were actually keeping their promise to do peer review, the process in which scientists with some knowledge of a paper’s topic volunteer to check it for scientific flaws. In the end, he concluded that ‘a huge proportion’ of the journals were not ensuring their papers were peer reviewed. He added that his experiment could be the tip of the iceberg, and that peer review at traditional journals, not just fee-based open-access journals, could be just as bad. “It could be the whole peer review system is just failing under the strain of the tens of thousands of journals that now exist.” (4)

Some examples of the same issue at ‘prestigious’ journals:

  • In a classic 1998 study, Fiona Godlee, editor of the prestigious British Medical Journal (BMJ), sent an article containing eight deliberate mistakes in study design, analysis and interpretation to more than 200 of the BMJ’s regular reviewers. Not one picked out all the mistakes. On average they reported fewer than two; some did not spot any. (5)
  • Another experiment at the BMJ showed that reviewers did no better when more clearly instructed on the problems they might encounter. They also seemed to get worse with experience: Charles McCulloch and Michael Callaham, of the University of California, San Francisco, looked at how 1,500 referees were rated by editors at leading journals over a 14-year period and found that 92% showed a slow but steady drop in their scores. (5)
  • The Economist adds, “As well as not spotting things they ought to spot, there is a lot that peer reviewers do not even try to check. They do not typically re-analyze the data presented from scratch, contenting themselves with a sense that the authors’ analysis is properly conceived. And they cannot be expected to spot deliberate falsifications if they are carried out with a modicum of subtlety.” (5)
  • On another front, the Institute of Medicine estimates that only 4 percent of treatments and tests are backed up by strong scientific evidence; more than half have very weak evidence or none. (6)

John Ioannidis reported that one-third of studies published in three reputable peer-reviewed journals didn’t hold up. He looked at 45 studies published between 1990 and 2003 and found that subsequent research contradicted the results of seven of those studies (about 16 percent), and another seven (a further 16 percent) were found to have weaker results than originally published. In other words, 32% did not withstand the test of time. (7)

This translates into a lot of medical misinformation. Ioannidis reviewed prestigious journals including The New England Journal of Medicine, The Journal of the American Medical Association (JAMA), and The Lancet, along with a number of others. Each article had been cited at least 1,000 times, all within a span of 13 years.

Read the rest of the article at Canadafreepress.com

Jack Dini, Livermore, CA, writes a monthly column on science and environmental issues for Plating & Surface Finishing and also writes for other publications. He is the author of Challenging Environmental Mythology (2003). Jack can be reached at: [email protected]
