A spoof paper concocted by Science reveals little or no scrutiny at many open-access journals.
On 4 July, good news arrived in the inbox of Ocorrafoo Cobange, a biologist at the Wassee Institute of Medicine in Asmara. It was the official letter of acceptance for a paper he had submitted 2 months earlier to the Journal of Natural Pharmaceuticals, describing the anticancer properties of a chemical that Cobange had extracted from a lichen.
In fact, it should have been promptly rejected. Any reviewer with more than a high-school knowledge of chemistry and the ability to understand a basic data plot should have spotted the paper's shortcomings immediately. Its experiments are so hopelessly flawed that the results are meaningless.
I know because I wrote the paper. Ocorrafoo Cobange does not exist, nor does the Wassee Institute of Medicine. Over the past 10 months, I have submitted 304 versions of the wonder drug paper to open-access journals. More than half of the journals accepted the paper, failing to notice its fatal flaws. Beyond that headline result, the data from this sting operation reveal the contours of an emerging Wild West in academic publishing.
From humble and idealistic beginnings a decade ago, open-access scientific journals have mushroomed into a global industry, driven by author publication fees rather than traditional subscriptions. Most of the players are murky. The identity and location of the journals' editors, as well as the financial workings of their publishers, are often purposefully obscured. But Science's investigation casts a powerful light. Internet Protocol (IP) address traces within the raw headers of e-mails sent by journal editors betray their locations. Invoices for publication fees reveal a network of bank accounts based mostly in the developing world. And the acceptances and rejections of the paper provide the first global snapshot of peer review across the open-access scientific enterprise.
One might have expected credible peer review at the Journal of Natural Pharmaceuticals. It describes itself as "a peer reviewed journal aiming to communicate high quality research articles, short communications, and reviews in the field of natural products with desired pharmacological activities." The editors and advisory board members are pharmaceutical science professors at universities around the world.
The journal is one of more than 270 owned by Medknow, a company based in Mumbai, India, and one of the largest open-access publishers. According to Medknow's website, more than 2 million of its articles are downloaded by researchers every month. Medknow was bought for an undisclosed sum in 2011 by Wolters Kluwer, a multinational firm headquartered in the Netherlands and one of the world's leading purveyors of medical information with annual revenues of nearly $5 billion.
But the editorial team of the Journal of Natural Pharmaceuticals, headed by Editor-in-Chief Ilkay Orhan, a professor of pharmacy at Eastern Mediterranean University in Gazimagosa, Cyprus, asked the fictional Cobange for only superficial changes to the paper—different reference formats and a longer abstract—before accepting it 51 days later. The paper's scientific content was never mentioned. In an e-mail to Science, managing editor Mueen Ahmed, a professor of pharmacy at King Faisal University in Al-Hasa, Saudi Arabia, states that he will permanently shut down the journal by the end of the year. "I am really sorry for this," he says. Orhan says that for the past 2 years, he had left the journal's operation entirely to staff led by Ahmed. (Ahmed confirms this.) "I should've been more careful," Orhan says.
Tangled web.
The locations of a journal's publisher, editor, and bank account are often continents apart. Explore an interactive version of this map at http://scim.ag/OA-Sting.
Acceptance was the norm, not the exception. The paper was accepted by journals hosted by industry titans Sage and Elsevier. The paper was accepted by journals published by prestigious academic institutions such as Kobe University in Japan. It was accepted by scholarly society journals. It was even accepted by journals for which the paper's topic was utterly inappropriate, such as the Journal of Experimental & Clinical Assisted Reproduction.
The rejections tell a story of their own. Some open-access journals that have been criticized for poor quality control provided the most rigorous peer review of all. For example, the flagship journal of the Public Library of Science, PLOS ONE, was the only journal that called attention to the paper's potential ethical problems, such as its lack of documentation about the treatment of animals used to generate cells for the experiment. The journal meticulously checked with the fictional authors that this and other prerequisites of a proper scientific study were met before sending it out for review. PLOS ONE rejected the paper 2 weeks later on the basis of its scientific quality.
Down the rabbit hole
The story begins in July 2012, when the Science editorial staff forwarded to me an e-mail thread from David Roos, a biologist at the University of Pennsylvania. The thread detailed the publication woes of Aline Noutcha, a biologist at the University of Port Harcourt in Nigeria. She had taken part in a research workshop run by Roos in Mali in January last year and had been trying to publish her study of Culex quinquefasciatus, a mosquito that carries West Nile virus and other pathogens.
Noutcha had submitted the paper to an open-access journal called Public Health Research. She says that she believed that publication would be free. A colleague at her university had just published a paper for free in another journal from the same publisher: Scientific & Academic Publishing Co. (SAP), whose website does not mention fees. After Noutcha's paper was accepted, she says, she was asked to pay a $150 publication fee: a 50% discount because she is based in Nigeria. Like many developing world scientists, Noutcha does not have a credit card, and international bank transfers are complicated and costly. She eventually convinced a friend in the United States to pay a fee further reduced to $90 on her behalf, and the paper was published.
Peer review reviewed.
Few journals did substantial review that identified the paper's flaws.
Roos complained that this was part of a trend of deceptive open-access journals "parasitizing the scientific research community." Intrigued, I looked into Scientific & Academic Publishing. According to its website, "SAP serves the world's research and scholarly communities, and aims to be one of the largest publishers for professional and scholarly societies." Its list includes nearly 200 journals, and I randomly chose one for a closer look. The American Journal of Polymer Science describes itself as "a continuous forum for the dissemination of thoroughly peer-reviewed, fundamental, international research into the preparation and properties of macromolecules." Plugging the text into an Internet search engine, I quickly found that portions had been cut and pasted from the website of the Journal of Polymer Science, a respected journal published by Wiley since 1946.
I began to wonder if there really is anything American about the American Journal of Polymer Science. SAP's website claims that the journal is published out of Los Angeles. The street address appears to be no more than the intersection of two highways, and no phone numbers are listed.
I contacted some of the people listed as the journal's editors and reviewers. The few who replied said they have had little contact with SAP. Maria Raimo, a chemist at the Institute of Chemistry and Technology of Polymers in Naples, Italy, had received an e-mail invitation to be a reviewer 4 months earlier. To that point, she had received a single paper—one so poor that "I thought it was a joke," she says.
Despite her remonstrations to the then–editor-in-chief, a person of unknown affiliation called David Thomas, the journal published the paper. Raimo says she asked to be removed from the masthead. More than a year later, the paper is still online and the journal still lists Raimo as a reviewer.
After months of e-mailing the editors of SAP, I finally received a response. Someone named Charles Duke reiterated—in broken English—that SAP is an American publisher based in California. His e-mail arrived at 3 a.m., Eastern time.
To replicate Noutcha's experience, I decided to submit a paper of my own to an SAP journal. And to get the lay of this shadowy publishing landscape, I would have to replicate the experiment across the entire open-access world.
The targets
The Who's Who of credible open-access journals is the Directory of Open Access Journals (DOAJ). Created 10 years ago by Lars Bjørnshauge, a library scientist at Lund University in Sweden, the DOAJ has grown rapidly, with about 1000 titles added last year alone. Without revealing my plan, I asked DOAJ staff members how journals make it onto their list. "The title must first be suggested to us through a form on our website," explained DOAJ's Linnéa Stenson. "If a journal hasn't published enough, we contact the editor or publisher and ask them to come back to us when the title has published more content." Before listing a journal, they review it based on information provided by the publisher. On 2 October 2012, when I launched my sting, the DOAJ contained 8250 journals and abundant metadata for each one, such as the name and URL of the publisher, the year it was founded, and the topics it covers.
There is another list—one that journals fear. It is curated by Jeffrey Beall, a library scientist at the University of Colorado, Denver. His list is a single page on the Internet that names and shames what he calls "predatory" publishers. The term is a catchall for what Beall views as unprofessional practices, from undisclosed charges and poorly defined editorial hierarchy to poor English—criteria that critics say stack the deck against non-U.S. publishers.
Like Batman, Beall is mistrusted by many of those he aims to protect. "What he's doing is extremely valuable," says Paul Ginsparg, a physicist at Cornell University who founded arXiv, the preprint server that has become a key publishing platform for many areas of physics. "But he's a little bit too trigger-happy."
I asked Beall how he got into academic crime-fighting. The problem "just became too bad to ignore," he replied. The population "exploded" last year, he said. Beall counted 59 predatory open-access publishers in March 2012. That figure had doubled 3 months later, and the rate has continued to far outstrip DOAJ's growth.
To generate a comprehensive list of journals for my investigation, I filtered the DOAJ, eliminating those not published in English and those without standard open-access fees. I was left with 2054 journals associated with 438 publishers. Beall's list, which I scraped from his website on 4 October 2012, named 181 publishers. The overlap was 35 publishers, meaning that one in five of Beall's "predatory" publishers had managed to get at least one of their journals into the DOAJ.
I further whittled the list by striking off publishers lacking a general interest scientific journal or at least one biological, chemical, or medical title. The final list of targets came to 304 open-access publishers: 167 from the DOAJ, 121 from Beall's list, and 16 that were listed by both. (Links to all the publishers, papers, and correspondence are available online at http://scim.ag/OA-Sting.)
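That winnowing amounts to simple set arithmetic. The sketch below is only an illustration of the logic, not the script actually used for the investigation; the file names, CSV format, and column name are assumptions made for the example.

```python
import csv

def load_publishers(path, column="publisher"):
    """Read a CSV export and return the set of publisher names it lists."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip() for row in csv.DictReader(f)}

# Hypothetical exports of the two lists as they stood in early October 2012.
doaj = load_publishers("doaj_filtered.csv")      # English-language, fee-charging journals only
beall = load_publishers("beall_2012-10-04.csv")  # scraped from Beall's web page

overlap = doaj & beall  # publishers appearing on both lists
print(f"DOAJ: {len(doaj)}  Beall: {len(beall)}  on both lists: {len(overlap)}")
# With the 2012 data described in the text, this yields 438, 181, and 35;
# striking off publishers with no suitable general-science, biological, chemical,
# or medical journal left 304 targets (167 DOAJ-only, 121 Beall-only, 16 on both).
```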
The bait
The goal was to create a credible but mundane scientific paper, one with such grave errors that a competent peer reviewer should easily identify it as flawed and unpublishable. Submitting identical papers to hundreds of journals would be asking for trouble. But the papers had to be similar enough that the outcomes could be compared across journals. So I created a scientific version of Mad Libs.
The paper took this form: Molecule X from lichen species Y inhibits the growth of cancer cell Z. To substitute for those variables, I created a database of molecules, lichens, and cancer cell lines and wrote a computer program to generate hundreds of unique papers. Other than those differences, the scientific content of each paper is identical.
The fictitious authors are affiliated with fictitious African institutions. I generated the authors, such as Ocorrafoo M. L. Cobange, by randomly permuting African first and last names harvested from online databases, and then randomly adding middle initials. For the affiliations, such as the Wassee Institute of Medicine, I randomly combined Swahili words and African names with generic institutional words and African capital cities. My hope was that using developing world authors and institutions would arouse less suspicion if a curious editor were to find nothing about them on the Internet.
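The generation step is easy to picture in code. The following is a minimal sketch of the Mad Libs approach described above, not the actual generator; the word lists are tiny invented placeholders standing in for the real databases of molecules, lichens, cell lines, names, and institutions.

```python
import random

# Placeholder substitution databases; the real ones were far larger.
MOLECULES   = ["cytotropin", "pellarin"]           # invented names, for illustration only
LICHENS     = ["Cladonia exampla", "Usnea ficta"]  # invented species names
CELL_LINES  = ["HeLa", "MCF-7"]                    # real cell lines, used here only as examples
FIRST_NAMES = ["Ocorrafoo", "Alimo"]
LAST_NAMES  = ["Cobange", "Atoa"]
INSTITUTES  = ["Wassee Institute of Medicine, Asmara"]

TITLE = "{molecule} isolated from {lichen} inhibits the growth of {cells} cells"

def make_paper(rng=random):
    """Fill the fixed template with randomly chosen variables; all other content stays identical."""
    author = "{} {}. {}".format(rng.choice(FIRST_NAMES),
                                rng.choice("ABCDEFGHIJKLMNOPQRSTUVWXYZ"),
                                rng.choice(LAST_NAMES))
    return {
        "title": TITLE.format(molecule=rng.choice(MOLECULES),
                              lichen=rng.choice(LICHENS),
                              cells=rng.choice(CELL_LINES)),
        "author": author,
        "affiliation": rng.choice(INSTITUTES),
    }

print(make_paper())
```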
The papers describe a simple test of whether cancer cells grow more slowly in a test tube when treated with increasing concentrations of a molecule. In a second experiment, the cells were also treated with increasing doses of radiation to simulate cancer radiotherapy. The data are the same across papers, and so are the conclusions: The molecule is a powerful inhibitor of cancer cell growth, and it increases the sensitivity of cancer cells to radiotherapy.
There are numerous red flags in the papers, with the most obvious in the first data plot. The graph's caption claims that it shows a "dose-dependent" effect on cell growth—the paper's linchpin result—but the data clearly show the opposite. The molecule is tested across a staggering five orders of magnitude of concentrations, all the way down to picomolar levels. And yet, the effect on the cells is modest and identical at every concentration.
One glance at the paper's Materials & Methods section reveals the obvious explanation for this outlandish result. The molecule was dissolved in a buffer containing an unusually large amount of ethanol. The control group of cells should have been treated with the same buffer, but they were not. Thus, the molecule's observed "effect" on cell growth is nothing more than the well-known cytotoxic effect of alcohol.
The second experiment is more outrageous. The control cells were not exposed to any radiation at all. So the observed "interactive effect" is nothing more than the standard inhibition of cell growth by radiation. Indeed, it would be impossible to conclude anything from this experiment.
To ensure that the papers were both fatally flawed and credible submissions, two independent groups of molecular biologists at Harvard University volunteered to be virtual peer reviewers. Their first reaction, based on their experience reviewing papers from developing world authors, was that my native English might raise suspicions. So I translated the paper into French with Google Translate, and then translated the result back into English. After correcting the worst mistranslations, the result was a grammatically correct paper with the idiom of a non-native speaker.
The researchers also helped me fine-tune the scientific flaws so that they were both obvious and "boringly bad." For example, in early drafts, the data were so unexplainably weird that they became "interesting"—perhaps suggesting the glimmer of a scientific breakthrough. I dialed those down to the sort of common blunders that a peer reviewer should easily interdict.
The paper's final statement should chill any reviewer who reads that far. "In the next step, we will prove that molecule X is effective against cancer in animal and human. We conclude that molecule X is a promising new drug for the combined-modality treatment of cancer." If the scientific errors aren't motivation enough to reject the paper, its apparent advocacy of bypassing clinical trials certainly should be.
The sting
Between January and August of 2013, I submitted papers at a rate of about 10 per week: one paper to a single journal for each publisher. I chose journals that most closely matched the paper's subject. First choice would be a journal of pharmaceutical science or cancer biology, followed by general medicine, biology, or chemistry. In the beginning, I used several Yahoo e-mail addresses for the submission process, before eventually creating my own e-mail service domain, afra-mail.com, to automate submission.
A handful of publishers required a fee be paid up front for paper submission. I struck them off the target list. The rest use the standard open-access "gold" model: The author pays a fee if the paper is published.
If a journal rejected the paper, that was the end of the line. If a journal sent review comments that asked for changes to layout or format, I complied and resubmitted. If a review addressed any of the paper's serious scientific problems, I sent the editor a "revised" version that was superficially improved—a few more photos of lichens, fancier formatting, extra details on methodology—but without changing any of the fatal scientific flaws.
After a journal accepted a paper, I sent a standard e-mail to the editor: "Unfortunately, while revising our manuscript we discovered an embarrassing mistake. We see now that there is a serious flaw in our experiment which invalidates the conclusions." I then withdrew the paper.
The results
By the time Science went to press, 157 of the journals had accepted the paper and 98 had rejected it. Of the remaining 49 journals, 29 seem to be derelict: websites abandoned by their creators. Editors from the other 20 had e-mailed the fictitious corresponding authors stating that the paper was still under review; those, too, are excluded from this analysis. Acceptance took 40 days on average, compared to 24 days to elicit a rejection.
Of the 255 papers that underwent the entire editing process to acceptance or rejection, about 60% of the final decisions occurred with no sign of peer review. For rejections, that's good news: It means that the journal's quality control was high enough that the editor examined the paper and declined it rather than send it out for review. But for acceptances, it likely means that the paper was rubber-stamped without being read by anyone.
Of the 106 journals that discernibly performed any review, 70% ultimately accepted the paper. Most reviews focused exclusively on the paper's layout, formatting, and language. This sting did not waste the time of many legitimate peer reviewers. Only 36 of the 304 submissions generated review comments recognizing any of the paper's scientific problems. And 16 of those papers were accepted by the editors despite the damning reviews.
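As a back-of-the-envelope check, the headline counts quoted above fit together as follows. This tally is mine, reconstructed from the numbers reported in the text rather than taken from the study's own records, and the 70% figure is the rounded value given above.

```python
# Back-of-the-envelope check of the counts quoted in the text.
submitted = 304
accepted  = 157
rejected  = 98
derelict  = 29   # abandoned journal websites
pending   = 20   # still "under review" at press time

decided = accepted + rejected
assert decided == 255 and submitted == decided + derelict + pending

reviewed          = 106                       # journals showing any discernible sign of review
reviewed_accepted = round(0.70 * reviewed)    # "70% ultimately accepted the paper"
no_review         = decided - reviewed        # decisions with no sign of peer review

print(f"{decided} decisions; {no_review} ({no_review/decided:.0%}) showed no sign of review")
print(f"{reviewed_accepted} of the {reviewed} reviewing journals still accepted the paper")
```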
The results show that Beall is good at spotting publishers with poor quality control: For the publishers on his list that completed the review process, 82% accepted the paper. Of course that also means that almost one in five on his list did the right thing—at least with my submission. A bigger surprise is that for DOAJ publishers that completed the review process, 45% accepted the bogus paper. "I find it hard to believe," says Bjørnshauge, the DOAJ founder. "We have been working with the community to draft new tighter criteria for inclusion." Beall, meanwhile, notes that in the year since this sting began, "the number of predatory publishers and predatory journals has continued to escalate at a rapid pace."
A striking picture emerges from the global distribution of open-access publishers, editors, and bank accounts. Most of the publishing operations cloak their true geographic location. They create journals with names like the American Journal of Medical and Dental Sciences or the European Journal of Chemistry to imitate—and in some cases, literally clone—those of Western academic publishers. But the locations revealed by IP addresses and bank invoices are continents away: Those two journals are published from Pakistan and Turkey, respectively, and both accepted the paper. The editor-in-chief of the European Journal of Chemistry, Hakan Arslan, a professor of chemistry at Mersin University in Turkey, does not see this as a failure of peer review but rather a breakdown in trust. When a paper is submitted, he writes in an e-mail, "We believe that your article is original and [all of] your supplied information is correct." The American Journal of Medical and Dental Sciences did not respond to e-mails.
About one-third of the journals targeted in this sting are based in India—overtly or as revealed by the location of editors and bank accounts—making it the world's largest base for open-access publishing. Among the India-based journals in my sample, 64 accepted the fatally flawed paper and only 15 rejected it. (Explore a global wiring diagram of open-access publishing at http://scim.ag/OA-Sting.)
But even when editors and bank accounts are in the developing world, the company that ultimately reaps the profits may be based in the United States or Europe. In some cases, academic publishing powerhouses sit at the top of the chain.
Journals published by Elsevier, Wolters Kluwer, and Sage all accepted my bogus paper. Wolters Kluwer Health, the division responsible for the Medknow journals, "is committed to rigorous adherence to the peer-review processes and policies that comply with the latest recommendations of the International Committee of Medical Journal Editors and the World Association of Medical Editors," a Wolters Kluwer representative states in an e-mail. "We have taken immediate action and closed down the Journal of Natural Pharmaceuticals."
In 2012, Sage was named the Independent Publishers Guild Academic and Professional Publisher of the Year. The Sage publication that accepted my bogus paper is the Journal of International Medical Research. Without asking for any changes to the paper's scientific content, the journal sent an acceptance letter and an invoice for $3100. "I take full responsibility for the fact that this spoof paper slipped through the editing process," writes Editor-in-Chief Malcolm Lader, a professor of psychopharmacology at King's College London and a fellow of the Royal Society of Psychiatrists, in an e-mail. He notes, however, that acceptance would not have guaranteed publication: "The publishers requested payment because the second phase, the technical editing, is detailed and expensive. … Papers can still be rejected at this stage if inconsistencies are not clarified to the satisfaction of the journal." Lader argues that this sting has a broader, detrimental effect as well. "An element of trust must necessarily exist in research including that carried out in disadvantaged countries," he writes. "Your activities here detract from that trust."
The Elsevier journal that accepted the paper, Drug Invention Today, is not actually owned by Elsevier, says Tom Reller, vice president for Elsevier global corporate relations: "We publish it for someone else." In an e-mail to Science, the person listed on the journal's website as editor-in-chief, Raghavendra Kulkarni, a professor of pharmacy at the BLDEA College of Pharmacy in Bijapur, India, stated that he has "not had access to [the] editorial process by Elsevier" since April, when the journal's owner "started working on [the] editorial process." "We apply a set of criteria to all journals before they are hosted on the Elsevier platform," Reller says. As a result of the sting, he says, "we will conduct another review."
The editor-in-chief of the Kobe Journal of Medical Sciences, Shun-ichi Nakamura, a professor of medicine at Kobe University in Japan, did not respond to e-mails. But his assistant, Reiko Kharbas, writes that "Upon receiving the letter of acceptance, Dr. Obalanefah withdrew the paper," referring to the standard final e-mail I sent to journals that accepted the paper. "Therefore, the letter of acceptance we have sent … has no effect whatsoever."
Other publishers are glad to have dodged the bullet. "It is a relief to know that our system is working," says Paul Peters, chief strategy officer of Hindawi, an open-access publisher in Cairo. Hindawi is an enormous operation: a 1000-strong editorial staff handling more than 25,000 articles per year from 559 journals. When Hindawi began expanding into open-access publishing in 2004, Peters admits, "we looked amateurish." But since then, he says, "publication ethics" has been their mantra. Peer reviewers at one Hindawi journal, Chemotherapy Research and Practice, rejected my paper after identifying its glaring faults. An editor recommended I try another Hindawi journal, ISRN Oncology; it, too, rejected my submission.
Coda
From the start of this sting, I have conferred with a small group of scientists who care deeply about open access. Some say that the open-access model itself is not to blame for the poor quality control revealed by Science's investigation. If I had targeted traditional, subscription-based journals, Roos told me, "I strongly suspect you would get the same result."* But open access has multiplied that underclass of journals, and the number of papers they publish. "Everyone agrees that open-access is a good thing," Roos says. "The question is how to achieve it."
The most basic obligation of a scientific journal is to perform peer review, arXiv founder Ginsparg says. He laments that a large proportion of open-access scientific publishers "clearly are not doing that." Ensuring that journals honor their obligation is a challenge that the scientific community must rise to. "Journals without quality control are destructive, especially for developing world countries where governments and universities are filling up with people with bogus scientific credentials," Ginsparg says.
As for the publisher that got Aline Noutcha to pony up a publication fee, the IP addresses in the e-mails from Scientific & Academic Publishing reveal that the operation is based in China, and the invoice they sent me asked for a direct transfer of $200 to a Hong Kong bank account.
The invoice arrived with good news: After a science-free review process, one of their journals—the International Journal of Cancer and Tumor—accepted the paper. Posing as lead author Alimo Atoa, I requested that it be withdrawn. I received a final message that reads like a surreal love letter from one fictional character to another:
Dear Alimo Atoa,
We fully respect your choice and withdraw your artilce.
If you are ready to publish your paper,please let me know and i will be at your service at any time.
Sincerely yours, Grace Groovy
* Correction on 3 Oct. 2013: This sentence was clarified to better reflect Roos's view.