2013-09-03

The impact factor of an academic journal is a measure of the average number of citations received by articles in the journal. In the academic world it is always better to have your papers published in a “high impact” journal, and for editors and publishing houses the impact factor is directly related to revenues. Over the years, the methods of inflating citation counts – not self-citations, but citations from articles in other journals, preferably those of a different publishing house – and thereby journal impact factors have become increasingly sophisticated.

Citation stacking is the practice in which editors of journals – sometimes even from quite different publishing houses – collude to ensure that articles in their journals cite articles in the others. This has been going on for some time, and a year ago Times Higher Education (THE) reported that Thomson Reuters had suspended 26 journals for citation stacking.

“Anomalous citation patterns” is a euphemism for excessive citation of other articles published in the same journal. It is generally assumed to be a ruse to boost a journal’s impact factor, which is a measure of the average number of citations garnered by articles in the journal over the previous two years.
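For concreteness, the standard two-year impact factor can be written out as follows (a sketch of the usual JCR definition; the symbols are labels of my own choosing):

```latex
% Two-year impact factor of a journal for year y:
%   C_y(t) = citations received in year y by items the journal published in year t
%   N_t    = number of citable items (articles, reviews) the journal published in year t
\[
  \mathrm{IF}_y \;=\; \frac{C_y(y-1) + C_y(y-2)}{N_{y-1} + N_{y-2}}
\]
```

So a journal whose 2011–2012 output of 200 citable items drew 500 citations during 2013 would receive a 2013 impact factor of 2.5.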

Impact factors are often used, controversially, as a proxy for journal quality and, even more contentiously, for the quality of individual papers published in the journal and even of the people who write them.

When Thomson Reuters discovers that anomalous citation has had a significant effect on a journal’s impact factor, it bans the journal for two years from its annual Journal Citation Reports (JCR), which publishes up-to-date impact factors.

In Brazil the Ministry of Education is obsessed with the impact factor. In consequence, publishing in high-impact journals has become a matter of survival in academia. Journals have been caught in a vicious cycle: nobody wants to publish in their pages because their impact factor is too low, and their impact factor falls further because they attract too few citations.

“By 2009, editors of eight Brazilian journals decided to take measures into their own hands”. 

Ciencia Brasil has been pointing out dubious cases in Brazilian journals for some time. The Brazilian scam has now reached the pages of Nature, as Thomson Reuters suspends some Brazilian journals from its rankings for ‘citation stacking’:

Brazilian citation scheme outed

Mauricio Rocha-e-Silva thought that he had spotted an easy way to raise the profiles of Brazilian journals. From 2009, he and several other editors published articles containing hundreds of references to papers in each others’ journals — in order, he says, to elevate the journals’ impact factors.

Because each article avoided citing papers published by its own journal, the agreement flew under the radar of analyses that spot extremes in self-citation — until 19 June, when the pattern was discovered. Thomson Reuters, the firm that calculates and publishes the impact factor, revealed that it had designed a program to spot concentrated bursts of citations from one journal to another, a practice that it has dubbed ‘citation stacking’. Four Brazilian journals were among 14 to have their impact factors suspended for a year for such stacking. And in July, Rocha-e-Silva was fired from his position as editor of one of them, the journal Clinics, based in São Paulo.

… Editors have tried before to artificially boost impact factors, usually by encouraging the citation of a journal’s own papers. Each year, Thomson Reuters detects and cracks down on excessive self-citation. This year alone, it red-flagged 23 more journals for the wearily familiar practice. But the revelation that journals have gained excessively from citations elsewhere suggests that some editors may be searching for less detectable ways to boost their journals’ profiles. In some cases, authors may be responsible for stacking, perhaps trying to boost citations of their own papers.

The journals flagged by the new algorithm extend beyond Brazil — but only in that case has an explanation for the results emerged. Rocha-e-Silva says the agreement grew out of frustration with his country’s fixation on impact factor. In Brazil, an agency in the education ministry, called CAPES, evaluates graduate programmes in part by the impact factors of the journals in which students publish research. As emerging Brazilian journals are in the lowest ranks, few graduates want to publish in them. This vicious cycle, in his view, prevents local journals improving.

Abel Packer, who coordinates Brazil’s system of free government-sponsored journals, known as SciELO, says that the citation-stacking venture was “unfortunate and unacceptable”. But he adds that many editors have long been similarly critical of the CAPES policy, because it encourages local researchers to publish in high-impact journals, increasing the temptation for editors to artificially boost their own impact factors.

Nature 500, 510–511 (29 August 2013) 

Read the article: doi:10.1038/500510a
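The Nature piece describes a program that spots concentrated bursts of citations from one journal to another. Thomson Reuters has not published its method, but a minimal sketch of that kind of screen might look like the following – the function name, the 10% share threshold and the minimum-count cutoff are all illustrative assumptions, not the real algorithm:

```python
from collections import Counter

def flag_citation_stacking(citations, min_count=50, share_threshold=0.10):
    """Hypothetical screen for citation stacking.

    citations: iterable of (citing_journal, cited_journal) pairs for one
    JCR year. Flags journal pairs where a single source journal supplies
    an anomalously large share of another journal's incoming citations.
    """
    # Count citations per (source, target) pair, ignoring self-citations,
    # which the stacking scheme deliberately avoided.
    pair_counts = Counter((src, dst) for src, dst in citations if src != dst)
    # Total citations each journal received, from all sources.
    total_received = Counter(dst for _, dst in citations)

    flagged = []
    for (src, dst), n in pair_counts.items():
        share = n / total_received[dst]
        # A "concentrated burst": many citations, mostly from one journal.
        if n >= min_count and share >= share_threshold:
            flagged.append((src, dst, n, round(share, 3)))
    return flagged

if __name__ == "__main__":
    # Toy data: journal A showers journal B with citations.
    data = [("A", "B")] * 80 + [("C", "B")] * 20 + [("C", "D")] * 30
    print(flag_citation_stacking(data))  # -> [('A', 'B', 80, 0.8)]
```

Because the citations come from a different journal, a screen that looks only at self-citation rates would miss exactly this pattern, which is presumably why the new check was needed.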
