Paper investigates: is your most cited work your best work?

By Lilian Nassi-Calò

In a recently published post on the SciELO in Perspective blog1, a study proposing a taxonomy of motives for citing publications in scientific articles was analyzed. The authors suggested that the range of motives that lead authors to cite one publication or another, regardless of field of knowledge, can be grouped into four main categories and several subcategories.

Lately, there has been much criticism of the use of citation-based impact indicators to evaluate researchers, research projects, hiring decisions and career promotions. The limitations of the Impact Factor are numerous, and more comprehensive and reliable alternative metrics at the article level are now available, taking into account download data, sharing on social networks and coverage in print and online media.

Scientometrics, however, has relied largely on citation-based metrics for about half a century. Brazil, whose journals represent only 1% of all titles indexed in the Journal Citation Reports, bases the evaluation of the scientific production of its graduate programs primarily on the Impact Factor measured in the Web of Science database (Thomson Reuters).

Two studies published in this week’s edition of Nature discuss citations and scientific impact. The first, authored by Richard Van Noorden and coworkers2, analyzed the 100 most cited articles of all time, based on the Science Citation Index (Web of Science, Thomson Reuters), for their contribution to science. Citation counts for these top 100 papers range from 12,000 to 305,000, and the analysis reveals that most of these highly cited papers describe biological and biochemical techniques, two of which resulted in Nobel Prizes. Also present in the top-100 list are articles on bioinformatics, phylogenetics (the study of evolutionary relationships among species), statistics, density functional theory (the software-based study of the behavior of electrons in a given material, used to predict its properties) and crystallography. It is noteworthy, however, that articles reporting genuine breakthroughs received relatively few citations. The discovery by Watson and Crick3 of the DNA double-helix structure in 1953, perhaps the greatest twentieth-century advance in biochemistry and the basis of a Nobel Prize, received only about 5,000 citations. Another example is the article by Farman, Gardiner and Shanklin4 reporting the discovery of the hole in the ozone layer in 1985, which received only 1,871 citations.

The second article, by John Ioannidis and coworkers5, reports a study in which the most prolific authors in the biomedical sciences were asked to score their ten most cited articles along six dimensions. In this way, the authors set out to answer questions such as: Are the most cited articles the most important ones? Does science progress mainly through evolution or through revolution? Are these mutually exclusive or complementary processes, and which one is best reflected by high citation rates? Were surprising results harder to publish? The study has many interesting findings; however, instead of answering these questions, it raises even more.

The research methodology consisted of sending questionnaires to the 400 most cited authors in the biomedical sciences in the 1996-2011 period, asking about their 10 most cited articles published from 2005 to 2008. The authors were asked to score each article from zero to 100 in six dimensions, according to its influence and impact. The restricted publication window avoids two sources of bias: highly cited older articles become stereotyped and are often treated as canonical, while more recent work has not yet had time to accrue enough citations.

About one third of the researchers – 123 of them – responded to the questionnaire, covering 1,214 papers overall. The questionnaire asked the authors to rate their work in six dimensions: Continuous Progress, Broader Interest and Greater Synthesis, classified as Measures of Evolution; and Disruptive Innovation and Surprise, classified as Measures of Revolution. Publication Difficulty was not assigned to either group, but correlated mostly with the Measures of Revolution. Researchers tended to give their most influential articles high ratings in the first three dimensions, and articles with fewer citations high ratings in the last three. For most of them, the most influential articles were published easily, with some exceptions. The mean and median self-attributed scores stood around 50 for the first three dimensions and between 20 and 40 for the last three.
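To make the scoring scheme concrete, the per-dimension summary described above can be sketched in a few lines of Python. The scores below are entirely made up for illustration; only the six dimension names and the 0-100 scale come from the study, and the fabricated values merely mimic the reported pattern (means and medians around 50 for the Measures of Evolution, 20-40 for the rest).

```python
# Illustrative sketch only: hypothetical self-assessment scores (0-100),
# not the study's actual data. Dimension names follow Ioannidis et al.
from statistics import mean, median

scores = {
    "Continuous Progress":    [55, 48, 52, 45],
    "Broader Interest":       [50, 53, 47, 51],
    "Greater Synthesis":      [49, 52, 46, 54],
    "Disruptive Innovation":  [30, 25, 38, 22],
    "Surprise":               [35, 28, 31, 26],
    "Publication Difficulty": [20, 33, 27, 40],
}

# Summarize each dimension the way the study reports its results.
for dimension, values in scores.items():
    print(f"{dimension}: mean={mean(values):.1f}, median={median(values):.1f}")
```

With real survey data, the same grouping by dimension would reproduce the study's headline contrast between the evolution- and revolution-related scores.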

Only 20 researchers (16%) indicated that their most relevant papers published in 2005-2008 were not among their 10 most cited articles. Even so, these papers still rank among the most cited articles of the period. Fifty-two papers were evaluated by at least two authors, indicating co-authorship. Agreement between co-authors on the category in which an article was ranked and on the score assigned ranged from 74% to 86% across the six dimensions, although the limited sample size must be taken into account.

The authors expected that papers would be self-assessed as either evolutionary or revolutionary in nature, but not both. The survey results do not fully support this, since the scores in the Broader Interest category correlated strongly with both the Measures of Revolution and the Measures of Evolution.

The study has limitations, according to its authors. First, only about 30% of the authors responded to the survey, and the non-respondents might have given higher scores to the Measures of Revolution or Evolution. Second, the publication period was deliberately restricted to give uniformity to the data. Third, authors may evaluate their own work more positively than others would. Fourth, the sample is composed of researchers whose work has been widely accepted (and cited); researchers with innovative ideas that were not well received were not included. The authors believe that the mark of evolution would be more frequent among moderately cited works.

The research by Ioannidis and coworkers confirms the wealth of information contained in citation analysis and how much of scientometrics remains to be studied. The authors speculate: how can an innovative paper be identified early? Does such an article make connections between knowledge areas that are not typically linked, or is it cited by papers in remote areas? These and many other questions emerge from this study. According to the authors, one way to answer them would be to survey the authors who cite highly cited articles, or to evaluate the profile of moderately cited articles. What the authors are certain of, however, is the need to complement citation-based metrics with other indexes in the assessment of science.


1 Study proposes a taxonomy of motives to cite articles in scientific publications. SciELO in Perspective. [viewed 01 November 2014]. Available from:

2 VAN NOORDEN, R., MAHER, B., and NUZZO, R. The top 100 papers. Nature. 2014, vol. 514, nº 7524, pp. 550-553. DOI: 10.1038/514550a. Available from:

3 WATSON, J.D., and CRICK, F.H.C. Molecular Structure of Nucleic Acids: A Structure for Deoxyribose Nucleic Acid. Nature. 1953, vol. 171, nº 4356, pp. 737-738. DOI: 10.1038/171737a0. Available from:

4 FARMAN, J.C., GARDINER, B.G., and SHANKLIN, J.D. Large losses of total ozone in Antarctica reveal seasonal ClOx/NOx interaction. Nature. 1985, vol. 315, pp. 207-210. DOI: 10.1038/315207a0. Available from:

5 IOANNIDIS, J.P.A., et al. Bibliometrics: Is your most cited work your best? Nature. 2014, vol. 514, nº 7524, pp. 561-562. DOI: 10.1038/514561a. Available from:


Altmetrics, Alternative metrics and Alternative measurements: new perspectives on the visibility and impact of scientific research. SciELO in Perspective. [viewed 01 November 2014]. Available from:

Rise of the Rest: The Growing Impact of Non-Elite Journals – Originally published on Google Scholar Blog on October 8, 2014. SciELO in Perspective. [viewed 01 November 2014]. Available from:

Declaration recommends eliminate the use of Impact factor for research evaluation. SciELO in Perspective. [viewed 01 November 2014]. Available from:

Article downloads: An alternative indicator of national research impact and cross-sector knowledge exchange – Originally published on the Elsevier newsletter "Research Trends Issue 36". SciELO in Perspective. [viewed 01 November 2014]. Available from:

Interview with Vincent Larivière. SciELO in Perspective. [viewed 01 November 2014]. Available from:

IOANNIDIS, J.P.A., et al. Supplementary information to: Is your most-cited work your best? Nature. 2014, vol. 514, nº 7524, pp. 561–562. DOI: 10.1038/514561a. Available from:!/suppinfoFile/514561a_s1.pdf

What can alternative metrics – or altmetrics – offer us?. SciELO in Perspective. [viewed 01 November 2014]. Available from:

Publish or perish? The rise of the fractional author… – Originally published on the Elsevier newsletter "Research Trends Issue 38". SciELO in Perspective. [viewed 01 November 2014]. Available from:


About Lilian Nassi-Calò

Lilian Nassi-Calò studied chemistry at Instituto de Química – USP, holds a doctorate in Biochemistry from the same institution, and did post-doctoral work as an Alexander von Humboldt fellow in Wuerzburg, Germany. After her studies, she was a professor and researcher at IQ-USP. She also worked as an industrial chemist and is presently Coordinator of Scientific Communication at BIREME/PAHO/WHO and a collaborator of SciELO.


Translated from the original in Portuguese by Lilian Nassi-Calò.


How to cite this post [ISO 690/2010]:

NASSI-CALÒ, L. Paper investigates: is your most cited work your best work? [online]. SciELO in Perspective, 2014 [viewed ]. Available from:

