Impact – Nature’s Viewpoint: comments on special issue 502 (7471) 17th October, 2013

Impact: the search for the science that matters is the principal theme of this special issue of Nature, in which six articles written by journalists and researchers examine whether the current systems for evaluating science really select research that is truly influential and of the greatest benefit to the society which, after all, finances it through taxes.

Three principal lines of argument are highlighted in this analysis:

  • The Impact Factor (IF), which measures how frequently a journal’s articles are cited in a given year or period, works well enough but falls short overall. In other words, the Journal Impact Factor continues to be considered a relevant criterion by the agencies responsible for evaluating research performance, and is used worldwide wherever such evaluations are carried out. However, there is a consensus that many aspects of the evaluation process slip through the net when this “objective mathematical paradigm” is applied.
  • Research funding agencies are beginning to incorporate alternative ways of measuring the value of the journals that publish research output into the procedures they use to evaluate institutions and projects. Without dismissing these objective methods and their importance in the professional careers of researchers, agencies are beginning to evaluate the “social impact” of the research they fund. This new direction is in line with the creed of “political correctness” adopted by industrial and commercial companies under the heading of “social engagement”: being part of a society and wishing to gain public acceptance, they must demonstrate that what they do is genuinely necessary, useful and applicable to the man in the street who pays the taxes.
  • Even though Open Access publications are approaching 50% of published output, access to citation databases and “scientometric” information is not equally open: over 95% of this information is accessible by subscription only, from two commercial providers, Thomson Reuters and Elsevier. Projects still under development are put forward to achieve Open Access to citations, and with them comes a pending task: how should works be cited that are not papers in the strict sense of the word, but appear in other formats such as videos, microscope slides, runs of meteorological data, the genome, etc.?

The present issue, which we will call “Impact – Nature’s Viewpoint”, begins with an editorial¹ which accepts that those who carry out evaluations are beginning to understand that the conventional methodology (citation metrics, the Journal Impact Factor, and/or the opinions of colleagues) is necessary but insufficient, and should incorporate other aspects, among them elements from the new field of “altmetrics”. This means that what counts as important should be measured beyond the confines of academia, and must include the impact of research on decision makers in policy and health, and its effects on industry, the economy and social players. Conventional metrics must not continue to be used blindly and inconsistently, since that would end up putting at risk the very reputation of those carrying out the evaluation process.

The challenge therefore is to find a balance between the different aspects of the evaluation process. If the emphasis is put on publication in high-impact journals, academics will feel under undue pressure to publish in order to advance their careers. If, on the other hand, the emphasis is put on economic and social impact, they will divert time away from academic and research activities per se towards the bureaucratic and legal red tape of formulating patents or ill-considered spin-out companies. It is therefore important that those who carry out evaluations have clear and transparent procedures and explicit methods for measuring impact.

“Judgement Day²” (for universities) is the first article following the editorial. It reports that at the end of the 1980s the governments of the UK (the first to do so) and, shortly afterwards, the USA initiated programs for the systematic evaluation of their universities. The governments of Australia, Germany, Italy and others then followed suit with similar methodologies. Although these evaluations have helped to improve national research performance, increasing these countries’ presence in the citation indexes, the enormous evaluation effort required, seen from a cost–benefit viewpoint, has led to critical voices among researchers, university administrators and leaders in the field of education.

The possibility that researchers who publish in the most prestigious journals thereby join a sort of “Golden Club³”, which may well open doors in their professional careers, is currently under discussion. One reason behind this criticism is the growing worldwide output of works that are not published in “star” journals such as Science, Cell and Nature. Although the number of submissions to these journals has increased by a little more than 40% over the last 16 years (along with their rejection rates), submissions worldwide have increased by some 86%, which suggests that researchers are going elsewhere to get published. And although these “top” journals publish more highly cited works each year, their growth does not keep pace with the rest of the industry: their share of the total of highly cited works is declining in favor of other journals, many of which are Open Access.

Research practices suffer because citation data⁴ are not freely available, which brings us to the fourth article in Nature. It is a paradox that, in the age of free access to information, the citations in major journals, which are the nucleus of academic evaluations, are not freely available. Citations to Open Access journals are likewise unavailable, because no institution is compiling and organizing them in a usable way.

Citation analyses always end up using WoS and Scopus, from the two commercial vendors whose license terms present barriers to the dissemination of the citation data underlying these analyses.

There are various open sources of citations. The two principal ones are CiteSeerX, primarily in Computer Science, and CitEc in Economics (years ago there was also the Citebase database of citations, but unfortunately it went offline). It is for this reason that the author of the article presents the Open Citations Corpus (OCC), a project for Open Access citation sources which has been in a pilot phase over the past two years.

If we sum up the references available in the services mentioned, and include those available in PubMed and in OCC, they barely cover 4% of the estimated 50 million existing bibliographic references to journal articles and books. Google Scholar and Microsoft Academic Search have certainly indexed a great deal of other types of literature in recent years, but their results are subject to various criticisms, among them the lack of consistency in the results of the same search run on different days.

OCC has established agreements with several of the world’s most important journal publishers (commercial and Open Access) so that in the future bibliographic reference data may be collected and, like abstracts, made available in Open Access. The references will be harvested centrally through CrossRef, so publishers will have to publish their articles with DOIs as participants in the CitedBy Linking service; the only condition is that publishers give their consent to include the new metadata for this purpose.
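To illustrate what centrally harvested, openly licensed references make possible, the sketch below parses a record in the JSON shape returned by CrossRef’s public REST API (`https://api.crossref.org/works/{doi}`), where deposited references appear under the `reference` field. This is a minimal sketch using only the Python standard library; the sample record and the `extract_references` helper are invented for illustration and are not part of OCC or CrossRef themselves.

```python
import json

# A trimmed, invented sample of the JSON shape returned by CrossRef's
# REST API at https://api.crossref.org/works/{doi}. The "reference"
# field is only present when the publisher has deposited its reference
# list (as OCC's agreements would require).
SAMPLE_RESPONSE = json.dumps({
    "message": {
        "DOI": "10.1000/example.doi",  # hypothetical DOI
        "title": ["An example article"],
        "reference": [
            # A reference may carry a resolvable DOI...
            {"key": "ref1", "DOI": "10.1038/502271a"},
            # ...or only an unstructured citation string.
            {"key": "ref2", "unstructured": "Smith, J. (2010). Example."},
        ],
    }
})


def extract_references(response_text):
    """Return the cited DOIs (or raw citation strings) from a CrossRef
    works record; an empty list if no references were deposited."""
    message = json.loads(response_text)["message"]
    cited = []
    for ref in message.get("reference", []):
        # Prefer the machine-actionable DOI when one is present.
        cited.append(ref.get("DOI") or ref.get("unstructured", ""))
    return cited


print(extract_references(SAMPLE_RESPONSE))
```

A live lookup would simply fetch `https://api.crossref.org/works/<doi>` with an HTTP GET and pass the response body to the same function; once such data are openly available, citation counts and other metrics can be computed without a subscription database.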

Once the citations are freely available, services can be developed to generate the desired metrics, including a service that could permit the correction of erroneous references by the interested parties.

Bibliographic references have not died. Quite the contrary: they are growing in such a way that they cover the entire spectrum of research, from lines of software code to video frames. This is the topic of the next article in Nature, titled “Referencing: The reuse factor⁵”.

We are entering the age of reuse. Researchers and evaluators are struggling to handle citations to published research results adequately, since citation is the way credit is formally given to research. Yet an ever greater amount of scientific output is published in other forms, such as genetic sequences, datasets in economics and meteorology, clinical trials without significant results, etc.

This information is being compiled, for example, in the Dryad Digital Repository and GenBank, and also in Figshare. None of it is found in WoS or Scopus. It is in this context that the Research Data Alliance (RDA) project was created, in August 2012, by various agencies in the USA, Europe and Australia. The objective of the RDA is to accelerate and facilitate the interchange of data between disciplines whose unconventional outputs funding agencies increasingly mandate to be cited. Examples of such unconventional data include columns in spreadsheets and image sequences in a video; references to them should not be lost among the citations of the published article.

The exponentially growing number of citations giving credit to a wide variety of results in different forms will not mean the end of the conventional bibliographic citation, of course, but it will mean the decline of the classic measure of impact, which counts only citations in written works.

Finally, the article Impact: Pack a punch⁶ rounds off this reflection by indicating that reviewers and evaluators of research funding will increasingly have to focus on the social impact of academic research project proposals.

Public and private entities around the world will have to redefine the concept of impact and the types of impact because the different stakeholders vary widely across the range of human activity, whether in health, laboratories, the economy, education or any other group in society.

External links

CiteSeerX –

CitEc –

Open Citations Corpus –

PubMed –

Google Scholar –

Microsoft Academic Search –

CrossRef –

CitedBy Linking –

Dryad Digital Repository –

GenBank –

Figshare –

Research Data Alliance –

Research evaluation: Impact –


¹ The maze of impact metrics

² Research assessments: Judgement day

³ Science publishing: The golden club

⁴ Publishing: Open citations

⁵ Referencing: The reuse factor

⁶ Impact: Pack a punch


Nature: Impact, 2013, vol. 502, nº 7471, pp. 271-402. Available from: <>.


About Ernesto Spinak

Collaborator on the SciELO program, a Systems Engineer with a Bachelor’s degree in Library Science, a Diploma of Advanced Studies from the Universitat Oberta de Catalunya (Barcelona, Spain), and a Master’s in “Sociedad de la Información” (Information Society) from the same university. He currently runs a consulting company that provides services in information projects to 14 government institutions and universities in Uruguay.


Translated from the original in Spanish by Nicholas Cop Consulting.


How to cite this post [ISO 690/2010]:

SPINAK, E. Impact – Nature’s Viewpoint: comments on special issue 502 (7471) 17th October, 2013 [online]. SciELO in Perspective, 2013 [viewed ]. Available from:

