{"id":2551,"date":"2017-06-01T11:47:44","date_gmt":"2017-06-01T14:47:44","guid":{"rendered":"http:\/\/blog.scielo.org\/en\/?p=2551"},"modified":"2017-06-01T11:49:59","modified_gmt":"2017-06-01T14:49:59","slug":"the-myopia-of-bibliometric-indicators","status":"publish","type":"post","link":"https:\/\/blog.scielo.org\/en\/2017\/06\/01\/the-myopia-of-bibliometric-indicators\/","title":{"rendered":"The myopia of bibliometric indicators"},"content":{"rendered":"<p><strong>By Lilian Nassi-Cal\u00f2<\/strong><\/p>\n<div id=\"attachment_3519\" style=\"width: 310px\" class=\"wp-caption alignright\"><a href=\"http:\/\/blog.scielo.org\/wp-content\/uploads\/2017\/06\/bling.png\" target=\"_blank\"><img loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-3519\" class=\"wp-image-3519 size-medium\" title=\"Image: David Parkins.\" src=\"http:\/\/blog.scielo.org\/wp-content\/uploads\/2017\/06\/bling-300x247.png\" alt=\"Image: David Parkins.\" width=\"300\" height=\"247\" \/><\/a><p id=\"caption-attachment-3519\" class=\"wp-caption-text\">Image: <a href=\"http:\/\/www.nature.com\/news\/reviewers-are-blinkered-by-bibliometrics-1.21877\" target=\"_blank\">David Parkins<\/a>.<\/p><\/div>\n<p>The use of bibliometric indicators in science evaluation is a ubiquitous practice, despite the absence of an unequivocal relationship between citations and scientific quality, impact or merit. When considering innovation &#8211; the inherent characteristic of scientific research &#8211; the relationship is even more disconnected.<\/p>\n<p>This is the opinion of researchers at Georgia State University, Atlanta, GA, USA, and the Department of Managerial Economics, Strategy and Innovation at the University of Leuven, Belgium, according to a recent article in Nature<sup>1<\/sup>. 
Paula Stephan, Reinhilde Veugelers and Jian Wang have noted that members of science panels in various countries still largely use citation-based bibliometric indicators &#8211; such as the Journal Impact Factor, h-index, and citations accrued from Google Scholar &#8211; as proxies for evaluating the quality and impact of scientific output for hiring and promotion purposes. Initiatives such as the 2012 San Francisco Declaration on Research Assessment<sup>2<\/sup> and the 2015 Leiden Manifesto<sup>3<\/sup>, despite their wide repercussion and the support of numerous research institutions and funding agencies worldwide, have actually changed little about the way science and scientists are evaluated. After all, bibliometric indices provide a simple (in many cases, simplistic) and convenient way to evaluate a large number of candidates, proposals or articles.<\/p>\n<p>The limitations of the Journal Impact Factor (JIF) and similar indicators of journals\u2019 performance in assessing individual articles and researchers are well known. According to Stephan, even funding agencies that do not specifically request the JIF in publication lists still use this indicator, along with citation counts and the h-index, to rank proposals. The researchers themselves contribute to this vicious circle. When asked to identify their most relevant publications, they usually select them based on citation indexes, rather than on their true academic significance or on whether they represent a particularly innovative discovery.<\/p>\n<p>The article mentions the large-scale use of citation-based indicators for career promotion and hiring. Besides Italy, the Czech Republic, Flanders (northern Belgium) and China, the authors cite the <em>Qualis<\/em> program of the Brazilian Ministry of Education, which uses the JIF to allocate research resources, a practice that particularly penalizes Brazil\u2019s journals. 
According to the authors, a notable exception is the United Kingdom\u2019s Research Excellence Framework, which explicitly recommends against using the JIF in evaluations.<\/p>\n<h3>Innovation requires time<\/h3>\n<p>Scientists are eager to make breakthrough discoveries, and in pursuit of them they are said to sometimes resort to unethical practices and to overestimate preliminary results. Stephan and her colleagues, however, believe that the excessive use of bibliometric indexes with short-term windows (2-3 years) may actually discourage the publication of innovative results. To test their hypothesis, the authors analyzed Web of Science citations to more than 660,000 articles published between 2001 and 2015, categorized as research with a high, moderate or no degree of innovation. As a proxy for the degree of innovation, the researchers examined the articles\u2019 reference lists in search of unusual patterns of combination. From this analysis, the authors concluded that highly innovative articles take longer to be cited than moderately innovative and non-innovative papers. Among the highly innovative articles, two types of behavior were observed: either they became highly cited &#8211; citations begin to increase after 3-4 years and keep growing up to 15 years after publication &#8211; or they were ignored in comparison with articles with no degree of innovation. It is important to note, however, that in the first three years after publication the probability of a highly innovative article being among the 1% most cited papers is lower than that of an article with no degree of innovation. This led the authors to conclude that the current system of research evaluation underestimates work that may prove to have high impact in the long term. It is also worth pointing out that the articles that proved to be of high impact over time were published in journals with lower JIFs. 
Thus, Stephan and colleagues<sup>1<\/sup> conclude that &#8220;the more we bind ourselves to quantitative short-term measures, the less likely we are to reward research with a high potential to shift the frontier &#8211; and those who do it\u201d.<\/p>\n<p>This observation, however, is not entirely unprecedented. In 2014, an article by John Ioannidis<sup>4<\/sup>, also published in Nature, investigated whether, in researchers\u2019 own view, their most cited work was their best work. The paper, the subject of a post on this blog<sup>5<\/sup>, actually raised more questions than answers, such as the difficulty of identifying an innovative article early on from bibliometric indicators with a 2-3 year window, or when it is cited by articles from other, less similar areas. One of the authors\u2019 conclusions at the time, however, was the need to resort to indexes other than citation-based metrics to complement the evaluation of science.<\/p>\n<h3>Recommendations to the scientific community<\/h3>\n<p>To encourage researchers to pursue more innovative areas of science, the whole scientific community must shift toward restricting the indiscriminate use of short-term bibliometric indicators.<\/p>\n<p><strong>Researchers<\/strong> \u2013 Restrict the use of the JIF and citation-based indexes in guiding the choice of topics and of where to submit articles. Do not include such indicators in CVs and grant proposals.<\/p>\n<p><strong>Funding agencies<\/strong> \u2013 Provide multiple ways to evaluate the publications of researchers and institutions. Exclude citation and JIF measures from grant proposals, and do not allow reviewers to discuss them. 
Include experts from other areas on review panels and periodically evaluate the performance of grant applicants using indexes with 5-10 year windows.<\/p>\n<p><strong>Peer reviewers<\/strong> \u2013 Attempt to evaluate the article or candidate leaving metrics aside, especially short-term ones.<\/p>\n<p><strong>Editors<\/strong> \u2013 Ignore the metrics commonly used to evaluate papers. Propose the use of metrics with longer time frames.<\/p>\n<p><strong>Universities<\/strong> \u2013 Make it standard practice for evaluation panel members to actually read the candidates\u2019 research rather than just their bibliometric indexes, as recommended by the UK\u2019s REF. When evaluating candidates, emphasize how researchers approach the questions they propose. In this sense, the remark by Ant\u00f4nio Augusto P. Videira, Professor of Philosophy of Science at UFRJ, applies<sup>6<\/sup>: &#8220;The fact that the use of an indicator makes one author or another eligible because he has published in a journal with a higher IF should be surprising, since more importance is given to where he published than to reading his work\u201d.<\/p>\n<p>The authors of the study believe that &#8220;if the academic community really wants to create more objective assessments, all of us &#8211; from early-career researchers to the heads of funding agencies \u2013 need to use qualitative and quantitative indicators responsibly [\u2026] in order to avoid indicators that penalize the types of researcher and project that have the greatest potential to push boundaries\u201d.<\/p>\n<h3>Notes<\/h3>\n<p>1. 
<span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&amp;rft.jtitle=Nature&amp;rft_id=info%3Adoi%2F10.1038%2F544411a&amp;rfr_id=info%3Asid%2Fresearchblogging.org&amp;rft.atitle=Reviewers+are+blinkered+by+bibliometrics&amp;rft.issn=0028-0836&amp;rft.date=2017&amp;rft.volume=544&amp;rft.issue=7651&amp;rft.spage=411&amp;rft.epage=412&amp;rft.artnum=http%3A%2F%2Fwww.nature.com%2Fdoifinder%2F10.1038%2F544411a&amp;rft.au=Stephan%2C+P.&amp;rft.au=Veugelers%2C+R.&amp;rft.au=Wang%2C+J.&amp;rfe_dat=bpr3.included=1;bpr3.tags=Research+%2F+Scholarship%2CEducation%2C+Ethics%2C+Funding%2C+Library+Science%2C+Policy%2C+Publishing%2C+Science+Communication%2C+Creative+Commons\">STEPHAN, P., VEUGELERS, R. and WANG, J. Reviewers are blinkered by bibliometrics. <em>Nature<\/em> [online]. 2017, vol. 544, no. 7651, pp. 411-412 [viewed 14 May 2017]. DOI: <a href=\"http:\/\/dx.doi.org\/10.1038\/544411a\" target=\"_blank\" rev=\"review\">10.1038\/544411a<\/a>. Available from: <a href=\"http:\/\/www.nature.com\/news\/reviewers-are-blinkered-by-bibliometrics-1.21877\" target=\"_blank\">http:\/\/www.nature.com\/news\/reviewers-are-blinkered-by-bibliometrics-1.21877<\/a><\/span><\/p>\n<p>2. The San Francisco Declaration on Research Assessment (DORA) [online]. San Francisco Declaration on Research Assessment (DORA) [viewed 14 May 2017]. Available from: <a href=\"http:\/\/www.ascb.org\/dora\/\" target=\"_blank\">http:\/\/www.ascb.org\/dora\/<\/a><\/p>\n<p>3. 
<span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&amp;rft.jtitle=Nature&amp;rft_id=info%3Adoi%2F10.1038%2F520429a&amp;rfr_id=info%3Asid%2Fresearchblogging.org&amp;rft.atitle=Bibliometrics%3A+The+Leiden+Manifesto+for+research+metrics&amp;rft.issn=0028-0836&amp;rft.date=2015&amp;rft.volume=520&amp;rft.issue=7548&amp;rft.spage=429&amp;rft.epage=431&amp;rft.artnum=http%3A%2F%2Fwww.nature.com%2Fdoifinder%2F10.1038%2F520429a&amp;rft.au=Hicks%2C+D.&amp;rft.au=Wouters%2C+P.&amp;rft.au=Waltman%2C+L.&amp;rft.au=de+Rijcke%2C+S.&amp;rft.au=Rafols%2C+I.&amp;rfe_dat=bpr3.included=1;bpr3.tags=Research+%2F+Scholarship%2CEducation%2C+Ethics%2C+Funding%2C+Library+Science%2C+Policy%2C+Publishing%2C+Science+Communication%2C+Creative+Commons\">HICKS, D., <em>et al<\/em>. Bibliometrics: The Leiden Manifesto for research metrics. <em>Nature<\/em> [online]. 2015, vol. 520, n\u00ba 7548, pp. 429-431 [viewed 14 May 2017]. DOI: <a href=\"http:\/\/dx.doi.org\/10.1038\/520429a\" target=\"_blank\" rev=\"review\">10.1038\/520429a<\/a>. Available from: <a href=\"http:\/\/www.nature.com\/news\/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351\" target=\"_blank\">http:\/\/www.nature.com\/news\/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351<\/a><\/span><\/p>\n<p>4. 
<span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&amp;rft.jtitle=Nature&amp;rft_id=info%3Adoi%2F10.1038%2F514561a&amp;rfr_id=info%3Asid%2Fresearchblogging.org&amp;rft.atitle=Bibliometrics%3A+Is+your+most+cited+work+your+best%3F&amp;rft.issn=0028-0836&amp;rft.date=2014&amp;rft.volume=514&amp;rft.issue=7524&amp;rft.spage=561&amp;rft.epage=562&amp;rft.artnum=http%3A%2F%2Fwww.nature.com%2Fdoifinder%2F10.1038%2F514561a&amp;rft.au=Ioannidis%2C+J.&amp;rft.au=Boyack%2C+K.&amp;rft.au=Small%2C+H.&amp;rft.au=Sorensen%2C+A.&amp;rft.au=Klavans%2C+R.&amp;rfe_dat=bpr3.included=1;bpr3.tags=Research+%2F+Scholarship%2CEducation%2C+Ethics%2C+Funding%2C+Library+Science%2C+Policy%2C+Publishing%2C+Science+Communication%2C+Creative+Commons\">IOANNIDIS, J. P. A., <em>et al<\/em>. Bibliometrics: Is your most cited work your best? <em>Nature<\/em> [online]. 2014, vol. 514, n\u00ba 7524, pp. 561-562 [viewed 14 May 2017]. DOI: <a href=\"http:\/\/dx.doi.org\/10.1038\/514561a\" target=\"_blank\" rev=\"review\">10.1038\/514561a<\/a>. Available from: <a href=\"http:\/\/www.nature.com\/news\/bibliometrics-is-your-most-cited-work-your-best-1.16217#assess\" target=\"_blank\">http:\/\/www.nature.com\/news\/bibliometrics-is-your-most-cited-work-your-best-1.16217#assess<\/a><\/span><\/p>\n<p>5. NASSI-CAL\u00d2, L. Paper investigates: is your most cited work your best work? [online]. <i>SciELO in Perspective<\/i>, 2014 [viewed 14 May 2017]. Available from: <a href=\"http:\/\/blog.scielo.org\/en\/2014\/11\/24\/paper-investigates-is-your-most-cited-work-your-best-work\/\" target=\"_blank\">http:\/\/blog.scielo.org\/en\/2014\/11\/24\/paper-investigates-is-your-most-cited-work-your-best-work\/ <\/a><\/p>\n<p>6. Videira A. A. P. Declara\u00e7\u00e3o recomenda eliminar o uso do Fator de Impacto na avalia\u00e7\u00e3o de pesquisa [online]. 
<em>Estudos de CTS \u2013 Estudos sociais e conceituais de ci\u00eancia, tecnologia e sociedade<\/em>, 2013 [viewed 14 May 2017]. Available from: <a href=\"http:\/\/estudosdects.wordpress.com\/2013\/07\/29\/declaracao-recomenda-eliminar-o-uso-do-fator-de-impacto-na-avaliacao-de-pesquisa\/\" target=\"_blank\">http:\/\/estudosdects.wordpress.com\/2013\/07\/29\/declaracao-recomenda-eliminar-o-uso-do-fator-de-impacto-na-avaliacao-de-pesquisa\/<\/a><\/p>\n<h3>References<\/h3>\n<p><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&amp;rft.jtitle=Nature&amp;rft_id=info%3Adoi%2F10.1038%2F520429a&amp;rfr_id=info%3Asid%2Fresearchblogging.org&amp;rft.atitle=Bibliometrics%3A+The+Leiden+Manifesto+for+research+metrics&amp;rft.issn=0028-0836&amp;rft.date=2015&amp;rft.volume=520&amp;rft.issue=7548&amp;rft.spage=429&amp;rft.epage=431&amp;rft.artnum=http%3A%2F%2Fwww.nature.com%2Fdoifinder%2F10.1038%2F520429a&amp;rft.au=Hicks%2C+D.&amp;rft.au=Wouters%2C+P.&amp;rft.au=Waltman%2C+L.&amp;rft.au=de+Rijcke%2C+S.&amp;rft.au=Rafols%2C+I.&amp;rfe_dat=bpr3.included=1;bpr3.tags=Research+%2F+Scholarship%2CEducation%2C+Ethics%2C+Funding%2C+Library+Science%2C+Policy%2C+Publishing%2C+Science+Communication%2C+Creative+Commons\">HICKS, D., <em>et al<\/em>. Bibliometrics: The Leiden Manifesto for research metrics. <em>Nature<\/em> [online]. 2015, vol. 520, n\u00ba 7548, pp. 429-431 [viewed 14 May 2017]. DOI: <a href=\"http:\/\/dx.doi.org\/10.1038\/520429a\" target=\"_blank\" rev=\"review\">10.1038\/520429a<\/a>. 
Available from: <a href=\"http:\/\/www.nature.com\/news\/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351\" target=\"_blank\">http:\/\/www.nature.com\/news\/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351<\/a><\/span><\/p>\n<p><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&amp;rft.jtitle=Nature&amp;rft_id=info%3Adoi%2F10.1038%2F514561a&amp;rfr_id=info%3Asid%2Fresearchblogging.org&amp;rft.atitle=Bibliometrics%3A+Is+your+most+cited+work+your+best%3F&amp;rft.issn=0028-0836&amp;rft.date=2014&amp;rft.volume=514&amp;rft.issue=7524&amp;rft.spage=561&amp;rft.epage=562&amp;rft.artnum=http%3A%2F%2Fwww.nature.com%2Fdoifinder%2F10.1038%2F514561a&amp;rft.au=Ioannidis%2C+J.&amp;rft.au=Boyack%2C+K.&amp;rft.au=Small%2C+H.&amp;rft.au=Sorensen%2C+A.&amp;rft.au=Klavans%2C+R.&amp;rfe_dat=bpr3.included=1;bpr3.tags=Research+%2F+Scholarship%2CEducation%2C+Ethics%2C+Funding%2C+Library+Science%2C+Policy%2C+Publishing%2C+Science+Communication%2C+Creative+Commons\">IOANNIDIS, J. P. A., <em>et al<\/em>. Bibliometrics: Is your most cited work your best? <em>Nature<\/em> [online]. 2014, vol. 514, n\u00ba 7524, pp. 561-562 [viewed 14 May 2017]. DOI: <a href=\"http:\/\/dx.doi.org\/10.1038\/514561a\" target=\"_blank\" rev=\"review\">10.1038\/514561a<\/a>. Available from: <a href=\"http:\/\/www.nature.com\/news\/bibliometrics-is-your-most-cited-work-your-best-1.16217#assess\" target=\"_blank\">http:\/\/www.nature.com\/news\/bibliometrics-is-your-most-cited-work-your-best-1.16217#assess<\/a><\/span><\/p>\n<p>NASSI-CAL\u00d2, L. Paper investigates: is your most cited work your best work? [online]. <i>SciELO in Perspective<\/i>, 2014 [viewed 14 May 2017]. 
Available from: <a href=\"http:\/\/blog.scielo.org\/en\/2014\/11\/24\/paper-investigates-is-your-most-cited-work-your-best-work\/\" target=\"_blank\">http:\/\/blog.scielo.org\/en\/2014\/11\/24\/paper-investigates-is-your-most-cited-work-your-best-work\/<\/a><\/p>\n<p><span class=\"Z3988\" title=\"ctx_ver=Z39.88-2004&amp;rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&amp;rft.jtitle=Nature&amp;rft_id=info%3Adoi%2F10.1038%2F544411a&amp;rfr_id=info%3Asid%2Fresearchblogging.org&amp;rft.atitle=Reviewers+are+blinkered+by+bibliometrics&amp;rft.issn=0028-0836&amp;rft.date=2017&amp;rft.volume=544&amp;rft.issue=7651&amp;rft.spage=411&amp;rft.epage=412&amp;rft.artnum=http%3A%2F%2Fwww.nature.com%2Fdoifinder%2F10.1038%2F544411a&amp;rft.au=Stephan%2C+P.&amp;rft.au=Veugelers%2C+R.&amp;rft.au=Wang%2C+J.&amp;rfe_dat=bpr3.included=1;bpr3.tags=Research+%2F+Scholarship%2CEducation%2C+Ethics%2C+Funding%2C+Library+Science%2C+Policy%2C+Publishing%2C+Science+Communication%2C+Creative+Commons\">STEPHAN, P., VEUGELERS, R. and WANG, J. Reviewers are blinkered by bibliometrics. <em>Nature<\/em> [online]. 2017, vol. 544, no. 7651, pp. 411-412 [viewed 14 May 2017]. DOI: <a href=\"http:\/\/dx.doi.org\/10.1038\/544411a\" target=\"_blank\" rev=\"review\">10.1038\/544411a<\/a>. Available from: <a href=\"http:\/\/www.nature.com\/news\/reviewers-are-blinkered-by-bibliometrics-1.21877\" target=\"_blank\">http:\/\/www.nature.com\/news\/reviewers-are-blinkered-by-bibliometrics-1.21877<\/a><\/span><\/p>\n<p>The San Francisco Declaration on Research Assessment (DORA) [online]. San Francisco Declaration on Research Assessment (DORA) [viewed 14 May 2017]. Available from: <a href=\"http:\/\/www.ascb.org\/dora\/\" target=\"_blank\">http:\/\/www.ascb.org\/dora\/<\/a><\/p>\n<p>Videira A. A. P. Declara\u00e7\u00e3o recomenda eliminar o uso do Fator de Impacto na avalia\u00e7\u00e3o de pesquisa [online]. 
<em>Estudos de CTS \u2013 Estudos sociais e conceituais de ci\u00eancia, tecnologia e sociedade<\/em>, 2013 [viewed 14 May 2017]. Available from: <a href=\"http:\/\/estudosdects.wordpress.com\/2013\/07\/29\/declaracao-recomenda-eliminar-o-uso-do-fator-de-impacto-na-avaliacao-de-pesquisa\/\" target=\"_blank\">http:\/\/estudosdects.wordpress.com\/2013\/07\/29\/declaracao-recomenda-eliminar-o-uso-do-fator-de-impacto-na-avaliacao-de-pesquisa\/<\/a><\/p>\n<p>&nbsp;<\/p>\n<h3><a href=\"http:\/\/blog.scielo.org\/wp-content\/uploads\/2015\/10\/Lilian.jpg\" target=\"_blank\"><img loading=\"lazy\" decoding=\"async\" class=\"alignright wp-image-907\" src=\"http:\/\/blog.scielo.org\/wp-content\/uploads\/2015\/10\/Lilian.jpg\" alt=\"lilian\" width=\"180\" height=\"163\" \/><\/a>About\u00a0Lilian Nassi-Cal\u00f2<\/h3>\n<p>Lilian Nassi-Cal\u00f2 studied chemistry at <em>Instituto de Qu\u00edmica <\/em>\u2013 USP, holds a doctorate in Biochemistry by the same institution and a post-doctorate as an Alexander von Humboldt fellow in Wuerzburg, Germany. After her studies, she was a professor and researcher at IQ-USP. She also worked as an industrial chemist and presently she is Coordinator of Scientific Communication at BIREME\/PAHO\/WHO and a collaborator of SciELO.<\/p>\n<p>&nbsp;<\/p>\n<p>Translated from the\u00a0original in <a href=\"http:\/\/blog.scielo.org\/blog\/2017\/06\/01\/a-miopia-dos-indicadores-bibliometricos\/\" target=\"_blank\">Portuguese<\/a>\u00a0by Lilian Nassi-Cal\u00f2.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The use of bibliometric indicators in science evaluation is a ubiquitous practice, despite the fact that there is no unequivocal relationship between citations and scientific quality, impact or merit. A recent study showed that the indiscriminate use of these indicators may hinder the publication of innovative research results, delaying the development of science. 
<span class=\"ellipsis\">&hellip;<\/span> <span class=\"more-link-wrap\"><a href=\"https:\/\/blog.scielo.org\/en\/2017\/06\/01\/the-myopia-of-bibliometric-indicators\/\" class=\"more-link\"><span>Read More &rarr;<\/span><\/a><\/span><\/p>\n","protected":false},"author":22,"featured_media":2553,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":"","_links_to":"","_links_to_target":""},"categories":[3],"tags":[29,10,31,32,7],"class_list":["post-2551","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-analysis","tag-bibliometrics","tag-impact-factor","tag-research-evaluation","tag-research-policy","tag-scholarly-communication"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/posts\/2551","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/users\/22"}],"replies":[{"embeddable":true,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/comments?post=2551"}],"version-history":[{"count":4,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/posts\/2551\/revisions"}],"predecessor-version":[{"id":2557,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/posts\/2551\/revisions\/2557"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/media\/2553"}],"wp:attachment":[{"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/media?parent=2551"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/categories?post=2551"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blog.scielo.org\/en\/wp-json\/wp\/v2\/tags?post=2551"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templat
ed":true}]}}