Tag: Research Evaluation

The time has come for the quality journals of Brazil

Policies, programs and research projects are expected to leverage Brazilian journals, which will contribute to widening the recognition and qualification of Brazilian science in its scientific and social dimensions, beyond the classic bibliometric ranking of journals that influences researchers, academic institutions, journals and funding agencies. Read More →

Collaboration and concerted action are key to making open data a reality [Originally published in LSE Impact of Social Sciences blog in October/2017]

The case for open data is increasingly inarguable. Improved data practice can help to address concerns about reproducibility and research integrity, reducing fraud and improving patient outcomes, for example. Research also shows good data practice can lead to improved productivity and increased citations. However, as Grace Baynes reports, recent survey data shows that while the research community recognises the value of open data, uptake remains slow, with good data practice and data sharing far from the status quo. To effect change, government, funders, institutions, publishers, and researchers themselves all have an important role to play. Read More →

Some ideas about Brazilian postgraduate education

The level of both master’s and doctoral courses in Brazil can be improved by introducing new disciplines that focus on the “formation” of the student rather than “information”. Read More →

A statistical fix for the replication crisis in science [Originally published in The Conversation in October/2017]

How should we evaluate initial claims of a scientific discovery? Here is a new idea: only P-values less than 0.005 should be considered statistically significant. P-values between 0.005 and 0.05 should merely be called suggestive, and statistical significance should not serve as a bright-line threshold for publication. Read More →
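As a rough illustration of this proposed rule, the sketch below labels a result using only the two thresholds quoted above; the function name and labels are illustrative and not taken from the post.

    def label_p_value(p):
        """Label a p-value under the proposed 0.005 / 0.05 thresholds.

        The threshold values follow the proposal described above; the
        function name and labels are illustrative, not from the post.
        """
        if not 0.0 <= p <= 1.0:
            raise ValueError("a p-value must lie between 0 and 1")
        if p < 0.005:
            return "statistically significant"
        if p < 0.05:
            return "suggestive"
        return "not significant"

    print(label_p_value(0.003))  # statistically significant
    print(label_p_value(0.02))   # suggestive
    print(label_p_value(0.20))   # not significant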

We have the technology to save peer review – now it is up to our communities to implement it [Originally published in LSE Impact of Social Sciences blog in September/2017]

There has been an explosion of innovation and experimentation in peer review over the last five years. While the ideal of peer review is still needed, it is its implementation, and the present lack of any viable alternative, that must be improved, building on three core traits that underpin any viable peer-review system: quality control and moderation, incentives for performance and engagement, and certification and reputation. Read More →

The Center for Open Science, an alternative to Elsevier, announces new preprint services [Originally published in Ithaka S+R blog in August/2017]

As commercial providers buy and build their way into the institutional repository and preprint marketplace, the not-for-profit Center for Open Science (COS) is offering an alternative by expanding the preprint services it powers through its platform. Read More →

Science is largely a collective enterprise. That collectivity needs to be recognized more explicitly

There is a disconnect between the collective nature of science, and the way the publishing and scholarly credit and reward systems focus very strongly on individual achievements. This results in problems that affect not only science, but society’s trust in science, and thus society as a whole. Read More →

What will peer review be like in 2030?

Although the scientific literature has always been reviewed before publication, current forms of peer review are only a few decades old and have drawn criticism and revealed limitations from the outset. Open review and preprint servers have emerged in recent years as possible solutions in a world of growing communication in scientific research. Open reviews, artificial intelligence, collaborative and “cloud” reviews… what will peer review be like in 2030? Read More →

Editorial ethics – other types of plagiarism… and counting

Plagiarism and fraud multiply in a variety of ways. Recently, two less common types have come up: accidental plagiarism and referee plagiarism. In any case, plagiarism is an ethical breach that erodes public confidence, and we must prevent it. Read More →

The editors’ role in peer review: how to identify bad referees

A theoretical peer-review model assesses the effects of referees’ unethical conduct on the approval and rejection of articles, and how journal editors can mitigate this behavior. What is at stake is the reliability, transparency and efficiency of pre-publication peer review. Read More →

SciELO 20 Years – September 26-28th, 2018

The celebration of SciELO’s 20 years will comprise a series of events related to the evaluation of SciELO’s performance as a differentiated model of open access publishing. The events will culminate in an international conference that is expected to be a landmark forum for discussing innovations in scholarly communication in line with open science practices. Read More →

The myopia of bibliometric indicators

The use of bibliometric indicators in science evaluation is a ubiquitous practice, despite the fact that there is no unequivocal relationship between citations and scientific quality, impact or merit. A recent study showed that the indiscriminate use of these indicators may hinder the publication of innovative research results, delaying the development of science. Read More →

Grant applications submitted to the NIH can cite preprints

The use of preprints as a means of accelerating research communication has become a frequent practice in many areas of knowledge, and also a way to improve peer review. The U.S. National Institutes of Health, a renowned research and development agency, recently announced that grant applications and reports may cite preprints, “to speed the dissemination and enhance the rigor of their work”. Read More →

Gender disparities in science persist despite significant advances

The participation of women as authors in academic publications has been increasing significantly worldwide and in all areas of knowledge, reaching 49% in Brazil and Portugal, followed by Australia (44%) and the European Union (41%). Gender equity in science, however, still has a long way to go, especially in editing and peer review roles. A study of more than 41,000 articles published between 2007 and 2015 shows that male editors, who are the majority, preferentially select referees of the same gender. Read More →

Openness is the only quality of an academic article that can be objectively measured

The quality of scientific research articles is a widespread concern in academic circles. The most widely used proxy is based on citation counts, not of the article itself, but on the average citations of articles appearing in the same journal during a given time window. This is known as the Journal Impact Factor, which may be objective within its own definition, but utterly lacks objectivity with regard to the scientific quality of individual articles. Only some technical qualities of an article can be assessed at the time of publication; significant among them is its openness, the degree to which the research results it describes can be immediately and universally shared. Read More →
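For readers unfamiliar with how this proxy is computed, the sketch below reproduces the usual two-year definition of the Journal Impact Factor; the function name and the figures at the end are hypothetical, used only to make the arithmetic concrete.

    def journal_impact_factor(citations_in_year, citable_items_published):
        """Two-year Journal Impact Factor, in its usual definition.

        citations_in_year: citations received in year Y by everything the
            journal published in years Y-1 and Y-2.
        citable_items_published: number of citable items (articles and
            reviews) the journal published in years Y-1 and Y-2.

        A simplified sketch: it measures the journal as a whole and says
        nothing about the quality of any individual article, which is the
        point made in the excerpt above.
        """
        if citable_items_published == 0:
            raise ValueError("no citable items published in the window")
        return citations_in_year / citable_items_published

    # Hypothetical figures, purely for illustration:
    # 900 citations in year Y to the 400 items published in Y-1 and Y-2
    print(journal_impact_factor(900, 400))  # 2.25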