By Sonia MR Vasconcelos*,**, Patrick Menezes*, Mariana D Ribeiro* and Elizabeth Heitman***
Research integrity and scientific rigor
The discussion of research integrity and scientific rigor has gained increasing attention in recent decades, especially with regard to scientific output and its associated challenges. Among these challenges are those of an ethical and methodological nature that bear on the reliability of results.
In the context of scholarly publishing, transformations in editorial policies have been remarkable, especially over the last fifteen years. In 2009, publishers with broad international reach across the most diverse areas, such as the Nature Publishing Group (NPG), made their policies on authorship responsibility more explicit. According to an editorial published by NPG in 2009:
Before submitting the paper, at least one senior member from each collaborating group must take responsibility for their group’s contribution. Three major responsibilities are covered: preservation of the original data on which the paper is based, verification that the figures and conclusions accurately reflect the data collected and that manipulations to images are in accordance with Nature journal guidelines (http://tinyurl.com/cmmrp7), and minimization of obstacles to sharing materials, data and algorithms through appropriate planning.1
This initiative resulted from a process of discussion at the publisher, following a consultation with authors whose opinions on stricter scientific authorship policies were not consensual. About a year earlier, in 2008, NPG had engaged in another initiative focused on publication ethics, aimed at verifying the originality of manuscripts with the support of the CrossCheck database. Among the scientific publishers involved were the Association for Computing Machinery, the American Society of Neuroradiology, the BMJ Publishing Group, Elsevier, the Institute of Electrical & Electronics Engineers, and NPG. Since then, attention to plagiarism detection systems, both commercial and free, has gradually expanded in the publishing context. These and other actions in the editorial arena stem from motivations that include concerns of an ethical and scientific nature, raised in part by researchers themselves. Such concerns are consistent with notions of scientific rigor, both in communicating results and in peer review.
But when we talk about scientific rigor, what, objectively, are we talking about? Casadevall and Fang (2016),2 among the researchers who have contributed most to the discussion on scientific rigor and integrity, cite the Online Etymology Dictionary in noting that the word “rigor” derives from the Old French “rigueur”, meaning strength and hardness.3 A prevalent perspective in the natural and exact sciences ties the idea of rigor to “solid work”, conveying a sense of reliable information. However, as Casadevall and Fang (2016)2 argue, the words “exact” and “careful”, which make up this solidity, do not capture what it means to practice rigorous science: rigor goes beyond accuracy and care in experimental design.
Casadevall and Fang (2016)2 propose “a Pentateuch” to represent scientific rigor, whose items, briefly presented, are the following: (i) redundancy in experimental design (use of controls, replicates, etc.); (ii) sound statistical analysis (observing effect size, for example); (iii) recognition of error (checking sources of error, such as contamination of reagents); (iv) precaution against logical traps (especially in the interpretation of data); (v) intellectual honesty (an ethical posture and good research practices, such as verification of results by other researchers, whatever the outcome). For Casadevall and Fang (2016), “scientific rigor is multifaceted. No single criterion can define it. Even the most careful experimental approach is not rigorous if the interpretation relies on a logical fallacy or is intellectually dishonest”.2
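To make item (ii) more concrete, the short sketch below computes one widely used effect-size measure, Cohen's d, for two groups. It is only an illustration of the kind of check Casadevall and Fang allude to, not a procedure from their article, and the sample values are invented.

```python
# Illustrative only: Cohen's d, a standardized mean difference, as one
# example of the effect-size checks named in item (ii). Data are invented.
import statistics

def cohens_d(a, b):
    """Standardized mean difference between two independent samples."""
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    # Pooled standard deviation (weights each sample variance by its df)
    pooled_sd = (((len(a) - 1) * var_a + (len(b) - 1) * var_b)
                 / (len(a) + len(b) - 2)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

treated = [5.1, 4.8, 5.6, 5.3, 4.9]  # hypothetical measurements
control = [4.2, 4.5, 4.1, 4.7, 4.3]
print(f"Cohen's d = {cohens_d(treated, control):.2f}")
```

Reporting such a standardized difference alongside a p-value is one way a study signals attention to effect size rather than to statistical significance alone.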
In this context, the research report has received particular attention. Researchers registered with the Swiss Federal Food Safety and Veterinary Office (FSVO) participated in a survey (n = 530) exploring their perceptions of scientific rigor in animal research. The study explored their relationship with rigor by examining how respondents managed the risk of research bias, including in the reporting of results. As Reichlin et al. (2016) describe, “participants performed rather poorly when asked to choose effective over ineffective measures against six different biases.”4 The authors draw attention to the need for more reliable indicators of scientific rigor in animal research.
This relationship between scientific rigor and the control of bias at the various stages of research, including the report, harmonizes with the perspective of one of the largest funders of biomedical research, the US National Institutes of Health (NIH). In the NIH's definition:
… scientific rigor is the strict application of the scientific method to ensure robust and unbiased experimental design, methodology, analysis, interpretation and reporting of results. This includes full transparency in reporting experimental details so that others may reproduce and extend the findings.5
In 2014, after a series of biomedical studies were identified as non-reproducible, the NIH announced policies to respond to this apparent crisis, seeking “to restore the self-correcting nature of preclinical research.”6 The agency noted that although discussions on reproducibility focus on preclinical research, “the basic principles and areas of focus apply to the full spectrum of biomedical research – from basic to clinical.”5 The main policy was directed toward greater promotion of rigor and transparency at each phase of research: as of January 25, 2016, each application to the NIH would have to clarify more objectively how researchers would promote greater rigor and transparency in the proposed research. In revising its review criteria, the NIH listed four aspects that would play a relevant role in promoting greater transparency and rigor: the scientific premise, rigor in experimental design, relevant biological variables (e.g., sex), and the authentication of chemical and biological resources considered central to the research.
In another document, entitled Advanced Notice of Coming Requirements for Formal Instruction in Rigorous Experimental Design and Transparency to Enhance Reproducibility, the NIH would also require, from 2017, “… formal instruction in scientific rigor and transparency to enhance reproducibility for all individuals supported by institutional training grants, institutional career development awards, or individual fellowships.”7
Scientific rigor in qualitative research
In the field of qualitative research, the discussion on rigor has also expanded in recent years, in an arena of disputes over the concept itself. A search (on December 28, 2020) of the Scopus database for the terms “rigor” (including the British spelling “rigour”) and “qualitative research” in article titles, abstracts, and keywords (1999-2019) shows growth in the number of related publications, especially from 2012 onward. Of the 851 documents in the period, 90% were “articles” (76.3%) or “reviews” (13.4%). Across the full set of documents, the largest shares were associated with Medicine (28%), Social Sciences (23%), and Nursing (16%).
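For readers who wish to reproduce or update this kind of bibliometric snapshot, the sketch below shows one plausible way to run such a query programmatically. It uses the Elsevier Scopus Search API; the endpoint, field codes, and response fields are our assumptions based on Elsevier's public documentation (a valid API key is required), not the exact procedure used for the search reported above.

```python
# A minimal sketch (not the exact search reported above): counting Scopus
# documents that mention "rigor"/"rigour" and "qualitative research" in
# titles, abstracts, or keywords, 1999-2019. Endpoint, field codes, and
# JSON structure are assumptions to check against Elsevier's documentation.
import requests

QUERY = (
    'TITLE-ABS-KEY(rigor OR rigour) '
    'AND TITLE-ABS-KEY("qualitative research") '
    'AND PUBYEAR > 1998 AND PUBYEAR < 2020'
)

response = requests.get(
    "https://api.elsevier.com/content/search/scopus",
    params={"query": QUERY, "count": 1},  # only the total count is needed
    headers={"X-ELS-APIKey": "YOUR_API_KEY"},  # placeholder credential
)
response.raise_for_status()
total = response.json()["search-results"]["opensearch:totalResults"]
print(f"Documents found: {total}")
```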
Among the articles that address scientific rigor in qualitative research more specifically, Barbour's (2001)8 is one of the most cited. In arguing about how to improve rigor in qualitative research, Barbour (2001)8 highlighted a perception that she considers common among researchers and mistaken: that adherence to purely technical procedures in conducting and reporting qualitative research confers rigor on a study. Barbour (2001)8 suggests that a kind of blind spot is inherent in this understanding of scientific rigor and draws attention to the role of research design and data analysis:
Reducing qualitative research to a list of technical procedures (such as purposive sampling, grounded theory, multiple coding, triangulation, and respondent validation) is overly prescriptive and results in “the tail wagging the dog.” None of these “technical fixes” in itself confers rigour; they can strengthen the rigour of qualitative research only if embedded in a broader understanding of qualitative research design and data analysis.8
Some 20 years after Barbour's observation,8 academic concern with the scientific rigor and credibility of qualitative research is evident in the recommendations and checklists developed and/or explored over that period. For Johnson, Adkins and Chauvin (2020),9 one of the features that directly affects research credibility is an honest and transparent account of how researchers dealt with bias and other possible confounding factors during the conduct of the research.
Regarding the funding of projects based on qualitative research, this attention to scientific rigor is also of strategic interest to agencies such as the National Science Foundation (NSF) in the United States. In 2003, NSF funded a workshop entitled Scientific Foundations of Qualitative Research.10 The report published by the organizers highlights the importance of clearer criteria, including rigor and a systematic approach, for evaluating qualitative research in project proposals. The report adds that “ad hoc and casual approaches to data collection and analysis should not be counted as qualitative research strategies.”10
In 2005, NSF also funded the Workshop on Interdisciplinary Standards for Systematic Qualitative Research,11 which explored approaches in cultural anthropology, law, political science, and sociology. According to the associated report,
… twenty-four scholars from the four disciplines were charged to (1) articulate the standards used in their particular field to ensure rigor across the range of qualitative methodological approaches;* [*Methodological approaches include ethnography, historical and comparative analysis, textual and discourse analysis, focus groups, archival and oral history, observational studies, interpretation of images and cultural materials, and unstructured and semi-structured interviews.] (2) identify common criteria shared across the four disciplines for designing and evaluating research proposals and fostering multidisciplinary collaborations; and (3) develop an agenda for strengthening the tools, training, data, research design, and infrastructure for research using qualitative approaches.11
The group of experts identified a consensus among the four disciplines, indicating ten criteria for designing and evaluating good-quality qualitative research. These include a strong grounding in the relevant academic literature; a detailed description of data-collection methods and analysis techniques; and a clear connection between theory and data. The report11 recommends that NSF contribute to improving qualitative research, not only by increasing investments that could include “education, training and infrastructure”, but also by publicizing its commitment to supporting “high-level” qualitative research. It also recommends funding to train students through the conduct of research of this kind, not only established researchers.
In 2018, NSF funded the Workshop Intention and Implementation: Transparency in Qualitative Social Science, focusing on anthropology, sociology, and political science.12 One of the questions put to participating researchers was: “How do the various contexts and constraints of your work, including specific data collection/creation activities but also activities for which the creation of research data is not a defined goal, shape your understanding of ‘data sharing’ or ‘data access’ as a means of achieving ‘research transparency’?”12
This provocation is consistent with the challenges involved in improving criteria for scientific rigor and research quality in areas that generate qualitative data. Combined with the ongoing discussion of these criteria, initiatives such as those supported by NSF suggest a new agenda for judging projects based on qualitative research. This agenda takes clearer shape when we consider that the research environment is undergoing a major transformation, one that invites reinterpretation of, for example, what gives research quality and reliability across the most varied areas and how this relates to the new demands of open science.
Scientific rigor, qualitative research, and open science
Among the drivers of open science is a transformation in access to scientific tools, accompanied by multiple conceptions of what open science means. As described by Sarita Albagli, the meaning of open science involves “… greater porosity and interlocution of science with other social segments and other types of knowledge, in the broad spectrum of possibilities and spaces for the production of knowledge.”13 According to the Livro Verde – Ciência aberta e dados abertos: mapeamento e análise de políticas, infraestruturas e estratégias em perspectiva nacional e internacional [Green Book – Open science and open data: mapping and analysis of policies, infrastructures and strategies in national and international perspective], published in 2017 by the Fundação Oswaldo Cruz (Fiocruz),
… Open Science encompasses different pillars, among which are open access to publications and the opening of scientific data, with the main benefits: capacity for research reproducibility, greater transparency of public funding, increased speed of information circulation as an input for science advancement and reuse of data in new research, resulting in higher quality science, with faster progress and in line with the needs of societies.14
This ongoing transformation requires a new approach to the scientific process and the dissemination of generated knowledge, strengthening the relationship between science and society. At the heart of this approach is the removal of barriers to sharing resources, methods and tools at any stage of the research process. In this sense,
open access to publications, open research data, open source software, open collaboration, open peer review, open notebooks, open educational resources, open monographs, citizen science, and research crowdfunding, fall into the boundaries of Open Science.15
As described in 2018 in the document The state of open science in the Nordic countries,
Open access is one of the means of achieving open science…. Research data refers to information, in particular facts or numbers, collected to be examined and considered as a basis for reasoning, discussion, or calculation. In a research context, examples of data include statistics, results of experiments, measurements, observations resulting from fieldwork, survey results, interview recordings.16
Alongside advances in the development of open science policies, there are concerns about how this emerging culture relates to the tradition of scientific assessment. In 2016, for example, Portugal formed an Interministerial Working Group on the National Open Science Policy (GT-PNCA, in the Portuguese acronym). The resulting report17 made recommendations to encourage good scientific evaluation practices. This open science policy is presented as a priority by the Government and the Ministry of Science, Technology and Higher Education (MCTES). Among its objectives is reinforcing initiatives to expand public access to the results of scientific research, resting on the following main pillars:
i) transparency in practices, methodology, observation and data collection, ii) public availability and reuse of scientific data, public access and transparency in scientific communication, iii) the use of web-based tools in order to facilitate scientific collaboration.18
In 2018, the European Commission, in collaboration with funding agencies and scientific councils from 11 European countries, took the radical step of launching an initiative aimed at “making total and immediate open access a reality.”19 Under this plan, only studies published in open access would be funded with public resources.
In response to the open access plan, more explicit criteria for quality and scientific rigor are among the demands that journals and platforms must meet. In Latin America, this concern is expressed, for example, through SciELO, whose SciELO 20 Years Conference, held in September 2018 to commemorate its 20th anniversary, focused on open science and its relationship with the dynamics of scientific communication. Such initiatives are consistent with other actions in the research field, one of which was recently implemented by the American Psychological Association (APA): in 2018, APA released the document Journal Article Reporting Standards for Qualitative Primary, Qualitative Meta-Analytic, and Mixed Methods Research in Psychology: The APA Publications and Communications Board Task Force Report.20
The document describes typical characteristics of qualitative research:
Qualitative researchers report their research to reflect the situatedness of their research in a number of ways. First… the context of the investigators themselves is an issue. Researchers’ relationship to the study topic, with their participants, and to related ideological commitments may all have bearing upon the inquiry process. Second, qualitative researchers describe the context within which a phenomenon or study topic is being construed as well… Third, they also describe the contexts of their data sources… In addition to describing the phenomena, data sources, and investigators in terms of their location, era, and time periods, qualitative researchers seek to situate these factors in relation to relevant social dynamics.20
One of the aspects that Levitt et al. (2018)20 make explicit is that “qualitative researchers have long sought language to describe rigor in their approach.” The authors list the details that should be reported in an article based on qualitative research in order to maximize the accuracy and ethics of the report. The document explores different contexts of appropriation of qualitative research and provides reporting guidelines for journals: the Journal Article Reporting Standards for Qualitative Research (JARS-Qual), the Qualitative Meta-Analysis Reporting Standards (QMARS), and the Mixed Methods Article Reporting Standards (MMARS).20 In the methodological-integrity section of JARS-Qual, the research report is expected to include items such as
Demonstrate that the claims made from the analysis are warranted and have produced findings with methodological integrity…. Present findings in a coherent manner that makes sense of contradictions or disconfirming evidence in the data (e.g., reconcile discrepancies, describe why a conflict might exist in the findings).20
This expectation of making sense of “contradictions or disconfirming evidence” is a perspective that finds support in the very concept of scientific integrity articulated by Richard Feynman some decades ago (Feynman, 1974):
… it’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.21
A previous rigor: beyond conducting and reporting research
As indicated in this piece, the current discussion on scientific rigor, articulated with perceptions of research integrity, shows a marked concern with honesty and transparency in scientific reporting. However, one aspect that has received less attention is the importance of the initial research question and of the assumptions that inform the methodology used.
In qualitative research, this issue is particularly relevant when the objective of the study is, for example, to inform public policy. Well-defined criteria deserve special attention in the initial project proposal, not only in the reporting and review of the research, whether in progress or already completed. As Fanelli (2018)23 describes, “… no amount of pre-registration, transparency, and sharing of data and code can turn a badly conceived and badly designed study into a good one”. The author adds that “… by superficially complying to bureaucratic reproducibility standards, a flawed study might acquire undeserved legitimacy.” Although more focused on reproducibility, whose application to qualitative research is controversial,22 Fanelli's (2018)23 observation is a timely warning that the debate on rigor, which is intensifying within open science, should not be confined to the post-project stage.
The literature describes recurring criticisms of the reliability of qualitative research, including claims of limited rigor and/or limited methodological clarity. In this vein, Moravcsik (2014)24 argues, from the perspective of political science, for the need for a revolution in qualitative research. He cites an initiative by the American Political Science Association (APSA), which in 2013 established recommendations for greater transparency in the conduct and reporting of quantitative and qualitative research. Moravcsik (2014)24 considers that the results of social scientific research are constantly confronted with the concern that “… the measures, cases and sources that an author has selected reveal only a subset of the data that could be relevant to the research question. This raises the danger of selection bias…”24
A researcher's confirmation bias, a type of cognitive bias that, among other effects, can lead one to see what one believes or even what one would like to see in the research, may affect the selection and interpretation of evidence. Expanding the space for discussing the influence of this type of bias on methodological choices and on the evaluation of results is not an issue specific to qualitative research. It nevertheless deserves full attention in this field at a moment when reflections on scientific rigor in academia are being enriched.
In this changing scenario, open science increases our possibilities for deepening the understanding of factors that can skew perceptions and/or interpretations of a given social phenomenon, and even decision-making, based on qualitative research. Increasingly, qualitative research is being taken up by researchers in the most varied areas, not only those working in the humanities and social sciences. We therefore believe that stimulating discussion about scientific rigor in light of these different appropriations and research traditions becomes increasingly necessary. This discussion involves the integrity and quality of qualitative research not only in the scope of publications, but also in the proposal and evaluation of research projects.
Notes
* Programa de Educação, Gestão e Difusão em Biociências, Instituto de Bioquímica Médica Leopoldo de Meis (IBqM), Universidade Federal do Rio de Janeiro (UFRJ), Brazil;
** Programa de Mestrado Profissional em Educação, Gestão e Difusão em Biociências, IBqM/UFRJ;
*** Program in Ethics in Science and Medicine, University of Texas Southwestern, United States.
1. Authorship policies. Nature [online]. 2009, vol. 459, no. 7242, pp. 1078 [viewed 05 January 2021]. https://doi.org/10.1038/4581078a. Available from: https://www.nature.com/articles/4581078a
2. CASADEVALL, A. et al. Rigorous Science: A How-To Guide. mBio [online]. 2016, vol. 07, no. 06, e01902-16 [viewed 05 January 2021]. https://doi.org/10.1128/mbio.01902-16. Available from: https://mbio.asm.org/content/7/6/e01902-16
3. HARPER, D. “Rigor.” [online]. Online etymology dictionary. 2016 [viewed 05 January 2021]. Available from: http://www.etymonline.com/index.php?term=rigor
4. REICHLIN, T. S. et al. The researchers’ view of scientific rigor—Survey on the conduct and reporting of in vivo research. PLOS One [online]. 2016, vol. 11, no. 12 [viewed 05 January 2021]. https://doi.org/10.1371/journal.pone.0165999. Available from: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165999
5. National Institutes of Health (NIH). Enhancing reproducibility through rigor and transparency [online]. NOT-OD-15-103. 2015 [viewed 05 January 2021]. Available from: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-15-103.html
6. COLLINS, F. S. et al. Policy: NIH plans to enhance reproducibility. Nature [online]. 2014, vol. 505, no. 7485, pp. 612-613 [viewed 05 January 2021]. https://doi.org/10.1038/505612a. Available from: https://www.nature.com/news/policy-nih-plans-to-enhance-reproducibility-1.14586
7. National Institutes of Health (NIH). Advanced notice of coming requirements for formal instruction in rigorous experimental design and transparency to enhance reproducibility: NIH and AHRQ institutional training grants, institutional career development awards, and individual fellowships [online]. NOT-OD-16-034. 2015 [viewed 05 January 2021]. Available from: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-16-034.html
8. BARBOUR, R. S. Checklists for improving rigour in qualitative research: A case of the tail wagging the dog? British Medical Journal [online]. 2001, vol. 322, no. 7294 [viewed 05 January 2021]. https://doi.org/10.1136/bmj.322.7294.1115. Available from: https://www.bmj.com/content/322/7294/1115
9. JOHNSON, J. L. et al. A Review of the Quality Indicators of Rigor in Qualitative Research. American Journal of Pharmaceutical Education [online]. 2020, vol. 84, no. 01, pp. 7120 [viewed 05 January 2021]. https://doi.org/10.5688/ajpe7120. Available from: https://www.ajpe.org/content/84/1/7120
10. RAGIN, C. C. Workshop Scientific Foundations of Qualitative Research [online]. NSF 04-219. 2004 [viewed 05 January 2021]. Available from: https://www.nsf.gov/pubs/2004/nsf04219/nsf04219.pdf
11. LAMONT, M. Workshop on Interdisciplinary Standards for Systematic Qualitative Research [online]. ISSQR_rpt. 2008 [viewed 05 January 2021]. Available from: https://www.nsf.gov/sbe/ses/soc/ISSQR_rpt.pdf
12. PLEMMONS, D. Transparency and Qualitative Research Workshop [online]. Open Science Framework. 2018 [viewed 05 January 2021]. Available from: https://osf.io/fshux/
13. ALBAGLI, S. et al. Ciência aberta, questões abertas. Brasília: IBICT; Rio de Janeiro: UNIRIO; 2015. Available from: https://livroaberto.ibict.br/handle/1/1060
14. SANTOS, P. X. et al. Livro Verde – Ciência aberta e dados abertos: mapeamento e análise de políticas, infraestruturas e estratégias em perspectiva nacional e internacional. Rio de Janeiro: FIOCRUZ. 2017.
15. FUENTE, G. B. What is Open Science? Introduction [online]. Foster. [viewed 05 January 2021]. Available from: https://www.fosteropenscience.eu/content/what-open-science-introduction
16. JAUNSEN, A. The state of Open Science in the Nordic countries: Enabling data-driven science in the Nordic countries [online]. Nordic Co-operation. 2018 [viewed 05 January 2021]. Available from: http://norden.diva-portal.org/smash/record.jsf?pid=diva2%3A1257306&dswid=3471
17. RIBEIRO, L. et al. Política Nacional de Ciência Aberta em Portugal: recomendações do grupo de trabalho sobre Avaliação Científica = National Policy on Open Science in Portugal: recommendations of the working group on Scientific Assessment. Posted Content [online]. 2019 [viewed 05 January 2021]. https://doi.org/10.31229/osf.io/y4gq5. Available from: https://osf.io/preprints/lissa/y4gq5/
18. CARVALHO, J. et al. A Plataforma Integrada de Apoio à Publicação Científica. In: 10ª Conferência Luso-Brasileira Ciência Aberta. Manaus, 2019 [viewed 05 January 2021]. Available from: http://repositorium.sdum.uminho.pt/handle/1822/61749
19. cOAlition S: Making Open Access a reality by 2020 [online]. cOAlition-S. 2018 [viewed 05 January 2021]. Available from: https://www.coalition-s.org/coalition-s-launch/
20. LEVITT, H. M. et al. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. American Psychologist [online]. 2018, vol. 73, no. 01, pp. 26-46 [viewed 05 January 2021]. https://doi.org/10.1037/amp0000151. Available from: https://doi.apa.org/fulltext/2018-00750-003.html
21. FEYNMAN, R. P. Cargo Cult Science [online]. Cargo Cult Science. 1974 [viewed 05 January 2021]. Available from: http://calteches.library.caltech.edu/51/2/CargoCult.htm.
22. The discussion on reproducibility has gained broader shape and space, raising questions related to studies conducted in the human and social sciences, as reflected in Peels & Bouter, Palgrave Communications, 2018, https://www.nature.com/articles/s41599-018-0149-x; and Camerer, et al., Nature Human Behaviour, 2018, https://www.nature.com/articles/s41562-018-0399-z
23. FANELLI, D. “Reproducible” is not synonymous with “true”: a comment on the NAS report [online]. Daniele Fanelli. 2018 [viewed 05 January 2021]. Available from: http://danielefanelli.com/Blog.html
24. MORAVCSIK, A. Transparency: The revolution in qualitative research. PS: Political Science & Politics [online]. 2014, vol. 47, no. 01, pp. 48-53 [viewed 05 February 2021]. https://doi.org/10.1017/s1049096513001789. Available from: https://www.cambridge.org/core/journals/ps-political-science-and-politics/article/abs/transparency-the-revolution-in-qualitative-research/7FBEEFCF62C4CF45D91DE54AEE418DE2
References
ALBAGLI, S. et al. Ciência aberta, questões abertas. Brasília: IBICT; Rio de Janeiro: UNIRIO; 2015. Available from: https://livroaberto.ibict.br/handle/1/1060
ATLAS, M. C. Emerging ethical issues in instructions to authors of high-impact biomedical journals. Journal of the Medical Library Association [online]. 2003, vol. 91, no. 04, pp. 442-449 [viewed 05 January 2021]. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC209510/
Authorship policies. Nature [online]. 2009, vol. 459, no. 7242, pp. 1078 [viewed 05 January 2021]. https://doi.org/10.1038/4581078a. Available from: https://www.nature.com/articles/4581078a
BARBOUR, S. R. Checklists for improving rigour in qualitative research: A case of the tail wagging the dog? British Medical Journal [online]. 2001, vol. 322, no. 7294 [viewed 05 January 2021]. https://doi.org/10.1136/bmj.322.7294.1115. Available from: https://www.bmj.com/content/322/7294/1115
CARVALHO, J. et al. A Plataforma Integrada de Apoio à Publicação Científica. In: 10ª Conferência Luso-Brasileira Ciência Aberta. Manaus, 2019 [viewed 05 January 2021]. Available from: http://repositorium.sdum.uminho.pt/handle/1822/61749
CASADEVALL, A. et al. Rigorous Science: A How-To Guide. mBio [online]. 2016, vol. 07, no. 06, e01902-16 [viewed 05 January 2021]. https://doi.org/10.1128/mbio.01902-16. Available from: https://mbio.asm.org/content/7/6/e01902-16
cOAlition S: Making Open Access a reality by 2020 [online]. cOAlition-S. 2018 [viewed 05 January 2021]. Available from: https://www.coalition-s.org/coalition-s-launch/
COLLINS, F. S. et al. Policy: NIH plans to enhance reproducibility. Nature [online]. 2014, vol. 505, no. 7485, pp. 612-613 [viewed 05 January 2021]. https://doi.org/10.1038/505612a. Available from: https://www.nature.com/news/policy-nih-plans-to-enhance-reproducibility-1.14586
CrossCheck Plagiarism Screening Service launches today [online]. Crossref. 2008 [viewed 05 January 2021]. Available from: https://www.crossref.org/news/2008-06-19-crosscheck-plagiarism-screening-service-launches-today/
CSE’s White Paper on Promoting Integrity in Scientific Journal Publications [online]. Council of Science Editors (CSE). 2006 [viewed 05 January 2021]. Available from: https://cseditors.wpengine.com/wp-content/uploads/CSE-White-Paper_2018-update-050618.pdf
ESF-ORI First World Conference on Research Integrity: Fostering Responsible Research Lisbon [online]. World Conferences on Research Integrity. 2007 [viewed 05 January 2021]. Available from: https://wcrif.org/documents/297-2007-242fp/file
FANELLI, D. “Reproducible” is not synonymous with “true”: a comment on the NAS report [online]. Daniele Fanelli. 2018 [viewed 05 January 2021]. Available from: http://danielefanelli.com/Blog.html
FANG, F. C. et al. Misconduct accounts for the majority of retracted scientific publications. Proceedings of the National Academy of Sciences [online]. 2012, vol. 109, no. 42, pp. 17028-17033 [viewed 05 January 2021]. https://doi.org/10.1073/pnas.1212247109. Available from: https://www.pnas.org/content/109/42/17028
FEYNMAN, R. P. Cargo Cult Science [online]. Cargo Cult Science. 1974 [viewed 05 January 2021]. Available from: http://calteches.library.caltech.edu/51/2/CargoCult.htm
FUENTE, G. B. What is Open Science? Introduction [online]. Foster. [viewed 05 January 2021]. Available from: https://www.fosteropenscience.eu/content/what-open-science-introduction
GARNER, H. R. Combating unethical publications with plagiarism detection services. Urologic Oncology: Seminars and Original Investigations [online]. 2011, vol. 29, no. 01, pp. 95-99 [viewed 05 January 2021]. https://doi.org/10.1016/j.urolonc.2010.09.016.
GILES, J. Taking on the cheats. Nature [online]. 2005, vol. 435, no. 7040, pp. 258–259 [viewed 05 January 2021]. https://doi.org/10.1038/435258a. Available from: https://www.nature.com/articles/435258a
HARPER, D. “Rigor.” [online]. Online etymology dictionary. 2016 [viewed 05 January 2021]. Available from: http://www.etymonline.com/index.php?term=rigor
HAVEN, T. L. et al. Preregistering qualitative research. Accountability in Research [online]. 2019, vol. 26, no. 03, pp. 229-244 [viewed 05 January 2021]. https://doi.org/10.1080/08989621.2019.1580147. Available from: https://www.tandfonline.com/doi/full/10.1080/08989621.2019.1580147
IOANNIDIS, J. P. A. Why most published research findings are false. PLOS Medicine [online]. 2005, vol. 02, no. 08 [viewed 05 January 2021]. https://doi.org/10.1371/journal.pmed.0020124. Available from: https://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0020124
JAUNSEN, A. The state of Open Science in the Nordic countries: Enabling data-driven science in the Nordic countries [online]. Nordic Co-operation. 2018 [viewed 05 January 2021]. Available from: http://norden.diva-portal.org/smash/record.jsf?pid=diva2%3A1257306&dswid=3471
JOHNSON, J. L. et al. A Review of the Quality Indicators of Rigor in Qualitative Research. American Journal of Pharmaceutical Education [online]. 2020, vol. 84, no. 01, pp. 7120 [viewed 05 January 2021]. https://doi.org/10.5688/ajpe7120. Available from: https://www.ajpe.org/content/84/1/7120
KORTELING, J. E. et al. A neural network framework for cognitive bias. Frontiers in Psychology [online]. 2018, vol. 09 [viewed 05 January 2021]. https://doi.org/10.3389/fpsyg.2018.01561. Available from: https://www.frontiersin.org/articles/10.3389/fpsyg.2018.01561/full
LAMONT, M. Workshop on Interdisciplinary Standards for Systematic Qualitative Research [online]. ISSQR_rpt. 2008 [viewed 05 January 2021]. Available from: https://www.nsf.gov/sbe/ses/soc/ISSQR_rpt.pdf
LEVITT, H. M. et al. Journal article reporting standards for qualitative primary, qualitative meta-analytic, and mixed methods research in psychology: The APA Publications and Communications Board task force report. American Psychologist [online]. 2018, vol. 73, no. 01, pp. 26-46 [viewed 05 January 2021]. https://doi.org/10.1037/amp0000151. Available from: https://doi.apa.org/fulltext/2018-00750-003.html
LONG, T. C. et al. Responding to possible plagiarism. Science [online]. 2009, vol. 323, no. 5919, pp. 1293-1294 [viewed 05 January 2021]. https://doi.org/10.1126/science.1167408. Available from: https://science.sciencemag.org/content/323/5919/1293
MARSHALL, C. et al. Designing qualitative research. 6th ed. Thousand Oaks: SAGE Publishing, 2014.
Ministério da Ciência, Tecnologia e Ensino Superior. Política Nacional de Ciência Aberta [online]. 2016 [viewed 05 January 2021]. Available from: https://arquivo.pt/wayback/20181013141314/http://www.ciencia-aberta.pt/pnca
MORAVCSIK, A. Transparency: The revolution in qualitative research. PS: Political Science & Politics [online]. 2014, vol. 47, no. 01, pp. 48-53 [viewed 05 January 2021]. https://doi.org/10.1017/s104909651300178. Available from: https://www.cambridge.org/core/journals/ps-political-science-and-politics/article/abs/transparency-the-revolution-in-qualitative-research/7FBEEFCF62C4CF45D91DE54AEE418DE2
MORSE, J. et al. Verification strategies for establishing reliability and validity in qualitative research. International Journal of Qualitative Methods [online]. 2002, vol. 01, no. 02, pp. 13-22 [viewed 05 January 2021]. https://doi.org/10.1177/160940690200100202. Available from: https://journals.sagepub.com/doi/10.1177/160940690200100202
National Institutes of Health (NIH). Advanced notice of coming requirements for formal instruction in rigorous experimental design and transparency to enhance reproducibility: NIH and AHRQ institutional training grants, institutional career development awards, and individual fellowships [online]. NOT-OD-16-034. 2015 [viewed 05 January 2021]. Available from: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-16-034.html
National Institutes of Health (NIH). Enhancing reproducibility through rigor and transparency [online]. NOT-OD-15-103. 2015 [viewed 05 January 2021]. Available from: https://grants.nih.gov/grants/guide/notice-files/NOT-OD-15-103.html
NOBLE, H. et al. Issues of validity and reliability in qualitative research. Evidence-Based Nursing [online]. 2015, vol. 18, no. 02, pp. 34-35 [viewed 05 January 2021]. https://doi.org/10.1136/eb-2015-102054. Available from: https://ebn.bmj.com/content/18/2/34
O’BRIEN, B. C. et al. Standards for reporting qualitative research: a synthesis of recommendations. Academic Medicine [online]. 2014, vol. 89, no. 09, pp. 1245-1251 [viewed 05 January 2021]. https://doi.org/10.1097/acm.0000000000000388. Available from: https://journals.lww.com/academicmedicine/Fulltext/2014/09000/Standards_for_Reporting_Qualitative_Research__A.21.aspx
PLEMMONS, D. Transparency and Qualitative Research Workshop [online]. Open Science Framework. 2018 [viewed 05 January 2021]. Available from: https://osf.io/fshux/
RAGIN, C. C. Workshop Scientific Foundations of Qualitative Research [online]. NSF 04-2918. 2004 [viewed 05 January 2021]. Available from: https://www.nsf.gov/pubs/2004/nsf04219/nsf04219.pdf
REICHLIN, T. S. et al. The researchers’ view of scientific rigor—Survey on the conduct and reporting of in vivo research. PLOS One [online]. 2016, vol. 11, no. 12 [viewed 05 January 2021]. https://doi.org/10.1371/journal.pone.0165999. Available from: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165999
RIBEIRO, L. et al. Política Nacional de Ciência Aberta em Portugal: recomendações do grupo de trabalho sobre Avaliação Científica = National Policy on Open Science in Portugal: recommendations of the working group on Scientific Assessment. Posted Content [online]. 2019 [viewed 05 January 2021]. https://doi.org/10.31229/osf.io/y4gq5. Available from: https://osf.io/preprints/lissa/y4gq5/
ROLFE, G. Validity, trustworthiness and rigour: quality and the idea of qualitative research. Journal of Advanced Nursing [online]. 2006, vol. 53, no. 03, pp. 304–310 [viewed 05 January 2021]. https://doi.org/10.1111/j.1365-2648.2006.03727.x. Available from: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-2648.2006.03727.x
SANDELOWSKI, M. Rigor or rigor mortis: the problem of rigor in qualitative research revisited. Advances in Nursing Science [online]. 1993, vol. 16, no. 02, pp. 01-08 [viewed 05 January 2021]. https://doi.org/10.1097/00012272-199312000-00002. Available from: https://journals.lww.com/advancesinnursingscience/Abstract/1993/12000/Rigor_or_rigor_mortis__The_problem_of_rigor_in.2.aspx
SANTOS, P. X. et al. Livro Verde – Ciência aberta e dados abertos: mapeamento e análise de políticas, infraestruturas e estratégias em perspectiva nacional e internacional. Rio de Janeiro: FIOCRUZ. 2017.
Science as an open enterprise: Open data for open science [online]. The Royal Society. 2012 [viewed 05 January 2021]. Available from: https://royalsociety.org/~/media/policy/projects/sape/2012-06-20-saoe.pdf
Science Europe. Briefing Paper: Research Integrity: what it means, why it is important and how we might protect it [online]. Science Europe. 2015 [viewed 05 January 2021]. Available from: http://www.scienceeurope.org/wp-content/uploads/2015/12/Briefing_Paper_Research_Integrity_web.pdf
SILVA, F. C. C. et al. O ecossistema da Ciência Aberta. Transinformação [online]. 2019, vol. 31 [viewed 05 January 2021]. http://dx.doi.org/10.1590/2318-0889201931e190001. Available from: http://ref.scielo.org/zhy758
The truth will out. Nature Physics [online]. 2009, vol. 05, no. 07, pp. 449 [viewed 05 January 2021]. https://doi.org/10.1038/nphys1317. Available from: https://www.nature.com/articles/nphys1317
Who is accountable? Nature [online]. 2007, vol. 450, no. 7166 [viewed 05 January 2021]. https://doi.org/10.1038/450001a. Available from: https://www.nature.com/articles/450001a
WOELFLE, M. Open science is a research accelerator. Nature Chemistry [online]. 2011, vol. 03, no. 10, pp. 745–748 [viewed 05 January 2021]. https://doi.org/10.1038/nchem.1149. Available from: https://www.nature.com/articles/nchem.1149
External links
Grupo de Trabalho Interministerial da Política Nacional de Ciência Aberta (GT-PNCA): http://www.ciencia-aberta.pt/pnca
Online Etymology Dictionary: https://www.etymonline.com/
SciELO 20: https://www.scielo20.org/
Translated from the original in Portuguese by Lilian Nassi-Calò.