Glossary of credibility terms


From TOP guidelines to ReproducibiliTea

  • Center for Open Science (COS)

    An organisation based in Charlottesville, Virginia, USA, whose mission is ‘to increase openness, integrity, and reproducibility of research.’ The COS is used by researchers from numerous disciplines, including astronomy, biology, chemistry, computer science, education research, engineering, neuroscience and psychology. COS provides tools and services to foster open and reproducible science.

  • Citizen Science

    Research that is conducted in collaboration with nonprofessional scientists (e.g. patients or members of the public). Non-experts can get involved at all stages of a research project, for example during hypothesis generation, study design, data collection and analysis. Zooniverse is one platform with examples of Citizen Science projects. Also see Patient and Public Involvement.

  • cOAlition S

    cOAlition S is an international consortium of research funders (including UKRI and the Wellcome Trust) who endorse and work together to implement Plan S.

  • CRediT Taxonomy

    The CRediT (Contributor Roles Taxonomy) is used by many journals to acknowledge the specific contributions of each author to a research article.

  • Credibility

    An umbrella term to describe research that is open, reproducible, replicable and reliable.

  • Data Dredging

    Another term to describe p-hacking. 

  • Data Fishing

    Another term to describe p-hacking. 

  • Exploratory reports

    Journal format for articles pertaining to research that is not hypothesis driven (cf. confirmatory studies). Research suitable for an exploratory report includes studies that are hypothesis generating or that address a relatively open research question.

    The journal Cortex launched the first example of the exploratory report publishing format: see McIntosh, R. D. (2017). Exploratory reports: A new article type for Cortex. Cortex, 96, A1-A4. doi:10.1016/j.cortex.2017.07.014 

  • DORA

    San Francisco Declaration on Research Assessment (DORA) is a set of recommendations developed during the Annual Meeting of The American Society for Cell Biology (ASCB) in San Francisco, 2012, in response to a ‘pressing need’ to improve the ways scientific research output is evaluated. 


    The DORA recommendations – aimed at funders, institutions, publishers, organisations and researchers – focus on discontinuing the use of journal-based metrics, such as the Journal Impact Factor, in assessing the quality and impact of research output or an individual scientist’s contributions. Notably, it is recommended that the Journal Impact Factor should not be used in hiring, promotion, or funding decisions.


    Signatories of DORA agree to support and adopt the declaration. The BNA signed the declaration in February 2019. 


  • Electronic lab notebook

    Electronic lab notebooks digitise study workflows. As an alternative to paper lab books, electronic lab notebooks integrate experimental notes (and in some cases experimental software) with data collection and data management. An introduction to electronic lab notebooks can be found here.

  • FAIR Guiding Principles

    The FAIR principles are a set of community-developed guidelines to ensure that data or any other digital object (software, models, workflows, documents, etc.) are Findable, Accessible, Interoperable and Reusable. The FAIR principles place a specific emphasis on enhancing the ability of machines to automatically find and use data or any other digital object, and therefore on the use of data at scale, in addition to supporting reuse by individual researchers.


    FAIR principles arose from discussions held at the ‘Jointly Designing a Data Fairport’ workshop in Leiden, Netherlands, in 2014 between academics and private stakeholders who were interested in ‘overcoming data discovery and reuse obstacles’. 

  • FOSTER portal

    E-learning website with more information on open science, and on the strategies and skills needed to implement it.

  • HARKing

    Stands for Hypothesising After the Results are Known (creating a hypothesis post hoc, based on observed results, and presenting it as if it had been made in advance). This is circular and therefore represents poor practice.

  • Leiden Manifesto

    The Leiden Manifesto for research metrics (Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429-431. doi:10.1038/520429a) outlines 10 principles that aim to stop the misuse of research metrics in the evaluation of research performance. More information can be found here.

  • Open access publishing

    Making research articles freely available to read online, without subscription or paywall barriers.

  • Open analytic code

    Openly sharing the programming code used to analyse data sets. In the spirit of reproducibility, sharing the code used to analyse your data allows others to reproduce your findings and/or double-check your analysis.

  • Open educational material

    Open sharing of teaching and training resources (for example lecture slides, tutorials and course materials) so that others are free to use, adapt and build on them.

  • Open experimental methods/materials

    Open sharing of research methods and experimental materials increases the likelihood that others can replicate your work. 

  • Open peer review

    An umbrella term for review methods that seek to increase the transparency of the review process. This can include Open identities, Open reports, Open participation, Open Pre-review, Open Final Version, Open Interaction and Open Platforms. For more information see: https://www.fosteropenscience.eu/node/2231; https://www.fosteropenscience.eu/learning/open-peer-review/#/id/5a17e150c2af651d1e3b1bce

  • Open Research/Science Working Groups

    Bodies of researchers on a mission to inform colleagues of the open science movement and promote cultural change.

  • Open Science

    Movement that calls for transparency at all stages of scientific discovery.  

  • Open Science badges

    Also referred to as TOP badges, Open Science Badges are an initiative from the Center for Open Science. They are awarded to researchers who openly share data, code or materials, or who preregister their work. These incentives encourage open science practice without penalising papers that do not (or cannot) meet such aims (for example, where data are personally identifiable or code is proprietary).


    An increasing number of journals offer Open Science Badges, including the neuroscience journals Journal of Neuroendocrinology, Journal of Neurochemistry and Journal of Neuroscience Research, as well as our own journal, Brain and Neuroscience Advances.

  • Open Science Framework

    OSF is a free, open-source, web-based project management tool offered by the Center for Open Science. The framework facilitates the management and sharing of research.

  • Open Source

    Online content, including documents/information and facilities/products (e.g., software), that is free for all to use and includes permission to use and modify the source code. For more information see here.

  • Patient and Public involvement

    Involvement of (and partnership with) patients and members of the public in research, especially at early stages of projects in priority setting, hypothesis generation and study design (this is different to, for example, recruitment of subjects/patients for a clinical trial). This is one aspect of Citizen Science.

  • Peer Reviewers’ Openness (PRO) Initiative

    The PRO initiative recommends that scientists should only peer-review scientific articles that adhere to minimal open science standards, e.g., articles for which the associated materials and data are shared. Researchers can show their support by joining the initiative.

  • P-Hacking

    (Also known as data-fishing and data-dredging) p-hacking is analysing data in multiple ways to reach significance (or find trends) and then only reporting the analyses that “worked.” This includes trying different statistical tests, investigating different variables or combinations of variables, removal of outliers, performing analyses with and without covariates, and adding or removing samples in order to reach significance. 
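
    A minimal simulation sketch of why this is a problem (illustrative only; the 20 candidate analyses and group sizes below are hypothetical, and Python with NumPy/SciPy is assumed):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n_studies = 1000     # simulated studies with NO true effect
        n_analyses = 20      # candidate analyses tried per study (hypothetical)
        hits = 0

        for _ in range(n_studies):
            # both groups are drawn from the same distribution, so any 'effect' is noise
            group_a = rng.normal(size=(n_analyses, 30))
            group_b = rng.normal(size=(n_analyses, 30))
            p_values = stats.ttest_ind(group_a, group_b, axis=1).pvalue
            if (p_values < 0.05).any():   # report only the analysis that "worked"
                hits += 1

        print(f"Studies with at least one 'significant' result: {hits / n_studies:.0%}")
        # roughly 64% (about 1 - 0.95**20), far above the nominal 5% false-positive rate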

  • Plan S

    A radical open-access initiative launched in September 2018. Plan S (the S can stand for ‘Science, Speed, Solution, Shock’) requires that recipients of grant funding from members of cOAlition S make the resulting publications immediately open access (without embargoes) from 2020 onwards: ‘No science should be locked behind a paywall.’

  • Poster Preregistration

    This is a form of Study Preregistration but in poster format, usually at a scientific conference. The emphasis is solely on the study design; the poster offers a way to run your design past your peers and get feedback before collecting data.

    The BNA trialled preregistration posters at the BNA2019 Festival with great success!


    Other organisations have introduced posters of a similar type, termed slightly differently - for example, 

    • 'In Progress' posters 
    • 'Planned Studies' posters, as used by CuttingEEG
    • 'Non-standard abstracts' as used by ENCP
  • Preprint archiving

    Preprint archiving allows researchers to upload the original/first version of their manuscript so that it is openly available to read before (and while) it is under peer review. Many journals permit preprint archiving: the SHERPA/RoMEO database provides information on the policies of individual journals. Many funders now accept preprints as part of the publication list when applying for grants or fellowships.

  • Registered Reports

    A relatively new publishing format in which articles are submitted before results are known (typically before data are collected), and peer-reviewed solely on the quality of the research question and study design. If accepted, the article is published regardless of the outcome of the experiments, provided the authors adhere to the research plan, or give a credible explanation for any divergence. This is a form of Study Preregistration but with peer-review and potential journal publication. 


    For more information see our credibility preregistration/registered reports toolkit.

  • Registered Report Funding Model

    A model in which journals and funders collaborate, so that a single review process can approve both a study’s funding and its in-principle acceptance for publication (as a Registered Report).

  • Research Excellence Framework (REF)

    The UK’s system for assessing the quality of research in UK higher education institutions. Guidelines for the forthcoming Research Excellence Framework (REF) include recognition of research practices that support open and reproducible science - ‘registered reports, pre-registration, publication of data sets, experimental materials, analytic code’ as well as ‘replication studies’ are all recognised in REF 2021.

  • Reliability

    Trustworthiness of experimental tools. In relation to research tools, reliability is inversely related to error of measurement. The more a measure varies from one measurement occasion to another, the less reliable it is. Reliability is also used when describing research findings, where it is a synonym for replicability.
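
    In classical test theory terms (one common formalisation, offered here as an illustration rather than taken from the definition above), reliability is the proportion of observed variance that reflects the true score rather than error: reliability = true-score variance / (true-score variance + error variance), so the larger the measurement error, the lower the reliability.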

  • Replicability

    Independent researcher repeats an experiment in a different context and gets consistent results. Relies on availability of transparent methods.

  • Replication Study

    A laboratory experiment is repeated by a different scientist/team of scientists using the same experimental methods. 

  • ReproducibiliTea

    A grassroots initiative of Open Science journal clubs, run by early-career researchers, to discuss diverse issues, papers and ideas about improving science, reproducibility and the Open Science movement.


    Started in early 2018 at the University of Oxford, ReproducibiliTea has now spread to 45 institutions in 16 different countries, spanning 3 different continents.

  • Reproducibility

    Independent researcher repeats analysis of data and gets the same results. Relies on availability of original data (and code). 

  • Reproducibility movement

    The reproducibility movement was born from concerns that there are many examples where published findings are not replicable. Lack of replicability may be a result of measurement error, artefacts in measurement methods, context-sensitivity of findings, accidental error, or deliberate massaging of data in order to achieve statistical significance.


    The reproducibility movement calls for more studies to be reproduced and replicated, to double-check our findings as we go along.  Whilst having one’s work reanalysed/replicated can be a daunting prospect, it is an important process which creates a better foundation for future work to be built upon.   

  • Reproducibility Study

    Reanalysing data using the same analytical methods/code. By re-running the analysis, the data are double-checked for human error and/or errors in the analysis and code.

  • Robust Research

    Term used to describe open research that has been conducted to a high standard using reliable tools. Research results should be reproducible and replicable.

  • Study Preregistration

    Also known as Study Registration. A system for openly planning work in advance. The idea is that if you clearly state your hypothesis and proposed methods, including power calculations and planned statistical analyses, upfront, then this will reduce the incidence of questionable research practices (e.g., p-hacking and HARKing). It also generates valuable feedback from the scientific community before the study has started, when it is most important. Currently, preregistration is most commonly performed by posting a research plan on an independent registry, e.g., the Open Science Framework. A minimal power-calculation sketch is given at the end of this entry.


    For more information see our credibility preregistration/registered reports toolkit.
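
    A minimal sketch of the kind of a priori power calculation that might accompany a preregistration (illustrative only; the effect size, alpha and power values are hypothetical, and the Python statsmodels library is assumed):

        from statsmodels.stats.power import TTestIndPower

        # hypothetical planning values for a two-group comparison
        effect_size = 0.5   # expected standardised difference (Cohen's d)
        alpha = 0.05        # significance threshold
        power = 0.80        # desired probability of detecting a true effect

        n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                                  alpha=alpha, power=power)
        print(f"Planned sample size: {n_per_group:.0f} participants per group")
        # about 64 per group under these assumptions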

  • Study Registration

    Another term for Study Preregistration. 

  • Transparency and Openness promotion (TOP) guidelines

    TOP Guidelines are eight modular standards that journals can implement to increase the transparency of research they publish. 

  • UK Network of Open Research Working Groups (UK-ORWG)

    National network of UK open research working groups (UK-ORWGs) that want to improve the transparency, reproducibility and quality of science research. A grass-roots movement, the network aims to bring together UK-ORWGs that have formed within institutions across the UK. The UK-ORWG network forms part of the UKRN network (a network within a network). More information on UK-ORWG can be found here.

  • UK Reproducibility Network (UKRN)

    The UKRN is a peer-led association that aims to ensure the UK remains a centre for world-leading research by investigating the factors that contribute to robust research, providing training and disseminating best practice, and working with stakeholders to ensure coordination of efforts across the sector. The UKRN has a growing number of regional hubs based in UK universities. 


    The BNA is a member of the UKRN stakeholder engagement group. 

  • UK Research Integrity Office

    An independent charity that helps all involved in research to support good research practice and address research misconduct. More information on UKRIO can be found here.

Are there terms which aren't listed here?  Please get in touch and we will add them!
