Reese Richardson

Case studies in scientific reproducibility

Introducing COSIG: the Collection of Open Science Integrity Guides

COSIG logo

I am beyond thrilled to introduce the Collection of Open Science Integrity Guides (COSIG), a community-led, open-source resource for performing post-publication peer review. At the end of this post, I’ve reproduced a commentary I wrote explaining the motivation behind COSIG and how we hope it will be used. COSIG has been in development for the better part of a year and has involved the labor and feedback of many fantastic people, most of whom are mentioned below.

The commentary below is a suitable introduction to the project but I’ll add this editorial note: whenever we talk about science integrity and breaches of it, there is always concern that we are only giving ammunition to the anti-science crowd. This is especially pertinent now that this crowd is in charge of the US government and is trying to redefine what constitutes scientific misconduct and what constitutes “gold standard science” to meet their policy goals. Now is not the time for people concerned about the integrity of the scientific enterprise to remain quiet; now is the time to double down on making the scientific enterprise better and commit to doing good science (and good peer review thereof) out in the open. We want to arm scientists with the tools to uphold the integrity of the literature, precisely so that the anti-science crowd isn’t allowed to redefine science integrity on their terms.

A version of the commentary below is available as a pre-print here. COSIG is available at cosig.net or at doi.org/10.17605/OSF.IO/2KDEZ.

The number of scientific articles retracted annually has recently reached record highs [1]. This trend has bolstered concerns about the reliability of the published scientific literature and about a general “reproducibility crisis” across scientific fields.

Two opposing trends underlie this recent increase. One is the apparent rise of research paper mills, organizations that facilitate and profit from systematic publication fraud [2]. The other is the growing popularity and recognition of post-publication peer review (PPPR), the practice by which the published scientific literature is revisited and reappraised, often with a critical lens. A community of prolific PPPR practitioners, often called “sleuths”, has emerged around this practice [3, 4]. Many recent high-profile retractions and revelations of research misconduct have been the direct result of volunteer work by this community, as have thousands of retractions stemming from paper mill involvement. PPPR mostly occurs on platforms like PubPeer, where users can comment on any article pseudonymously or under their own name. PubPeer now hosts more than 300,000 comments [5].

This burgeoning community gathered in Paris in September 2024 for a meeting on “decontamination of the scientific literature”. There, we arrived at the consensus that:

  1. The great majority of problematic papers persist in the scientific literature without being detected, let alone retracted.
  2. There is a strong possibility that systematic scientific fraud is responsible for an increasing proportion of the scientific articles published annually [6].
  3. Detection of problematic papers is bottlenecked by the number of concerned individuals performing PPPR.
  4. To better maintain the scientific literature and better understand the scope of systematic scientific fraud [7], we should seek to rapidly expand participation in PPPR.
  5. Although working scientists are best prepared to perform PPPR, anyone is capable of being a steward of the scientific literature. Indeed, several of the most prolific PPPR practitioners are non-scientists or retired scientists working outside of their original fields.
  6. There are barriers to participation in PPPR, including, but not limited to, uncertainty about where to start, unfamiliarity with PPPR platforms, lack of domain-specific knowledge and fear of career consequences.

Building on this consensus, we began developing the Collection of Open Science Integrity Guides (COSIG), now available at https://doi.org/10.17605/OSF.IO/2KDEZ or cosig.net. COSIG is an open-source, continually expanding collection of accessible guides, maintained by publication integrity experts and PPPR practitioners, that shares best practices and tutorials for conducting PPPR on topics across scientific disciplines. At the time of writing, COSIG features 27 guides.



All content in COSIG can be freely distributed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) license, either as individual standalone PDFs or as part of a single, textbook-like document containing all COSIG entries. Because the project is open source, anyone can make contributions or suggest changes to COSIG. All past versions of COSIG are maintained, and each entry prominently displays the date of its last revision.

The primary goal of COSIG is to be an authoritative, comprehensive starting point for those wishing to take part in PPPR. However, we anticipate that COSIG will also be useful to a variety of stakeholders, including institutional research integrity officers, funding organizations, journal editors and educators.

Because COSIG is a community-led, open-source project, we welcome contributions from anyone interested in PPPR. Such contributions might include feedback on and revisions to existing guides, ideas for new guides, and drafts of new material for COSIG. Suggestions to improve COSIG can be submitted by opening an issue on COSIG’s GitHub repository at https://github.com/cosig-pppr/cosig or by emailing admin@cosig.net.

COSIG was initially conceived during discussions between Boris Barbour, Elisabeth Bik, Jennifer Byrne, Jana Christopher, Kevin Patrick, Reese Richardson and Maarten van Kampen. At the time of writing, the following individuals have contributed to COSIG: Anna Abalkina, René Aquarius, Lonni Besançon, Elisabeth Bik, David Bimler, Jennifer Byrne, Guillaume Cabanac, Jana Christopher, M.V. Dougherty, Yagmur Ozturk, Kevin Patrick, Solal Pirelli, Reese Richardson, Nicholas Ritchie, Matt Spick, Stefan Stender and Nerita vitiensis (pseudonym).

COSIG launches amidst sweeping cuts to public scientific infrastructure in the United States [8, 9] and widespread questions about the prevalence of untrustworthy science [6, 7]. To safeguard the integrity of the scientific literature and to preserve the global public’s strong trust in science [10], we maintain that science (and review thereof) is best done out in the open [11]. To that end, we are excited to launch COSIG and we extend an open invitation to take part in the stewardship of the scientific literature.

References

  1. Van Noorden R. More than 10,000 research papers were retracted in 2023—a new record. Nature. 2023;624(7992):479–81.
  2. Paper Mills — Research report from COPE and STM. COPE and STM; 2022.
  3. Wapner J. The Rise of the Science Sleuths. Undark. 2024.
  4. Abalkina A, Aquarius R, Bik E, Bimler D, Bishop D, Byrne J, et al. ‘Stamp out paper mills’ — science sleuths on how to fight fake research. Nature. 2025;637:1047–50.
  5. Einstein Foundation Institutional Award 2024: PubPeer. Einstein Foundation; 2024. Available from: https://award.einsteinfoundation.de/award-winners-finalists/recipients-2024/pubpeer.
  6. Van Noorden R. How big is science’s fake-paper problem? Nature. 2023;623(7987):466–7.
  7. Byrne JA, Abalkina A, Akinduro-Aje O, Christopher J, Eaton SE, Joshi N, et al. A call for research to address the threat of paper mills. PLoS Biology. 2024;22(11):e3002931.
  8. Tollefson J, Garisto D, Kozlov M, Witze A. Trump proposes unprecedented budget cuts to US science. Nature. 2025;641(8063):565–6.
  9. Tollefson J, Garisto D, Ledford H. Will US science survive Trump 2.0? Nature. 2025;641(8061):26–30.
  10. Cologna V, Mede NG, Berger S, Besley J, Brick C, Joubert M, et al. Trust in scientists and their role in society across 68 countries. Nature Human Behaviour. 2025:1–18.
  11. UNESCO recommendation on open science. United Nations Educational, Scientific and Cultural Organization; 2021.

2 responses to “Introducing COSIG: the Collection of Open Science Integrity Guides”

  1. Assistant Prof.

    Great job!

    In my institution we’re asked every semester or so to propose short “bonus” courses at the edge of or outside the curriculum so that students can choose from them. I recently had the thought to propose a workshop on identifying fraudulent papers. These guidelines would be a good resource 🙂

    As for the guideline on reporting, do you have any experience with reporting anonymously? I’m a bit concerned about scientific “vengeance” if the editor or authors have some relation to the reporting person. Do editors even read anonymous emails?


    1. Reese Richardson

      Glad to hear that! That is exactly one of the use cases we intended for COSIG. 🙂

      As for reporting anonymously, many editors take anonymous reports seriously (as they should). Many do not. Some people who find problems in papers but would prefer to work anonymously will approach a sleuth to help report their concerns without revealing their identity. Feel free to reach out to me at richardsonr43@gmail.com and I can maybe help!

