Department of Psychology

Newsletter #5: Fall Semester 2023 (HS23)

Welcome

Dear members of the Department of Psychology

The fifth newsletter of the Open Science Initiative at the Department of Psychology again promises to be an exciting read! This time, we cover the question "Do researchers adhere to their preregistration plans?", as well as the challenges posed by systematic reviews in the era of open science. We would also like to remind everyone of the upcoming deadline for submissions for the Open Science Award!

Questions, suggestions, and contributions to this newsletter may be sent to openscience@psychologie.uzh.ch – the next newsletter will appear at the beginning of the spring semester 2024 (FS24).

Best regards,
Your Open Science Initiative

 

Topics

Interesting to Know: Do Researchers Adhere to Their Preregistration Plans?

The preregistration of hypotheses and their documentation in a public repository (e.g., OSF, https://osf.io; AsPredicted, https://aspredicted.org) before the data collection for a study begins has become a key element of open and transparent science. Preregistrations can help clarify which part of the subsequently published research is confirmatory (i.e., testing a priori specified hypotheses) and which part is exploratory (i.e., examining emerging questions with a data-driven approach). In turn, preregistrations can help prevent the problem of hypothesizing after the results are known (HARKing).

But how closely do researchers follow their preregistrations in their published studies? Two investigations took a closer look and suggest that deviations from the original preregistrations are quite frequent in psychological research: Claesen et al. (2021) investigated adherence to preregistrations (and the disclosure of deviations) for all publications with a “preregistered” badge in Psychological Science between February 2015 and November 2017. Only two out of 27 preregistered studies contained no deviations from the original preregistration plan. Among the studies with deviations, one study disclosed all of them, but nine studies disclosed none. Claesen et al. (2021) observed deviations from the initial plan regarding the sample size, exclusion criteria, and statistical analyses. A more recent investigation by van den Akker et al. (2023) evaluated the extent of selective hypothesis reporting in psychological research by comparing the hypotheses described in published articles with the hypotheses found in 459 preregistrations. More than half of the preregistered studies added hypotheses in the publication (57%) or omitted preregistered hypotheses (52%). Interestingly, the authors also found that replication studies were less likely than original studies to include selectively reported hypotheses.

Taken together, these investigations highlight an important point: Reviewers of manuscripts, journal editors, and researchers should consider more carefully whether the hypotheses described in manuscripts accurately follow the ones outlined in preregistrations, and whether these hypotheses were clearly formulated in the first place. Like other important methodological details, this information tends to be “hidden” in supplemental materials or links to online repositories. Of course, strict adherence to an original preregistration plan is not always feasible (and is not necessarily desirable), but it is important to describe deviations transparently. Without such disclosure, readers who do not carefully consult the preregistration plan can get the incorrect impression that a study was conducted and reported exactly as planned. Editors and reviewers should thus encourage making such deviations transparent.
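
For reviewers and readers who want to compare a manuscript against its preregistration, registration metadata can also be retrieved programmatically. The following is a minimal Python sketch, assuming the public OSF API v2 and its JSON:API response format; the GUID is a placeholder, and the printed field names are assumptions based on the API documentation rather than values from the studies discussed above.

```python
# Minimal sketch: retrieve a public OSF registration's metadata via the
# OSF API v2 (JSON:API format). The GUID is a placeholder; field names
# such as "title" and "date_registered" are assumptions from the docs.
import requests

OSF_API = "https://api.osf.io/v2/registrations/{guid}/"

def fetch_registration(guid: str) -> dict:
    """Return the 'attributes' object of a public OSF registration."""
    response = requests.get(OSF_API.format(guid=guid), timeout=30)
    response.raise_for_status()
    return response.json()["data"]["attributes"]

if __name__ == "__main__":
    attributes = fetch_registration("abcde")  # placeholder GUID
    for key in ("title", "date_registered"):
        print(f"{key}: {attributes.get(key)}")
```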

A more principled approach is the journal format of registered reports, in which the hypotheses, proposed methods, and analyses are all evaluated by reviewers in advance, before any research is undertaken. Registered reports promise to provide a remedy for publication bias. An increasing number of journals in psychology now offer this format (an updated list of journals at which researchers can submit registered-report manuscripts, including user ratings, can be found here: https://registeredreports.cardiff.ac.uk/feedback/dashboards; further useful information can be found here: www.akmontoya.com/registered-reports and here: www.cos.io/initiatives/registered-reports; see also our Newsletter #1 for department members’ personal experiences with registered reports: https://www.psychologie.uzh.ch/de/bereiche/open-science/news/nl-001.html). In a recent overview, Chambers and Tzavella (2021) reflected on the progress and future of registered reports and offered helpful guidance for researchers. Fortunately, they portrayed a positive picture so far: Registered reports indeed promote transparency and reproducibility, and they may help reshape how society evaluates research across various scientific disciplines.

References

  • Chambers, C. D., & Tzavella, L. (2021). The past, present and future of registered reports. Nature Human Behaviour, 6(1), 29–42. https://doi.org/10.1038/s41562-021-01193-7
  • Claesen, A., Gomes, S., Tuerlinckx, F., & Vanpaemel, W. (2021). Comparing dream to reality: An assessment of adherence of the first generation of preregistered studies. Royal Society Open Science, 8(10), 211037. https://doi.org/10.1098/rsos.211037
  • van den Akker, O. R., van Assen, M. A. L. M., Enting, M., de Jonge, M., Ong, H. H., Rüffer, F., Schoenmakers, M., Stoevenbelt, A. H., Wicherts, J. M., & Bakker, M. (2023). Selective hypothesis reporting in psychology: Comparing preregistrations and corresponding publications. Advances in Methods and Practices in Psychological Science, 6(3), 1–15. https://doi.org/10.1177/25152459231187988

Open Science Practices in Systematic Review Writing: Advancing Transparency, Reproducibility, and Efficiency

Systematic reviews are indispensable for evidence-based decision-making in fields such as medicine, psychology, and education (Fernández-Castilla & Van den Noortgate, 2022). They involve the comprehensive collection, critical appraisal, and synthesis of the available evidence on a well-defined research question. Open science in systematic review writing promotes transparency by making all facets of the review, from the research question to the data analysis, accessible to the public (Nordström et al., 2022). This transparency allows for scrutiny of the methods and findings. While open science practices are widely known in the context of primary studies, psychologists are less aware of the state of the art in conducting transparent and reproducible systematic literature reviews and of the range of freely available tools that support the individual steps of such reviews.

Essential elements of open science are already firmly established in the field of systematic reviews, bolstering the credibility and utility of these reviews and offering a means to enhance transparency, reproducibility, and the overall quality of research (Norris et al., 2022). Back in 2011, even before the replication crisis in psychology brought open science practices to the forefront, an important development took place in medicine: the introduction of the PROSPERO register (Schiavo, 2019). PROSPERO, the International Prospective Register of Systematic Reviews, is an open-access online database specifically designed to house systematic review protocols across a broad spectrum of subjects. Clear, well-documented protocols enable other researchers to build on the findings, making the research less susceptible to bias and errors. PROSPERO's core function is to allow researchers to prospectively register their systematic reviews: before conducting a review, researchers provide a detailed plan, including the objectives, methods, inclusion/exclusion criteria, search strategy, and expected outcomes, thereby preventing arbitrary decisions and inviting community feedback. By doing so, they commit to a predetermined research path, reducing the likelihood of post hoc changes and selective reporting of results, which can compromise the integrity of the research process.
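
To make the idea of a predetermined research path concrete, here is a purely illustrative Python sketch of the core elements such a protocol records. The field names and contents are invented for this example and do not reflect PROSPERO's official registration schema.

```python
# Illustrative sketch only: a machine-readable review protocol, loosely
# inspired by PROSPERO-style registration items. All field names and
# values are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: the registered plan is not meant to change
class ReviewProtocol:
    research_question: str
    databases: list[str]
    search_string: str
    inclusion_criteria: list[str]
    exclusion_criteria: list[str]
    planned_outcomes: list[str]

protocol = ReviewProtocol(
    research_question="Does intervention X improve outcome Y in adults?",
    databases=["PsycINFO", "PubMed", "Web of Science"],
    search_string='("intervention X") AND ("outcome Y") AND (adult*)',
    inclusion_criteria=["randomized controlled trial", "adult sample"],
    exclusion_criteria=["case study", "non-peer-reviewed report"],
    planned_outcomes=["standardized mean difference for Y"],
)
print(protocol)
```

Freezing the record mirrors the commitment described above: any later change becomes a visible deviation rather than a silent edit.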

Initially, PROSPERO focused primarily on medicine, serving as a repository for systematic review protocols in the medical domain. Its scope has since expanded into neighboring health-related fields, and it now also accepts protocols from areas such as clinical psychology, criminology, social care, and education. Several further prospective registration options are now available for systematic reviews, such as INPLASY, which is designed exclusively for systematic reviews, or the Open Science Framework Registries and protocols.io, which are generic registries open to various study types (Pieper & Rombey, 2022).

However, open science practices do not stop at preregistered protocols. Comprehensive search documentation, including the databases used and the search terms, is shared openly to enable replication. Freely available screening tools and programs, such as Rayyan, litsearchr, and revtools, help streamline the systematic review process (Scott et al., 2021). Rayyan supports collaborative article screening, ensuring transparency and progress sharing – note that there are potentially more powerful, but also rather costly, tools available, such as DistillerSR. litsearchr, an open-source R package, uses text mining to partially automate the development and documentation of search strategies, reducing reviewer workload and enhancing transparency. revtools, another open-source R package, supports deduplicating and screening bibliographic records in an open and reproducible manner. The remaining steps hardly differ from open publication practices for primary studies: Open access to data and code can be provided through platforms like GitHub, publishing systematic reviews as preprints and choosing open-access journals further supports open science, and collaboration and crowdsourcing are encouraged, with platforms like Open Review offering peer feedback and suggestions.
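
As an illustration of one step that such tools streamline, the following minimal Python sketch deduplicates search results pooled from several databases before screening. The records and field names are made up for this example; dedicated tools such as Rayyan or revtools handle many more edge cases.

```python
# Minimal sketch: deduplicate pooled search results by DOI (or, failing
# that, by normalized title) before title/abstract screening.
def normalize(value: str) -> str:
    """Lowercase and collapse whitespace so trivially different duplicates match."""
    return " ".join(value.lower().split())

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep the first record per DOI or normalized title."""
    seen, unique = set(), []
    for record in records:
        key = normalize(record.get("doi") or record.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(record)
    return unique

hits = [
    {"doi": "10.1000/xyz123", "title": "Effects of X on Y"},
    {"doi": "10.1000/XYZ123", "title": "Effects of X on Y"},  # same paper, second database
    {"doi": "", "title": "A different study"},
]
print(deduplicate(hits))  # 2 unique records remain
```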

Still, open science practices in systematic reviews could be improved in several areas. One such area is data sharing (Yoong et al., 2022). Surprisingly, very few systematic reviews make their extracted primary research data available. While some reviews provide the details needed to reproduce the searches, few share the primary data that were extracted from the original studies. This level of detail, however, is crucial for critical examination and for the creation of open data sets. Journals may need to emphasize the importance of these practices and provide the infrastructure for data sharing to become more common in systematic reviews. Here, too, a positive development is emerging: the Systematic Review Data Repository (SRDR), an initiative by Brown University aimed at enhancing the accessibility of systematic review (SR) data, has amassed more than 150 systematic reviews and has made available data from over 15,000 studies spanning various health-related research topics. Additionally, the Cochrane Collaboration has taken steps to promote open access to systematic review data and has committed to achieving full open-access status by 2025 (Martinou & Angelidi, 2022).
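
For authors who want to share their extracted data, here is a hedged Python sketch that writes the study-level data together with a small data dictionary to CSV files, which could then be deposited in a repository such as OSF or SRDR. All column names and values are illustrative.

```python
# Illustrative sketch: export extracted study-level data plus a data
# dictionary as CSV files ready for deposit in an open repository.
import csv

extracted = [
    {"study": "Smith2020", "n": 120, "effect_size_d": 0.42},
    {"study": "Lee2021", "n": 85, "effect_size_d": 0.18},
]
dictionary = [
    {"column": "study", "description": "First author and year of the primary study"},
    {"column": "n", "description": "Total sample size"},
    {"column": "effect_size_d", "description": "Cohen's d extracted or computed from the report"},
]

for filename, rows in [("extracted_data.csv", extracted),
                       ("data_dictionary.csv", dictionary)]:
    with open(filename, "w", newline="", encoding="utf-8") as handle:
        writer = csv.DictWriter(handle, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```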

In conclusion, open science practices have become firmly established within the systematic review process, significantly enhancing transparency, credibility, and the overall quality of research. And although some desiderata of open science, such as data sharing, remain an area of concern, systematic reviews are on a promising trajectory within the realm of open science, offering a blueprint for more reliable and impactful primary research in the years to come.
 

References

  • Fernández-Castilla, B., & Van den Noortgate, W. (2022). Network meta-analysis in psychology and educational sciences: A systematic review of their characteristics. Behavior Research Methods, 55(4), 2093–2108. https://doi.org/10.3758/s13428-022-01905-5
  • Martinou, E., & Angelidi, A. (2022). The role of open research in improving the standards of evidence synthesis: Current challenges and potential solutions in systematic reviews. F1000Research, 11, 1435. https://doi.org/10.12688/f1000research.127179.1
  • Nordström, T., Kalmendal, A., & Batinovic, L. (2022). Risk of bias and open science practices in systematic reviews of educational effectiveness: A meta-review. PsyArXiv. https://osf.io/5xumg
  • Norris, E., Prescott, A., Noone, C., Green, J. A., Reynolds, J., Grant, S. P., & Toomey, E. (2022). Establishing open science research priorities in health psychology: A research prioritisation Delphi exercise. Psychology & Health, 1–25. https://doi.org/10.1080/08870446.2022.2139830
  • Pieper, D., & Rombey, T. (2022). Where to prospectively register a systematic review. Systematic Reviews, 11(1), 8. https://doi.org/10.1186/s13643-021-01877-1
  • Schiavo, J. H. (2019). PROSPERO: An international register of systematic review protocols. Medical Reference Services Quarterly, 38(2), 171–180. https://doi.org/10.1080/02763869.2019.1588072
  • Scott, A. M., Forbes, C., Clark, J., Carter, M., Glasziou, P., & Munn, Z. (2021). Systematic review automation tools improve efficiency but lack of knowledge impedes their adoption: A survey. Journal of Clinical Epidemiology, 138, 80–94. https://doi.org/10.1016/j.jclinepi.2021.06.030
  • Yoong, S. L., Turon, H., Grady, A., Hodder, R., & Wolfenden, L. (2022). The benefits of data sharing and ensuring open sources of systematic review data. Journal of Public Health, 44(4), e582–e587. https://doi.org/10.1093/pubmed/fdac031

New Members at the Doctoral Level Wanted

The Open Science Initiative is currently looking for PhD students who would like to join us. As a working group, our mission is to support the implementation of Open Science practices in research and teaching at the Department of Psychology. The OSI meets about 1–2 times per semester; additionally, you can get involved in sub-committees. If you are interested in Open Science and want to help improve transparency and reproducibility at the Department of Psychology (or have any questions), please contact Dr. Walter Bierbauer (walter.bierbauer[at]psychologie.uzh.ch).

Nominate Your Research for the Open Science Award

Attention ExPra students, Master's students, PhD students, and postdocs: The deadline for submitting your work for consideration for the Open Science Award 2024 is January 31, 2024. More information can be found here: https://www.psychologie.uzh.ch/de/bereiche/open-science/preis.html. ExPra teachers are kindly asked to remind their students of this opportunity!

Upcoming Events

•    There are interesting upcoming talks as part of the ReproducibiliTea journal club, jointly organized by the universities of Basel and Zurich – check out the detailed program: https://www.crs.uzh.ch/en/training/ReproducibiliTea.html

Closing

This newsletter is published once a semester – feel free to contact us if you have any questions regarding it: openscience@psychologie.uzh.ch

As this newsletter is only published once per semester, we are unable to inform you about events scheduled at (rather) short notice. Thus, we recommend subscribing to the mailing list of UZH’s Center for Reproducible Science to stay up to date on offers for further training and scientific exchange on open science at UZH, such as the ReproducibiliTea Journal Club.

Current Members of the Open Science Initiative

Prof. Dr. Johannes Ullrich (Head); Dr. Walter Bierbauer; Prof. Dr. Renato Frey; Dr. Martin Götz; Dr. Sebastian Horn; Dr. André Kretzschmar; Prof. Dr. Nicolas Langer; M.Sc. Zita Mayer; Dr. Susan Mérillat; Dr. Robin Segerer; Prof. Dr. Carolin Strobl; Dr. Lisa Wagner; Dr. Katharina Weitkamp