Guidelines for implementation and factors influencing replicability across different fields in psychology
 

Aim

For several years, a "replication crisis" has been discussed in multiple domains of science, as many studies could not be confirmed by replication. The project "ConRep" investigates factors that influence the replicability of research results. As a methodological project, it aims to provide an overview of such factors across disciplines and to develop guidelines for conducting systematic replication studies.

Of particular interest are discipline-specific challenges for replication studies in the psychological fields at the center of the replication crisis (social psychology, cognitive psychology, educational psychology). The core of the project is the development, evaluation, and interdisciplinary comparison of design and analysis techniques for studying replication factors.

The project is integrated into the DFG priority program SPP 2317: A Metascientific Program for the Analysis and Optimization of Replicability in the Behavioral, Social, and Cognitive Sciences (META-REP).

 

Background

Conceptual replications are conducted to test the stability of research results under altered study conditions. The central question is whether modifications to the study design critically influence the results.

Requirements for the systematic design of replication studies have been proposed, for instance, by the National Science Foundation and the Institute of Education Sciences, U.S. Department of Education (2018). However, the systematic variation of study factors has seldom been put into practice. The causal replication framework (CRF) of Vivian Wong (University of Virginia) and Peter Steiner (University of Maryland) provides a general theoretical basis and guidance for designing replication studies. Detailed information on the framework, on design-based replication approaches, and on analysis techniques for ensuring and testing CRF assumptions can be found on the Collaboratory Replication Lab website (edreplication.org). So far, the CRF has been applied to conceptual replications in educational psychology. In collaboration with these efforts, we make the CRF applicable to various disciplines and provide hands-on example studies as well as guidelines.

 

Approach

  • Literature review on the design of replication studies as an overview of reported and targeted variations between studies
  • Pre-registration and implementation of our own replication studies in social psychology and cognitive psychology to explain effect heterogeneity
  • Methodological developments for design-based and statistical control of confounding effects in the comparison of study results
  • Derivation of guidelines as a meta-perspective on design and analysis methods for conceptual replications
 

Results

In order to draw causal conclusions about the reasons for differing study results, it is necessary to precisely define the effects under investigation, exclude confounding factors by design, and collect additional measures that make differences between studies identifiable and controllable.
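In the spirit of the causal replication framework, this idea can be stated formally. The following is only an illustrative sketch; the notation (potential outcomes, the symbols τ_A and τ_B) is added here for exposition and is not taken from the project's publications. Each study identifies an average treatment effect for its own population and setting, and replication amounts to asking whether the two estimands coincide once unintended differences between the studies are ruled out by design or measured and adjusted for:

$$\tau_A = \mathbb{E}_A\!\left[Y(1) - Y(0)\right], \qquad \tau_B = \mathbb{E}_B\!\left[Y(1) - Y(0)\right], \qquad H_0:\ \tau_A = \tau_B$$

Only when the remaining differences between the studies are known and controllable can a rejection of this hypothesis be attributed to the intentionally varied study factor.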

The literature review on the design of replication studies shows that, to date, procedural aspects, such as methods, procedures, and analysis techniques, have been controlled for, while other study characteristics, such as the population studied or the setting, have been neglected.

In the project's empirical study series, various replication factors were therefore deliberately varied between studies, and unintended differences in study characteristics were controlled for. In particular, the composition of the sample proved to be a key factor influencing the variation in effects, whereas characteristics such as recruitment time or technical equipment had no substantial influence.

New analysis methods were developed for the statistical control of study differences; they preserve intended variations while adjusting for unintended ones. In addition to the theoretical assumptions and estimation procedures, sensitivity analyses are provided, and the implementation is illustrated using empirical studies.
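To make the general idea concrete, here is a minimal, self-contained sketch in Python using simulated data. It is not the project's actual estimation procedure; the variable names (e.g., the moderator "age") and the moderation structure are purely hypothetical. It illustrates one standard way of separating intended from unintended variation: modeling the unintended compositional difference explicitly, so that the study-by-treatment interaction reflects only what remains after adjustment.

```python
# Illustrative sketch only (not the project's method): comparing treatment
# effects across an original study and a replication while adjusting for an
# unintended difference in sample composition. All names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

def simulate_study(n, study_label, mean_age):
    # One study: the treatment effect is moderated by age, so studies with
    # different age compositions yield different raw effects.
    age = rng.normal(mean_age, 5, n)
    treat = rng.integers(0, 2, n)
    y = 0.2 + treat * (0.5 + 0.02 * (age - 25)) + 0.01 * age + rng.normal(0, 1, n)
    return pd.DataFrame({"y": y, "treat": treat, "age": age, "study": study_label})

df = pd.concat([
    simulate_study(500, "original", mean_age=22),     # e.g., student sample
    simulate_study(500, "replication", mean_age=45),  # e.g., older online sample
], ignore_index=True)

# Naive comparison: treatment-by-study interaction without covariates.
naive = smf.ols("y ~ treat * C(study)", data=df).fit()

# Adjusted comparison: modeling the (hypothetical) moderator so the unintended
# compositional difference no longer drives the apparent effect heterogeneity.
adjusted = smf.ols("y ~ treat * C(study) + treat * age", data=df).fit()

print(naive.params["treat:C(study)[T.replication]"])     # sizeable difference
print(adjusted.params["treat:C(study)[T.replication]"])  # close to zero
```

In the simulated data, the raw effects differ mainly because the two samples differ in age composition; once the moderator is modeled, the apparent effect difference between the studies largely disappears. The methods developed in the project address this problem more generally, including the underlying assumptions and sensitivity analyses.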

 

Project profile

Dr. Marie-Ann Sengewald / LIfBi

Prof. Dr. Steffi Pohl / Freie Universität Berlin

Prof. Dr. Anne Gast / University of Cologne

Dr. Mathias Twardawski / LMU Munich

  • Project duration: 01/2022 – 06/2025
  • Funding: German Research Foundation (DFG), Priority Program SPP 2317: A Metascientific Program for the Analysis and Optimization of Replicability in the Behavioral, Social, and Cognitive Sciences (META-REP)


 
Project partners
Freie Universität Berlin
Universität zu Köln
LMU Ludwig-Maximilians-Universität München
 

Publications

2025

Hoffmann, J., Twardawski, M., Höhs, J. M., Gast, A., Pohl, S., & Sengewald, M.-A. (2025). The design of current replication studies: A systematic literature review on the variation of study characteristics. Advances in Methods and Practices in Psychological Science, 8(2), Advance online publication. https://doi.org/10.1177/25152459251328273

2024

Hoffmann, J., Twardawski, M., Kondzic, D., Pohl, S., Gast, A., Höhs, J., & Sengewald, M.-A. (2024). Identifying causes of effect heterogeneity in replication studies: An application of the Causal Replication Framework. In U. Ansorge, D. Gugerell, U. Pomper, B. Szaszkó, & L. Werner (Eds.), Congress of the German and Austrian Psychological Societies 2024 (pp. 597-598). Pabst Science Publishers. https://doi.org/10.2440/0003
Koch, T., & Sengewald, M.-A. (2024). Improving psychological research through formal methodological frameworks: Illustrations for measurement, missing data, explanation, replication. In U. Ansorge, D. Gugerell, U. Pomper, B. Szaszkó, & L. Werner (Eds.), Congress of the German and Austrian Psychological Societies 2024 (p. 598). Pabst Science Publishers. https://doi.org/10.2440/0003
Kondzic, D., Hoffmann, J., Sengewald, M.-A., & Pohl, S. (2024). Assessing replication success: A systematic comparison of correspondence measures. In U. Ansorge, D. Gugerell, U. Pomper, B. Szaszkó, & L. Werner (Eds.), Congress of the German and Austrian Psychological Societies 2024 (pp. 597). Pabst Science Publishers. https://doi.org/10.2440/0003
Pohl, S., Kondzic, D., Hoffmann, J., & Sengewald, M.-A. (2024). Analyses of causal effects of study aspects on effect heterogeneity across replication studies. In U. Ansorge, D. Gugerell, U. Pomper, B. Szaszkó, & L. Werner (Eds.), Congress of the German and Austrian Psychological Societies 2024 (pp. 600-601). Pabst Science Publishers. https://doi.org/10.2440/0003