Data from: Estimating the reproducibility of psychological science

Authors: Alexander A. AARTS et al.; Stephanie C. LIN
DOI: 10.25440/smu.12062757.v1
URL: https://researchdata.smu.edu.sg/articles/dataset/Data_from_Estimating_the_reproducibility_of_psychological_science/12062757
Published: 2015-08-01

This record contains the underlying research data for the publication "Estimating the reproducibility of psychological science". The full text is available at: https://ink.library.smu.edu.sg/lkcsb_research/5257

Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals, using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes fell within the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and, assuming no bias in the original results, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of the original evidence than by characteristics of the original and replication teams.

Keywords: empirical analysis; error analysis; innovation; meta-analysis; psychology research method; confidence interval; correlational study; prediction; reproducibility; sampling; selection bias; social psychology; Psychology not elsewhere classified