Erik Ø. Sørensen (NHH Norwegian School of Economics)
Discussants:
Anna Dreber (Stockholm School of Economics)
Sergio Puerto (Center for Effective Global Action, UC Berkeley)
Fernando Hoces De La Guardia Figueroa (UC Berkeley)
Seung Yong Sung (UC Berkeley)
Prashant Garg (Imperial College London)
Format: Panel
Sessions: Monday 30 June 2025
Time zone: Europe/London
Meta-economics? Meta-research innovations in economics.
Panel T1.5 at the Metascience 2025 conference.
Short Abstract
This panel showcases five initiatives advancing meta-research in economics: replication workshops (I4R), standardized hypothesis reporting in RCTs (RGPB), a platform to crowdsource computational reproducibility (SSRP), causal claim analysis in papers, and evidence aggregation.
Long Abstract
Motivated by the growing demand for credible and reliable research, this panel addresses fundamental challenges to the credibility of economic research. By focusing on hypothesis building, evidence aggregation, and the replicability of results, the panel introduces initiatives that seek to increase the transparency of economic studies. These are essential steps towards creating a more credible research environment and more robust, research-based policy.
This panel presents five initiatives that are transforming meta-research in economics through institutional partnerships and technological solutions. The Institute for Replication (I4R) is pioneering reproducibility and replication workshops that incentivize replication work and communicate its results, addressing the low number of replication attempts in the field. The Reporting Guidelines for Publication Bias (RGPB) project introduces standardized hypothesis reporting for trials in the AEA RCT Registry, providing insight into publication bias at the hypothesis level. The Social Science Reproduction Platform (SSRP) enables the computational assessment of reproducibility using a standardized procedure. A large-scale analysis of causal claims in working papers uses language models to map which relationships economists document causally and how this shapes publication and citations. Finally, the Impact Data and Evidence Aggregation Library (IDEAL), a World Bank-led initiative, is developing an evidence synthesis platform for comparing impact evaluation results across contexts by standardizing treatment effects.
Together, these projects demonstrate how institutional collaboration, new platforms and reporting tools, and standardized protocols can enhance the credibility of economic research. The panel will explore lessons learned, challenges faced, and future directions for reproducibility and reporting in economics and social science more broadly.
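To make the standardization step concrete, here is a minimal Python sketch, assuming treatment effects are converted to standardized mean differences (Cohen's d with a pooled standard deviation); the function name and the two example evaluations are illustrative assumptions, not IDEAL's actual code or data.

def standardized_effect(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Cohen's d: difference in means over the pooled standard deviation."""
    pooled_var = ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / pooled_var**0.5

# Two hypothetical evaluations of the same intervention, measured on
# different outcome scales, become directly comparable once standardized.
d_site_a = standardized_effect(52.0, 48.0, 10.0, 9.5, 400, 400)   # test scores
d_site_b = standardized_effect(0.61, 0.55, 0.20, 0.21, 800, 800)  # index units
print(round(d_site_a, 2), round(d_site_b, 2))  # 0.41 0.29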
Accepted papers
Short abstract
We analyze 44k economics papers using LLMs. Causal claims rose from ~4% (1990) to ~28% (2020). Causal complexity predicts top-5 publication and citations. Novelty boosts impact only if causal. Connecting key concepts draws citations, but bridging new gaps yields mixed results.
Long abstract
We analyze over 44,000 NBER and CEPR working papers from 1980 to 2023 using a custom language model to construct knowledge graphs that map economic concepts and their relationships. We distinguish between general claims and those documented via causal inference methods (e.g., DiD, IV, RDD, RCTs). We document a substantial rise in the share of causal claims, from roughly 4% in 1990 to nearly 28% in 2020, reflecting the growing influence of the "credibility revolution." We find that causal narrative complexity (e.g., the depth of causal chains) strongly predicts both publication in top-5 journals and higher citation counts, whereas non-causal complexity tends to be uncorrelated or negatively associated with these outcomes. Novelty is also pivotal for top-5 publication, but only when grounded in credible causal methods: introducing genuinely new causal edges or paths markedly increases both the likelihood of acceptance at leading outlets and long-run citations, while non-causal novelty exhibits weak or even negative effects. Papers engaging with central, widely recognized concepts tend to attract more citations, highlighting a divergence between factors driving publication success and long-term academic impact. Finally, bridging underexplored concept pairs is rewarded primarily when grounded in causal methods, yet such gap filling exhibits no consistent link with future citations. Overall, our findings suggest that methodological rigor and causal innovation are key drivers of academic recognition, but sustained impact may require balancing novel contributions with conceptual integration into established economic discourse.
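As a rough illustration of the knowledge-graph representation, here is a small Python sketch using networkx; the concepts, edges, method labels, and prior-literature set are invented for the example, and the actual extraction relies on a custom language model pipeline rather than hand-coded edges.

import networkx as nx

G = nx.DiGraph()
# Each edge is a claim linking two economic concepts; causal=True marks
# claims documented with a credible design (DiD, IV, RDD, RCT).
G.add_edge("minimum wage", "employment", causal=True, method="DiD")
G.add_edge("employment", "household income", causal=True, method="IV")
G.add_edge("household income", "child schooling", causal=False, method=None)

# Causal narrative complexity: depth of the longest chain of causal claims.
causal_edges = [(u, v) for u, v, d in G.edges(data=True) if d["causal"]]
causal_sub = G.edge_subgraph(causal_edges)
print(nx.dag_longest_path_length(causal_sub))  # 2

# Causal novelty: a causal edge is new if no earlier paper documented it.
prior_edges = {("minimum wage", "employment")}
print([e for e in causal_edges if e not in prior_edges])
# [('employment', 'household income')]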
Short abstract
We will present the Institute for Replication (I4R), which conducts reproducibility and replication workshops to incentivize replication work and communicate its results, addressing the low number of replication attempts in economics and political science in particular, but also more broadly.
Long abstract
We will present the work of the Institute for Replication (I4R), which conducts reproducibility and replication workshops to incentivize replication work and communicate its results, in economics and political science in particular but also more broadly in the social and behavioral sciences. After several years of Replication Games, led by I4R's chair Abel Brodeur, we believe this is one of several ways forward to address the relatively low number of replication attempts and to promote a culture of replication.
Short abstract
We will present the Social Science Reproduction Platform (SSRP), a tool for conducting and logging standardized computational reproduction exercises. Drawing on 400+ reproductions in economics, we find that 30–38% of recent studies meet at least a basic standard of reproducibility.
Long abstract
We will present the Social Science Reproduction Platform (SSRP), a tool for publicly recording computational reproductions in economics and other social sciences using a standardized framework. SSRP enables crowd-sourced evaluations of computational reproducibility at the table or figure level on a 10-point reproducibility scale, and aggregates these assessments to the claim and paper level. Drawing on over 480 reproduction exercises, including over 410 in economics, we document substantial heterogeneity in reproducibility scores. Approximately 30 to 38% of recent economics papers reproduced on SSRP achieve some form of computational reproducibility (from either raw or analysis data), though only a subset is fully reproducible from raw data.
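As an illustration of how display-item scores might roll up to the claim and paper level, here is a minimal Python sketch; the 10-point scale is SSRP's, but the simple-average aggregation rule and the example scores are assumptions made for the example.

from statistics import mean

# Reproducibility scores (1-10) assigned to individual tables and figures,
# grouped by the claim each display item supports.
scores = {
    "claim 1": {"Table 2": 8, "Figure 1": 7},
    "claim 2": {"Table 3": 4, "Figure 2": 5},
}

# Aggregate display-item scores to the claim level, then to the paper level.
claim_scores = {claim: mean(items.values()) for claim, items in scores.items()}
paper_score = mean(claim_scores.values())
print(claim_scores)  # {'claim 1': 7.5, 'claim 2': 4.5}
print(paper_score)   # 6.0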