T1.5


Meta-economics? Meta-research innovations in economics 
Convenor:
Jo Weech (BITSS, UC Berkeley)
Chair:
Erik Ø. Sørensen (NHH Norwegian School of Economics)
Discussants:
Anna Dreber (Stockholm School of Economics)
Sergio Puerto (Center for Effective Global Action, UC Berkeley)
Fernando Hoces De La Guardia Figueroa (UC Berkeley)
Seung Yong Sung (UC Berkeley)
Prashant Garg (Imperial College London)
Format:
Panel
Location:
Sessions:
Monday 30 June, -
Time zone: Europe/London

Short Abstract

This panel showcases five initiatives advancing meta-research in economics: replication workshops (I4R), standardized hypothesis reporting in RCTs (RGPB), a platform for crowdsourcing assessments of computational reproducibility (SSRP), analysis of causal claims in papers, and evidence aggregation (IDEAL).

Long Abstract

Motivated by the growing demand for credible and reliable research, this panel addresses fundamental challenges in economic research. By focusing on hypothesis reporting, evidence aggregation, and the replicability of results, the panel introduces initiatives that seek to increase the transparency of economic studies. These are essential steps towards a more credible research environment and more robust, research-based policy.

This panel presents five initiatives that are transforming meta-research in economics through institutional partnerships and technological solutions. The Institute for Replication (I4R) is pioneering reproducibility and replication workshops that incentivize and communicate replication results, addressing the low rate of replication attempts in the field. The Reporting Guidelines for Publication Bias (RGPB) project introduces standardized hypothesis reporting for trials in the AEA RCT Registry, providing insight into publication bias at the hypothesis level. The Social Science Reproduction Platform (SSRP) crowdsources assessments of computational reproducibility using a standardized procedure. The Impact Data and Evidence Aggregation Library (IDEAL), a World Bank-led initiative, is developing an evidence synthesis platform for comparing impact evaluation results across contexts by standardizing treatment effects. Finally, a fifth initiative analyzes the causal claims made in economics papers.

Together, these projects demonstrate how institutional collaboration, new platforms and reporting tools, and standardized protocols can enhance the credibility of economic research. The panel will explore lessons learned, challenges faced, and future directions for reproducibility and reporting in economics and the social sciences more broadly.

Accepted papers

Session 1 Monday 30 June, 2025, -