AI for Evidence-Informed Policymaking: Detecting, Tracing & Analysing Research Evidence Use in Legislatures
Afagh Mulazadeh
(UCL)
Paper Short Abstract
This study employed natural language processing (NLP) and machine learning (ML) methods to examine research evidence use in legislatures. Transformer-based models, named entity recognition (NER), and citation mapping were used to detect research uptake and trace its sources, while an evidence classification framework was developed to enable sentence-level analysis.
Paper Abstract
A working assumption in the legislative science advice (LSA) literature is that transparency and robust, research evidence-based scrutiny are critical to effective policymaking. Yet there are no comprehensive models for examining research evidence use in legislatures. This study leveraged NLP and ML methods to systematically detect, trace, and analyse explicit and tacit research evidence use at scale.
The study focused on British parliamentary scrutiny since 2022. Over 20,000 records were processed, including debate transcripts, written questions, committee documents, and in-house research (IHR) briefings, together with a corresponding dataset of the academic and policy sources they cite. We applied transformer-based models and NER to detect instances where parliamentarians used IHR briefings verbatim. Citation mapping then traced matched sentences to their likely original sources. Finally, a classification framework enabled analysis of how parliamentarians contextualise different types of evidence.
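As an illustration of the detection step, the sketch below matches debate sentences against IHR briefing sentences by cosine similarity over sentence embeddings. The model name, similarity threshold, and function are illustrative assumptions rather than the project's actual pipeline, which is not specified here.

```python
# Minimal sketch of verbatim/near-verbatim reuse detection, assuming both
# corpora are already split into sentences. Model, threshold, and helper
# names are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

def detect_reuse(briefing_sentences, debate_sentences, threshold=0.85):
    """Return (debate_idx, briefing_idx, score) triples for likely reuse."""
    brief_emb = model.encode(briefing_sentences, convert_to_tensor=True)
    debate_emb = model.encode(debate_sentences, convert_to_tensor=True)
    scores = util.cos_sim(debate_emb, brief_emb)  # shape: (n_debate, n_briefing)
    matches = []
    for i in range(scores.shape[0]):
        j = int(scores[i].argmax())
        if float(scores[i][j]) >= threshold:
            matches.append((i, j, float(scores[i][j])))
    return matches
```

Matched sentence pairs of this kind would then feed the citation-mapping step, which traces each briefing sentence back to the source it cites.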
A qualitative examination of parliamentary speech identified seven types of evidence, and each detected instance drawn from an IHR briefing was assigned one of these seven labels. To explore contextualisation, the sentences immediately preceding and following each match were also extracted and classified. These three-sentence sequences were recorded in a new dataset to identify contextualisation patterns.
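A minimal sketch of the contextualisation step is shown below, using a generic zero-shot classifier rather than the study's own classification framework; the seven evidence labels are passed in by the caller and are not reproduced here.

```python
# Minimal sketch: classify a matched sentence together with its immediate
# neighbours (a three-sentence window). The zero-shot pipeline is a stand-in
# assumption for the study's actual classifier.
from transformers import pipeline

classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")

def classify_context(sentences, match_idx, labels):
    """Label the three-sentence window centred on the matched sentence."""
    start, end = max(0, match_idx - 1), min(len(sentences), match_idx + 2)
    window = " ".join(sentences[start:end])
    result = classifier(window, candidate_labels=labels)
    return {"window": window,
            "label": result["labels"][0],
            "score": result["scores"][0]}
```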
Our findings reveal patterns of selective citation, a strong reliance on high-impact publications, and the strategic framing of research. Additionally, metadata analysis identified instances where IHR briefings were relevant but not used.
This research advances metascience by providing a scalable, AI-driven framework to assess research impact in governance and enhance transparency. It informs strategies to increase research visibility and uptake, offering actionable insights for policymakers, researchers, and knowledge brokers.
Accepted Poster
Poster session
Session 1: Tuesday 1 July 2025