Accepted Paper

Evaluating the Public Sector Decarbonisation Scheme – Lessons for Policy and Programme Adaptation  
Michelle Hollier (GC Insight) and Vivien Kizilcec (Department for Energy Security and Net Zero)


Paper short abstract

Evaluation of the UK Public Sector Decarbonisation Scheme shows how evidence shaped policy and delivery, using mixed methods and adaptive feedback loops. Findings highlight barriers, context and strategies for influencing decisions in decarbonisation programmes.

Paper long abstract

This session explores how evaluation shapes real-world policy decisions, influences stakeholders and drives better outcomes. To demonstrate the impact of evaluation on programme change, we will use the Public Sector Decarbonisation Scheme (PSDS) as a case study: a multibillion-pound initiative supporting the aim of the Department for Energy Security and Net Zero (DESNZ) to reduce emissions from public sector buildings by 75% by 2037.

Now in its fourth phase, PSDS has evolved significantly, informed by multiple rounds of evaluation. This session will present key findings from the process, impact and economic evaluation of Phases 1 and 3, which used a mixed-methods approach including surveys, interviews, focus groups, quasi-experimental assessment and value-for-money analysis. We will highlight how our evaluation has directly informed scheme redesign by enabling productive collaboration with policy teams.

PSDS’s role in accelerating decarbonisation across the public sector can be seen in the results of our evaluation. Notably, 75% of Phase 3 grant recipients surveyed said they would not have implemented any of the funded actions within the same timeframe without PSDS, and 79% reported that they would not have expected to take action within the next three years.

We collected regular feedback through interim reporting and stakeholder workshops, creating an adaptive feedback loop that informed adjustments to the implementation of the next phase. Our evaluation findings highlight that, overall, the changes implemented were welcomed by recipients and seen as improving fairness. For example, following feedback from some grant recipients that the first-come, first-served application system prioritised application speed over a project’s carbon impact, the approach was changed to a targeted allocation system. Similarly, the introduction of multi-year and planning-year applications addressed barriers to delivering complex installations within a single financial year, allowing grant recipients more preparation time and enabling a smoother delivery process.

Lessons-learnt sessions were vital, enabling both internal and external stakeholders to identify issues and areas for improvement. For instance, the policy team regularly reviewed and updated the guidance documents based on grant recipients’ feedback to improve clarity and encourage a higher volume of quality applications. The process of collecting monitoring data was also refined to gather additional building information and reduce data input errors, thereby improving the analysis of the scheme. To achieve this, we engaged stakeholders from the policy and analysis teams, as well as the PSDS’s delivery partner, to better understand their priorities and redesign the application form accordingly.

By identifying barriers and enablers, our evaluation informed iterative improvements to PSDS, strengthened collaboration between evaluators and policymakers, and enhanced delivery for grant recipients. This session therefore offers practical insights into how evaluation can influence programme change and contribute to continuous learning and policy innovation.

Abstract PK3
Pecha Kucha 3
Session 1: Thursday 21 May 2026