- Format:
- Pecha Kucha
- Mode:
- Presenting in-person
- Location:
- Room 2
- Sessions:
- Thursday 21 May, -
Time zone: Europe/London
Accepted papers
Session 1: Thursday 21 May, 2026, -
Paper short abstract
A qualitative rapid evaluation of a rent subsidy revealed uneven beneficiary outcomes. Subsequent subsidy modelling informed by findings shaped reforms proposed in the 2026 budget, showing how adaptive, timely evaluation can drive substantive policy change.
Paper long abstract
Adequate housing is recognised as a fundamental human right (OHCHR, 2009), yet in Malta, rising rental costs and limited regulatory safeguards have intensified affordability pressures since 2013 (Micallef, 2021). In response, the Housing Authority introduced the Housing Benefit Scheme (HBS), a rent-subsidy programme designed to support low-income households (Malta Housing Authority, 2019). This presentation shows how a rapid qualitative evaluation of the HBS directly shaped subsidy reform that is currently being appraised by the relevant ministry for adoption in the forthcoming 2026 national budget.
An evaluation of the HBS was carried out by an external consultant, who used a logic model to identify causal links between inputs and outcomes. Using a rapid qualitative assessment approach, data were gathered from six beneficiaries representing various household types and analysed thematically. Despite its small scale, the evaluation revealed stark disparities: beneficiaries facing severe structural disadvantages remained unable to achieve housing security, while others reported stability and even aspirations to purchase property. These findings underscored that a “one-size-fits-all” approach was insufficient to address heterogeneous needs, even within the same household category.
Building on the qualitative insights, a subsequent quantitative modelling exercise using administrative data from beneficiaries was carried out. Multiple subsidy scenarios were developed to better reflect market conditions, align with the needs of diverse household types, and identify cost-effective solutions. Through an iterative process with ministries and the Housing Authority, one scenario was adopted, and the revised methodology is being appraised for the 2026 National Budget.
The case illustrates how adaptive evaluation approaches—combining timely qualitative insights with targeted modelling—can influence high-level policy decisions under pressing conditions. It also highlights practical lessons: the benefits and trade-offs of outsourcing evaluations, the balance between methodological rigour and timeliness, and the role of effective communication tools (e.g., visuals, scenario modelling) in securing stakeholder buy-in.
This contribution demonstrates that even modest, rapid evaluations can catalyse substantive policy change when strategically embedded in adaptive, utilisation-focused processes. It offers transferable insights for evaluators working in dynamic policy environments where evidence must be timely, credible, and actionable.
References:
Malta Housing Authority. (2019). Housing Benefit on Privately Rented Dwellings. https://housingauthority.gov.mt/wp-content/uploads/2023/12/HBS-Conditions-English-2024.pdf
Micallef, B. (2021). The Long-Lasting Legacy of Rent Controls: Perspectives on the Private Rental Market in Malta within the Context of a Dual Market. International Journal of Real Estate Studies, 15(2), 43-54.
OHCHR. (2009). The right to adequate housing. Office of the United Nations High Commissioner for Human Rights. Geneva: United Nations Office at Geneva. https://www.ohchr.org/sites/default/files/Documents/Publications/FS21_rev_1_Housing_en.pdf
Paper short abstract
A presentation on how Choice Care and Akerlof collaborated on an evaluation of the impact of care on residents, families and staff, and in doing so made the case for a move towards outcomes-led care commissioning.
Paper long abstract
Adult social care in the UK contributes over £55 billion in gross value added and supports 1.8 million jobs, yet it is still widely viewed as a drain on public finances rather than a driver of social and economic wellbeing. This perception limits opportunities to improve the quality of care, in particular for people with complex mental health needs and learning disabilities, where the absence of meaningful data on wellbeing and outcomes has constrained how commissioners and regulators define “good care.”
This presentation explores how Choice Care and Akerlof partnered to challenge this issue through evaluation. Working together for the past two years, we have developed The Real Value of Care (2025), a Social Return on Investment (SROI) study that evidences £2.48 of social value for every £1 invested. More importantly, it provides a framework for how lived experience can shape care and commissioning practices by bringing into consideration the voices of those whom we have historically failed to engage.
In our first year working together, we engaged 61 people – colleagues, family, residents and a therapist. This enabled us to understand the impact of care from multiple perspectives. Recognising that some residents could not engage directly, we used proxy voices to triangulate feedback. Moving into our second year of working together, we have developed inclusive communication tools to capture more of the authentic experiences of residents experiencing care. This has enabled us to work with a wider range of people. Over the next three years, we will continue developing tools to ensure more voices are heard. Working with vulnerable people requires care: the questions, methods, and environment must help residents feel safe contributing now and in the future.
Methodologically, the study combined qualitative interviews and surveys with monetised wellbeing outcomes using the Social Value Engine, Measure Up, and Green Book–aligned SROI modelling. Across three homes, we identified consistent outcomes valued most by colleagues and residents—feeling safe, living fulfilled lives, and being supported—and translated these into measurable social value. The additional wellbeing impact equated to £175 million across the organisation, or £84,000 per resident and £46,000 per colleague. Importantly, the process itself strengthened trust and inclusion, showing evaluation can be an intervention as well as a measurement tool.
The impact has been both organisational and systemic. Within Choice Care, findings have informed workforce development, management approaches, wellbeing initiatives, and annual impact reporting. Externally, the evidence has strengthened dialogue with commissioners and regulators, shifting conversations from compliance and cost to value and outcomes. Some local authorities are already beginning to adopt outcome-based commissioning and quality frameworks.
For evaluators, this work offers a replicable approach to influencing programme and policy change by connecting lived experience and wellbeing evaluation to social value and economic argument. As the partnership develops, Choice Care and Akerlof continue refining inclusive, repeatable frameworks linking experience to outcomes. We invite UK Evaluation Society Conference attendees to reflect on how bringing marginalised voices to the fore can lead to meaningful policy and practice change.
Paper short abstract
Evaluation of the UK Public Sector Decarbonisation Scheme shows how evidence shaped policy and delivery, using mixed methods and adaptive feedback loops. Findings highlight barriers, context and strategies for influencing decisions in decarbonisation programmes.
Paper long abstract
This session explores how evaluation shapes real-world policy decisions, influences stakeholders and drives better outcomes. To demonstrate the impact of evaluation on programme change, we will use the Public Sector Decarbonisation Scheme (PSDS) as a case study: a multibillion-pound initiative supporting DESNZ’s aim of reducing emissions from public sector buildings by 75% by 2037.
Now in its fourth phase, PSDS has evolved significantly, supported by multiple rounds of evaluation. This session will present key findings from the process, impact and economic evaluation of Phases 1 and 3, which used a mixed-methods approach including surveys, interviews, focus groups, quasi-experimental assessment and value-for-money analysis. We will highlight how our evaluation has directly informed scheme redesign by enabling productive collaboration with policy teams.
PSDS’s role in accelerating decarbonisation across the public sector can be seen in the results of our evaluation. Notably, 75% of Phase 3 grant recipients surveyed said they would not have implemented any of the funded actions within the same timeframe without PSDS, and 79% reported they would not have expected to take action within the next three years.
We collected regular feedback through interim reporting and stakeholder workshops, building an adaptive approach and informing adjustments for implementation of the next phase. Our evaluation findings highlight that overall, the changes implemented were welcomed by recipients and seen as improving fairness. For example, following feedback from some grant recipients that the first-come first-served application system prioritised application speed over the project’s carbon impact, the approach was changed to a targeted allocation system. Similarly, the introduction of multi-year and planning-year applications addressed barriers to delivering complex installations within a single financial year, allowing grant recipients more preparation time to enable a smoother delivery process.
Lessons-learnt sessions were vital, enabling both internal and external stakeholders to identify issues and areas for improvement. For instance, the policy team regularly reviewed and updated the guidance documents based on grant recipients’ feedback to improve clarity and encourage a higher volume of quality applications. The process of collecting monitoring data was also refined to gather additional building information and reduce data input errors, thereby improving the analysis of the scheme. To achieve this, we engaged stakeholders from the policy and analysis teams, as well as the PSDS’s delivery partner, to better understand their priorities and redesign the application form accordingly.
By identifying barriers and enablers, our evaluation informed iterative improvements to PSDS, strengthened collaboration between evaluators and policymakers, and enhanced delivery for grant recipients. This session therefore offers practical insights into how evaluation can influence programme change and contribute to continuous learning and policy innovation.