T0162


From complexity to clarity: making economic evaluation work for local mental health funding 
Contributors:
Helen Butler (Mind)
Anita Fernandes (Mind - the mental health charity)
Format:
Pecha Kucha
Mode:
Presenting online
Sector:
Nonprofit / charity

Short Abstract

An economic evaluation of Mind’s Supported Self-Help programme, led by Ipsos and supported by Mind’s Research & Evaluation team, explored cost-effectiveness and how complex evidence can be translated into actionable insights for local funding and commissioning.

Description

Mind’s Supported Self-Help programme is a six-week guided intervention designed to support people experiencing common mental health problems. Delivered by 46 local Minds, the programme was evaluated by Ipsos through an economic study, with strategic and operational support from Mind’s Research & Evaluation team.

Ipsos led the design and delivery of the evaluation, applying economic modelling and quantitative analysis to assess outcomes and value for money. Mind’s Research & Evaluation team played a key role in shaping the evaluation scope, liaising with delivery partners, and ensuring the findings were relevant and usable for local funding and commissioning conversations.

The evaluation faced significant challenges, particularly around data quality and consistency across delivery sites. These issues reflected broader systemic barriers to embedding robust evaluation practices in federated service models. Despite this, the evaluation produced valuable insights into the programme’s outcomes and cost-effectiveness, contributing to a growing evidence base for scalable, low-intensity mental health support.

A key objective was to ensure the findings were usable for local decision-makers. This required translating complex, jargon-heavy economic analysis into accessible, actionable messages for non-technical audiences. Mind’s team worked closely with Ipsos and local Minds to understand evidence needs and adapt dissemination strategies accordingly. This included framing findings in ways that supported local advocacy and sustainability.

This session will share lessons learned from conducting a complex evaluation across a large delivery network, including:

• Approaches to managing data quality and variation.

• Strategies for making economic evidence usable in local policy contexts.

• Reflections on balancing methodological rigour with practical utility.

It will also explore how evaluation can be positioned as a tool for influencing programme sustainability and funding, particularly in the mental health sector where services often face resource constraints and high demand.