- Contributors:
- Oliver Hamer (Edge Hill University)
- Michelle Howarth (PerCIE (Personalised Care Interprofessional Network))
- Jade Thomson (Edge Hill University)
- Format:
- Poster
- Mode:
- Presenting in-person
- Sector:
- Academia
Short Abstract
This abstract presents an evaluation of Barnardo’s social prescribing service (Cumbria LINK) and how it prompted key refinements in programme delivery. The findings highlight how a learning partnership approach, including relational practice and iterative feedback loops, helped refine the programme.
Description
The Barnardo’s LINK social prescribing programme was established in the Northwest of England to address a critical gap in support for children and young people (CYP) experiencing social, emotional, and mental health difficulties that fell below clinical thresholds. Designed within a social prescribing model, the programme positioned social support and community engagement as therapeutic mechanisms to improve wellbeing.
The programme was independently evaluated by Edge Hill University over three years, generating evidence that measured impact and refined programme delivery. The evaluation adopted a learning partnership approach (alongside a process and impact evaluation) that embedded iterative reflection between researchers and LINK practitioners. This design transformed the role of evaluation from a retrospective assessment into a live process of co-inquiry and service improvement. Through multiple methods (e.g., qualitative interviews, outcome monitoring, and development of a Theory of Change), the evaluation gathered insights into how relational practice and adaptability were key drivers of success. In turn, these insights catalysed a series of refinements to programme delivery and system integration (particularly in health, education and social care).
The evaluation’s impact was evident in how the learning partnership refined programme delivery. Evaluation findings that highlighted inconsistent referral pathways were used to prompt the development of clearer guidance to help families better understand the offer. At the organisational level, evaluation insights informed a shift from an initial pilot to a mature, system-embedded programme, encouraging efforts to improve visibility across systems such as social care. Findings from evaluation activities also highlighted practical challenges faced by families (e.g., difficulties with transport), which led to programme refinements aimed at making the service easier to access. In addition, the evaluation team identified key data inefficiencies within the programme’s existing monitoring systems, highlighting gaps that limited the ability to evidence outcomes. These insights reshaped data collection processes to capture more meaningful evidence, strengthening the programme’s capacity to demonstrate robust impact to commissioners.
The evaluation process also revealed barriers that hindered the translation of evaluation into decision-making (e.g., workforce capacity pressures and constraints in funding), whilst also highlighting how an adaptive, learning partnership-based approach could overcome these hurdles. By maintaining regular dialogue between researchers and LINK practitioners, evaluation findings were mobilised in real time, promoting a culture of reflection and shared ownership of change. Ultimately, the evaluation of the programme did not simply describe what worked or outline its impact; it became a crucial mechanism for programme refinement. This case illustrates how evaluation, when relational and adaptive, can bridge the gap between evidence and action, not as an endpoint but as an evolving process of learning.