- Author:
- Joanne McLean (NatCen Social Research)
- Format:
- Single slot (20 min) presentation
- Mode:
- Presenting in-person
- Sector:
- Nonprofit / charity
Short Abstract
Adaptive evaluation of DBI shaped decisions and improved outcomes. Recommendations tracked via an action log led to changes in referral pathways, training, adaptations for key groups, international roll-out, and inspired further research, including a £1m NIHR-funded study.
Description
This paper explores the pivotal role of evaluation in shaping decisions, influencing stakeholders, and driving better outcomes within the Distress Brief Intervention (DBI) programme—an innovative mental health initiative commissioned by the Scottish Government in 2017. Over eight years, a dynamic collaboration between government, ScotCen Social Research, academic partners, and service providers has produced a rich tapestry of learning, adaptation, and impact.
Evaluation as a Driver of Change and Learning
The original DBI pilot evaluation was designed not only to assess fidelity and impact but also to serve as a catalyst for continuous improvement. Through proactive and regular feedback, the evaluation team established action learning sets and feedback loops at both local and national levels, enabling stakeholders to absorb lessons, share insights, and co-create solutions in real time. In-person gatherings and workshops provided space for direct engagement, facilitating action planning and embedding learning into operational and strategic decisions.
Influencing Stakeholders and Shaping Policy
Stakeholder involvement was central to the evaluation’s success. Service users, practitioners, and management participated in co-creating evaluation tools, ensuring responsiveness to diverse needs and enhancing the relevance of findings. Real-time sharing of learning allowed DBI to adapt rapidly, with mid-evaluation insights informing decisions about national scaling and strategic direction. The participatory ethos fostered trust and ownership, amplifying influence on policy and practice.
The evaluation journey was not without challenges, particularly in translating findings into actionable decisions. These included:
• Contextual Complexity: Variation in local service contexts affected the relevance and feasibility of recommendations. What worked well in one setting did not always translate to another.
• Systemic Constraints: Organisational structures and limited resources sometimes slowed change, requiring persistent advocacy and adaptive strategies.
• Stakeholder Alignment: Achieving consensus among diverse stakeholders demanded ongoing negotiation and flexibility, especially when lessons from failures or unmet needs surfaced.
Learning from What Has Not Worked
The evaluation process provided space to learn from less successful aspects of the intervention. For example, adaptations for younger people (including pilots in schools) and for A&E attendees were prompted by findings that these groups' experiences were less positive, exemplifying how evaluation can highlight gaps and drive targeted improvements.
Adaptive Approaches in Dynamic Contexts
The DBI experience underscores the value of adaptive evaluation approaches in dynamic and uncertain policy areas. Strategies such as real-time feedback, facilitated action planning, and participatory tool development enabled the programme to respond to emerging challenges and opportunities. Evaluation recommendations were implemented through an action log and routine programme management, with evidence of concrete changes including new tests of change, enhanced training, and continuous improvement processes. This ensured that both successes and setbacks were systematically tracked, planned, and monitored. Service user involvement was instrumental in shaping ongoing improvements and supporting further research, including a successful NIHR grant to investigate DBI's impact on suicidal ideation and self-harm.
Conclusion
Adaptive, context-sensitive evaluation shapes decisions, influences stakeholders, and improves outcomes. It provides space to learn from failures and to address barriers to change. Lessons learned highlight the importance of flexibility, stakeholder engagement, and continuous learning in translating evaluation findings into meaningful improvements in policy and practice.