T0112


Closing the Evidence-Action Gap: Iterative Evaluation in a SEL Sports Program for Children 
Contributors:
Pedro Lopez (Fundación Luksic)
Silvana Lorenzini (Fundación Luksic)
Claudia Chiang (Fundación Luksic)
Marizza Espinoza (Fundación Luksic)
Format:
Poster
Mode:
Presenting in-person
Sector:
Nonprofit / charity

Short Abstract

This session shows how an internal evaluation model delivers timely, iterative insights to improve an SEL-based sports program for children, using collaborative sensemaking and tailored communication. It highlights ways to reach diverse audiences and make findings truly actionable.

Description

At Fundación Luksic, the Evaluation Department has developed an internal evaluation approach inspired by utilisation-focused principles, aiming to produce insights that are practical, timely, and genuinely useful for program improvement. Our evaluation cycle includes design, implementation, results, and—when feasible—impact assessments, each offering evidence at key stages of a program’s development. These evaluations are conducted collaboratively with the Foundation’s implementation teams, fostering shared learning and strengthening the translation of evidence into concrete improvements.

We draw on the case of a sports program grounded in Socio-Emotional Learning (SEL) that has been running for the past two years to illustrate how an internal evaluation model can generate early and iterative insights, enabling informed decision-making.

The program seeks to nurture social and emotional skills in children aged 6 to 13 through formative after-school sports workshops inspired by the SEL framework, complemented by a positive parenting program for caregivers.

Since the program’s inception, the Evaluation Department has supported delivery through a series of implementation and results evaluations. During the first two years, these assessments informed several important design and delivery decisions. Data collection combined multiple methods—interviews, direct observation, surveys, and analysis of administrative records—applied across different actors, including sports instructors, caregivers, children, and program staff. A particular challenge has been developing participatory methods adapted for children.

Evaluating the sports program has also offered opportunities to refine how we communicate findings and adapt them to different audiences. Internally, results are shared through workshop-style sessions that encourage implementers to reflect on the evidence, discuss its implications, and agree on the most relevant and feasible areas for improvement. Externally, insights from the evaluation—particularly those gathered from work with participating children in 2024—were presented to other NGOs working in child development and sports as part of the 2025 Evaluation Week organized by the Global Evaluation Initiative. This experience broadened our reach and highlighted the value of clear reporting, visual storytelling, and strategic framing in making evaluation findings accessible and actionable.