- Author:
- Tom Morton (Imperial College)
- Format:
- Single slot (20 min) presentation
- Mode:
- Presenting in-person
- Sector:
- Academia
Short Abstract
Having co-produced and evaluated a research seed fund, we will outline how peer evaluation has built a culture of reflection on the ethics of engagement as part of an evidence-based approach. This work offers learning for grant-giving bodies and for those with an interest in participatory methods.
Description
This presentation will share learning from 18 months of work co-producing, piloting, and evaluating Imperial College London’s ‘Collaboration Kickstarter’, an engaged research seed fund that aims to build a broader participatory culture across Imperial by seed funding partnerships between researchers and local communities. Our participatory grant-making method foregrounds the power dynamics rooted in academic institutions by actively embedding a team of community peer evaluators into decision making at every stage of the pilot, ensuring that those traditionally marginalised in academic research have a voice in how its resources are allocated. From co-producing the grant call and evaluation framework to reviewing applications and interpreting evaluation data to produce collaborative learning, we will outline how working with community peer evaluators has been crucial to embedding accountability into an evidence-based approach to delivering the second round of the Collaboration Kickstarter, grounded in critical reflection on the ethics of engaging communities in academic research.
Sharing early insights from our co-authored Evaluation and Impact Report, we will present alongside one of our community peer evaluators, outlining how we co-produced the fund and its evaluation framework before highlighting two emerging lessons from our peer-evaluation process: (1) embedding peer evaluation in participatory grant making can create an environment in which evidence for broader research culture change is collected and used to inform future practice, and (2) the ethical considerations championed by peer evaluators highlight the scale of the challenge such culture change poses, as imbalanced power dynamics reproduce themselves in the pragmatic compromises made in response to the limitations of the current research environment.
This work could offer significant learning to grant-giving bodies as well as to those with an interest in applying participatory methods of evaluation. It provides practice-based insight into how a pathway from evaluation to delivery can be embedded in a programme that runs across multiple rounds, ensuring that evidence is valued and used.