- Authors:
- Denis Donoghue (Skills Development Scotland)
- Rebecca McCartan (Skills Development Scotland)
- Format:
- Single slot (20 min) presentation
- Mode:
- Presenting in-person
- Sector:
- Government or public sector
Short Abstract
Skills Development Scotland’s Apprentice Voice approach gathers real-time feedback from apprentices. This presentation covers how we developed a provider benchmarking tool using this data to inform and improve service delivery in work-based learning, demonstrating how we turn evidence into action.
Description
Skills Development Scotland (SDS) funds and manages Scotland’s Modern Apprenticeship programme, providing work-based learning for school leavers and employees across Scotland. Around 25,000 Modern Apprentices (MAs) start annually, supported by 235 learning providers, including Further Education colleges and specialist training providers. MAs are available in over 80 frameworks, including construction, financial services, and healthcare.
In 2022, the SDS Evaluation and Research Team launched Apprentice Voice (AV): an automated, rolling approach to gathering feedback from apprentices that reports in real time. This approach replaced multiple previous set-piece surveys with automatic distribution of questionnaires through our management information system. It gathers feedback at three stages:
• In-training (six months after starting)
• Leaver (three months after finishing)
• Follow-up (15 months after finishing)
AV has subsequently gathered more than 50,000 responses. The findings are used to monitor impact, manage performance and communicate outcomes by sector, geography and equalities.
In 2024, the AV Working Group began to explore using the data to support quality management and improvement among providers delivering apprenticeships. While SDS already delivered a high level of support, compliance checking and quality monitoring for providers, there was no consistent input from apprentices about their experience. We developed a benchmarking tool to:
• support continuous improvement;
• allow providers to benchmark against their peers and track performance;
• identify providers in need of support.
In this session we will outline how we identified the most appropriate indicators to use in the benchmarking tool. We will describe how we went from a shortlist of ten measures to a final selection of six that reflect the overall apprentice journey, from induction through to post-training perspectives.
We will outline how the tool identifies the mean point score for each measure by framework for:
• the provider;
• all respondents within the framework;
• the best-performing provider.
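The three comparison points listed above can be sketched in code. The following is a minimal illustration only; the data, field names and function here are assumptions for the example, not SDS's actual implementation:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: (provider, framework, measure, point score).
responses = [
    ("Provider A", "Construction", "induction", 4),
    ("Provider A", "Construction", "induction", 5),
    ("Provider B", "Construction", "induction", 3),
    ("Provider B", "Construction", "induction", 4),
]

def benchmark(responses, framework, measure, provider):
    """Return the three comparison points for one measure within a framework:
    the named provider's mean point score, the mean across all respondents
    in the framework, and the best-performing provider's mean."""
    by_provider = defaultdict(list)
    all_scores = []
    for prov, fw, m, score in responses:
        if fw == framework and m == measure:
            by_provider[prov].append(score)
            all_scores.append(score)
    provider_means = {p: mean(s) for p, s in by_provider.items()}
    return {
        "provider": provider_means.get(provider),
        "framework_mean": mean(all_scores),
        "best_provider": max(provider_means.values()),
    }

result = benchmark(responses, "Construction", "induction", "Provider B")
# result == {"provider": 3.5, "framework_mean": 4, "best_provider": 4.5}
```

In practice these three values per measure form the axes of the radar chart, so a provider can see at a glance where it sits between the framework average and the best performer.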
We will also outline how these scores are shown on a radar chart to provide a visual comparison, and outline the reporting, which comprises:
• An interactive Power BI report for internal use by Skills Investment Advisers (SIAs)
• One-page reports summarising perception data, shared with each qualifying provider as part of regular reporting. The one-page report also contains contextual information on the overall MA numbers in-training and response rates to the survey.
We will reflect on how the approach was piloted by SIAs working with five providers covering 12 frameworks in October 2025. We will discuss how the piloting demonstrated the value of the reports in helping providers to identify areas of strength and aspects they can improve, and how it also helped us make final adjustments to the tool and guidance materials before rolling the approach out to all providers.
The key takeaways will include:
• How the benchmarking Power BI report provides insight into apprentice perceptions of training provider performance, which the Quality Delivery Team uses to support continuous improvement.
• How the individual provider reports enable Skills Investment Advisers to provide tailored support to improve programme delivery.
• Our plans to use the approach to identify and share good practice.