You see, but do you observe? A new research integrity assessment tool for education sciences
Marta Topor
(Linköping University)
Lucija Batinovic
(Linköping University)
Henrik Danielsson
(Linköping University)
Paper Short Abstract
Quality assessment of primary studies is an essential part of evidence synthesis that shapes how the evidence is interpreted. This task is complicated by the exponential growth of research. We present a new checklist for identifying problematic studies and excluding them from evidence synthesis at an early stage.
Paper Abstract
The credibility of findings from evidence syntheses is a growing concern: the increasing volume of retrieved studies, questionable data, and inconsistent findings are slowly overwhelming the publication system and potentially biasing meta-analytic conclusions. We present a newly developed integrity assessment checklist and a pilot study that provides an overview of research integrity in special education research and assesses the feasibility and generalizability of the checklist to other social sciences. The checklist consists of items that evaluate the research integrity of the study and items that assess the business practices of the publisher. More specifically, we evaluate ethics statements, transparency in reporting, originality of the text, plausibility of the findings, and consistency of statistical reporting within the text, as well as the editorial practices of the journal and the business model of its publisher. An expert panel provided feedback and suggestions on the items, after which the checklist was piloted by an end-user panel. Piloting was conducted on studies from a scoping review of educational interventions and later extended to several adjacent social science research areas to evaluate generalizability. We expect the checklist to enable researchers to conduct quick checks of studies and exclude problematic ones before thorough quality appraisal, which in turn will facilitate evidence synthesis and ensure the trustworthiness of included studies.
Accepted Poster
Poster session
Session 1, Tuesday 1 July 2025