Short Abstract
The cultures and qualities of research, far from being homogeneous, vary markedly between nations, between disciplines, and between types of organisation. This session offers a lively smorgasbord of studies examining the lessons to be drawn from this diversity.
Long Abstract
The central theme of the session is an examination of how cultures and policies interact, in highly context-dependent ways, to induce change in research practice. Rinze Benedictus and Alex Rushforth will present new data on how research assessment can be reformed within institutions shaped by a culture that demands excellence, while Jackie Thompson will discuss how a study of the varied manifestations of responsible conduct of research across disciplines (in terms of methodologies, epistemologies, values and other dimensions) has allowed her and her coworkers to devise a toolkit to guide reform efforts. Amy Devenney and Jess Adams will present insights from analyses of how projects designed to improve research culture through change management approaches have affected various UK HE institutions. Andrew Millar takes a more granular look at the operation of interdisciplinary research centres, asking how quantitative and qualitative information might best be combined to monitor whether such organisational configurations succeed in their missions. Finally, Paul Bramley describes the ripple effects on researchers of journal policies that require data access for editors, which can enhance the detection of problems pre-publication.
This study provides long-term empirical evidence for the slow institutionalization of excellence in Dutch academic health research and its consequences for research integrity and culture. It also demonstrates that research assessment can be reformed in an excellence-shaped research culture.
Long abstract
The excellence notion of quality that deeply affected universities over the past decade has been linked to many research integrity issues. As such, it has been extensively critiqued and analyzed, but related attempts to improve research assessment have received less scholarly attention.
This metascience project addresses that gap by demonstrating both the possibility and the complexity of research assessment reform in a nation-wide, single-discipline case study. The study shows how the excellence notion of research quality became institutionalized in academic health research in The Netherlands over a period of decades.
The study takes a distinctive methodological approach. The researcher worked in one university medical center where the excellence notion became contested and reform of research assessment was initiated. Combining this situated access to internal decision-making with a long-term, macro-level perspective on science policy yields unparalleled insight into the complexities of reform. The study draws on interviews with 50 academic health researchers and policy makers in The Netherlands and on analysis of hundreds of policy and other documents, using concepts from organizational sociology.
It shows how the concepts of rationalization and organizational completeness help explain why university medical centers, radical mergers of medical faculties and academic hospitals, were constructed. These organizations shaped the conditions for knowledge production in unwanted ways, through the Matthew effect, hyper-competition and the reproduction of inequalities. At the same time, the study provides empirical evidence that research institutions can, even after decades of institutionalized excellence, initiate a process of reforming research assessment and improving research culture.
Research Centres often aim to help researchers to interact more broadly. Network metrics offer new evidence of a Centre’s interdisciplinarity. We use local case studies to show how context is critical to this narrative, suggesting that the metrics offer less advantage at institutional/larger scales.
Long abstract
Interdisciplinarity has been a mantra of UK research policy for decades, but it remains difficult to pinpoint. At national scale, interdisciplinary research in part justified linking the sectoral Research Councils into UKRI (>£8bn pa.). Systems Biology is an earlier example, which linked life, computer and physical scientists to model biological systems at the molecular level. This underpinned the current Synthetic/Engineering Biology, which engineers new capabilities in cells. UKRI's funding here includes £10M-scale Research Centres, Mission Hubs, etc. Their success is often reported using a few case studies and lists of research outputs. We tested whether network metrics could provide more systematic evidence for research leaders to report their Centre's interdisciplinarity.
Simple bibliometric measures are biased in several ways, but publication data also offers more complex metrics. We used publication outputs from research centres in Medicine, Social Science and Natural Sciences at the University of Edinburgh to calculate a range of metrics from social network analysis. For the Centre for Systems Biology (CSBE; later SynthSys, and the Centre for Mammalian Synthetic Biology), the resulting network graphs for publication co-authorship and disciplinary focus revealed changing patterns of collaboration over ten years. Betweenness centrality (rising) and network diameter (falling) are promising indicators of funding impact, at both the individual and collective levels. This richer evidence enhances narrative reporting at Centre scale, but it was useful only with both technical interpretation of the unfamiliar metrics and local knowledge from the former Centre Director. Could future tools aggregate such qualitative reports to institutional or larger scales?
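The two metrics highlighted above can be computed directly from a co-authorship graph. A minimal sketch using the networkx library is shown below; the researcher names and edges are hypothetical toy data for illustration, not the Centre's actual publication record:

```python
# Sketch of the network metrics discussed above, on a toy co-authorship graph.
# Assumes the networkx library; node names are hypothetical.
import networkx as nx

# Each edge links two co-authors of at least one joint paper.
G = nx.Graph()
G.add_edges_from([
    ("biologist_A", "modeller_B"),
    ("modeller_B", "physicist_C"),
    ("biologist_A", "biologist_D"),
    ("physicist_C", "engineer_E"),
])

# Betweenness centrality: how often a researcher lies on shortest paths
# between others. Rising values suggest individuals brokering between
# disciplinary groups.
bc = nx.betweenness_centrality(G)

# Network diameter: the longest shortest path between any two researchers.
# A falling diameter suggests the collaboration network is tightening.
diam = nx.diameter(G)
```

In practice such a graph would be rebuilt per year from the Centre's publication records, so that trends in these metrics (rather than single snapshots) can be read against the funding timeline.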
We descriptively illustrate how best practice manifests differently across disciplines, through a recent mixed-methods study in which 60 experts across varied disciplines rated the importance of 38 dimensions of 'responsible conduct of research'. We further discuss interpretations and implications.
Long abstract
Metaresearch is often siloed by different models of knowledge generation, separated by barriers of differing jargon and assumptions. To be effective and as widely applicable as possible, our field needs to understand how key elements of the research ecosystem it studies (e.g., research disciplines, methodologies and epistemologies) vary. Which aspects of research practice are universal, goals that all research should aim for, and which are more niche: important in one field but irrelevant in others?
Accordingly, we conducted a mixed-methods study mapping how Responsible Conduct of Research (RCR) manifests differently across disciplines. We first identified over 30 possible dimensions of RCR through a scoping review of relevant literature and thematic analysis of interviews with experts. We then conducted a three-round Delphi process bringing together 60 RCR experts from varied disciplines of academic research (including arts/humanities, social sciences, life/health sciences, and physical/mathematical sciences), who made and refined quantitative judgements of the importance of each dimension in their discipline, informed by qualitative feedback from other panellists. Our findings (see https://osf.io/8ntex) richly illustrate which dimensions of RCR are considered more universal and which vary widely across disciplines. These divides split along lines of methodologies, epistemologies, values, and more.
To create actionable insights from our findings, we have developed a toolkit of interactive visualisations, exercises and reports to help relevant audiences (e.g., leadership and research support staff at HEIs, and other sector stakeholders such as funders, publishers, and policymakers) better understand this landscape, informing how they deploy guidance, reforms and interventions.
Clinical trial patient-level data is rarely made available. A journal policy requiring data access for editors when a paper is accepted has demonstrated high levels of trial data provision, evidence that it prompts some authors to re-check their data, and an ability to detect problems pre-publication.
Long abstract
With ongoing concerns about the replicability of published scientific research, there is interest in making study documents, including individual patient data, available for review. This is of particular interest for clinical randomised controlled trials, because of their high costs and long duration. However, efforts to make data available post-publication have had mixed results: data statements, even when present, do not guarantee that data will be provided.
As a result of ongoing concerns about trial integrity in the anaesthetic literature, the journal Anaesthesia has, since 2019, had a policy of requesting individual patient data for all trials accepted for publication. This has allowed an editor to reproduce trial findings and to investigate for signs of failures of research integrity; both checks are now routine.
Authors have provided anonymised participant data to editors in more than 87% of cases, higher than other studies have shown for data requests made post-publication. Additionally, when asked to provide data, 9% of authors find errors in their own results. Most of the research integrity failures we detected involved false data, and many could be identified only with individual patient data.
Data request at the point of article submission or acceptance may represent a model for how failures of research integrity could be detected and (in some cases) ameliorated, since at this stage incentives appear better aligned: authors would like to be published, and journals would like to avoid retractions. However, limited resources and expertise may prevent this from being widely undertaken.
This paper will focus on how research culture projects at several UK HEIs are working with different change theories.
Long abstract
Change is an integral part of metascience: as it 'explodes into the mainstream' and 'new initiatives and alliances are set up', institutions are required to rethink, adapt and redesign the systems, infrastructure and practice of research. Institutions face multiple and substantial challenges as they seek to foster environments which are fair, inclusive and supportive through initiatives to improve culture. Although what we want to change is often easily articulated, how we effect change is often more difficult. Leadership-driven change directives often gain no traction, while bottom-up change can struggle to exert sufficient influence. How can we draw on change theories to successfully implement metascience initiatives and change research cultures?
This panel will focus on how funders and funded research culture improvement projects at several UK HEIs are working with different change management approaches to overcome the challenges of change at the sector level, within large organisations and across multidisciplinary teams. Our proposed paper will feature colleagues working directly on funded projects at several UK institutions, as well as funders, who are grappling with how to create change across the research ecosystem. It will cover projects working to: enhance leadership capacity to build safe and inclusive research environments; review fixed-term research contracts and pilot new employment models to reduce precarity; improve institutional decision-making by facilitating evidence-informed practice, to promote research excellence; and build a body of evidence to support and inform sector initiatives to foster equitable, diverse and supportive research environments.
Accepted papers
Session 1: Monday 30 June 2025