Accepted Paper

Can network metrics show that your Interdisciplinary Research Centre is succeeding?  
Andrew Millar (University of Edinburgh), Haya Deeb (University of Edinburgh), Daniel Thedie (University of Edinburgh), Rodrigo Liscovsky Barrera (University of Edinburgh - STIS), Niki Vermeulen (University of Edinburgh)

Short abstract

Research Centres often aim to help researchers to interact more broadly. Network metrics offer new evidence of a Centre’s interdisciplinarity. We use local case studies to show how context is critical to this narrative, suggesting that the metrics offer less advantage at institutional/larger scales.

Long abstract

Interdisciplinarity has been a mantra of UK research policy for decades, but it remains difficult to pinpoint. At the national scale, interdisciplinary research in part justified linking the sectoral Research Councils into UKRI (>£8bn pa.). Systems Biology is an earlier example, which linked life, computer and physical scientists to model biological systems at the molecular level. This underpinned the current field of Synthetic/Engineering Biology, which engineers new capabilities in cells. UKRI's funding here includes £10M-scale Research Centres, Mission Hubs, etc. Their success is often reported using a few case studies and lists of research outputs. We tested whether network metrics could provide more systematic evidence for research leaders to report their Centre's interdisciplinarity.

Simple bibliometric measures are biased in several ways, but publication data also offers more complex metrics. We used publication outputs from research centres in Medicine, Social Science and the natural sciences at the University of Edinburgh to calculate a range of metrics from social network analysis. For the Centre for Systems Biology (CSBE; later SynthSys, and the Centre for Mammalian Synthetic Biology), the resulting network graphs for publication co-authorship and disciplinary focus revealed changing patterns of collaboration over ten years. Betweenness centrality (rising) and network diameter (falling) are promising indicators of funding impact, at the individual and collective levels. This richer evidence enhances narrative reporting at Centre scale, but it was only useful when combined with technical interpretation of the unfamiliar metrics and local knowledge from the former Centre Director. Could future tools aggregate such qualitative reports to institutional/larger scales?
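The abstract does not specify the software used; the following is a minimal sketch, assuming Python with the networkx library and invented author names, of how the two metrics it highlights (betweenness centrality and network diameter) can be computed from a co-authorship network.

```python
# Sketch only: hypothetical publication data, not the authors' actual pipeline.
import itertools
import networkx as nx

# Each paper is represented by its list of authors (illustrative names).
papers = [
    ["Author A", "Author B", "Author C"],
    ["Author B", "Author D"],
    ["Author C", "Author D", "Author E"],
]

# Build an undirected co-authorship graph: one node per author,
# one edge per pair of authors who share at least one paper.
G = nx.Graph()
for authors in papers:
    G.add_edges_from(itertools.combinations(authors, 2))

# Betweenness centrality: how often an author lies on shortest paths
# between other authors, a proxy for brokerage between groups.
betweenness = nx.betweenness_centrality(G)

# Network diameter: the longest shortest path, computed on the largest
# connected component because diameter is undefined for disconnected graphs.
largest_cc = G.subgraph(max(nx.connected_components(G), key=len))
print("Betweenness centrality:", betweenness)
print("Diameter of largest component:", nx.diameter(largest_cc))
```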

Panel T2.6
Research cultures and research qualities
  Session 1, Monday 30 June 2025