Accepted Contribution:

Integrating ethical reflection in AI development: challenges and lessons from developing an AI ethics tool  
Sarah Hladikova (Tufts University), Andreia Martinho (Tufts University), Yuling Wang (Tufts University)

Short abstract:

In our contribution, we share reflections on the crucial role of tacit knowledge in connecting academia and software developers, drawn from a research project that aims to close the gap between research and practice in AI Ethics.

Long abstract:

Our research project aimed to bridge the gap between AI Ethics research and its practical application. To make ethical considerations in AI more accessible, we opted for a web-based platform commonly used by software companies for digital documentation. Developing this tool presented challenges and opportunities to reflect on the pivotal role of tacit knowledge in connecting academia and software developers.

We developed the AI Ethics Tool, a pragmatic framework designed to incorporate ethical considerations into the development and deployment of AI systems. This tool addresses challenges such as bias, unfairness, and lack of transparency in AI systems. It underscores the importance of involving diverse stakeholders and addresses gaps in AI ethics research.

Our research is a step towards responsible AI practices. Throughout the development of the tool, we identified moments in the software development timeline that offered opportunities for intervention, and we provided a relatable yet research-based framework that enables practitioners to engage with the normative challenges of AI development through an accessible and intuitive platform.

Combined Format Open Panel P066
Envisioning ethics – what does it mean to integrate ethical reflection into the early phases of technology development?
Session 2, Tuesday 16 July 2024, -