Accepted Paper:
Paper short abstract:
This talk will discuss what goes unsaid in conversations around teaching for Responsible AI in academia. Addressing these silenced topics, though likely less palatable to technologists, is a necessary next step in ensuring the field of AI behaves responsibly.
Paper long abstract:
Responsible AI development is becoming an increasing priority for governments, industry, and research institutions, with degree and professional development programs starting to include elements of Safety, Responsibility or Ethics. But these conversations tend towards some topics more than others.
Concerns framed as engineering problems, such as verification and security, are perfectly palatable to technical scholars, since they are more problems to be solved with more technology, and they are readily added to curricula. Topics that force engineers to consider what their creations will be used for, or by whom, are avoided, and any discussion that might frame technology as at all impotent, such as where the solution requires consulting with and respecting the expertise of other disciplines, is engaged with only sparingly.
Against a backdrop of practical skills development that encourages the continuous creation of new projects, new features, and more complexity, students are left unprepared to criticize or meaningfully refuse development, or to make space to listen to others attempting the same.
For a truly Responsible field of AI to develop, our education programs need to be joined up and in conversation with non-technical experts, able to understand each other and to incorporate our translated expertise into development practices. Our students need to be empowered to be hesitant, to be critical, and to listen, and the educators teaching them must act as exemplars of this difficult but necessary practice.
The origins and technological evolutions of silence
Session 1 Thursday 18 July, 2024, -