
Accepted Paper:

ON THE SEX OF ALGORITHMS: EXPLAINABILITY, RELIABILITY AND NEW RESISTANCES ON THE WEB  
Eduardo Iglesias (Universidad Alfonso X El Sabio)

Paper short abstract:

What do AI algorithms do in unfair situations? We search here for the resistance movements that question the neutrality of AI system development, and for their emic approach to common life in the new demiurge known as society 4.0, with a toolbox in pursuit of hope and an unbiased present.

Paper long abstract:

The intangible space of relation and production in machine-person interaction constructs civic life so intensely that the regulatory institutions of the public sphere, noticing present impacts and sensing future ones, are forcibly constructing the normative field of the digital sphere. Within these frameworks, interesting debates arise about the limitations of the field itself, directed by the ethics of procedures, reliability, applicability and the possibility of verifying results, as well as the explainability of a technology characterised by a kind of obscurantism, whether owing to commercial patents or to more sceptical positions about the different models by which executive algorithms learn.

Although the study of these new intelligences focuses mainly on the field of the Internet of Things, we present here a work proposal that questions the development of methodologies that worked before the pandemic era and that, in the face of these new technological horizons and on the basis of the discourses circulating in the field, questions the neutrality of these technologies. These critical currents within the circulating discourses are known as XAI (eXplainable Artificial Intelligence) and propose the study of bias in the automation of decisions, which is currently creating less just societies by reproducing existing biases: discriminatory situations replicated at the hand of administrative machines, represented here as Artificial Intelligence algorithms.

Panel P027
Imagining alternative data futures
  Session 1 Friday 29 July, 2022, -