Accepted Paper
Paper short abstract
The paper analyses the efforts of the Coalition for Content Provenance and Authenticity (an initiative for provenance and authenticity standards) to identify a distinct mode of infrastructural governance in the context of generative AI, enabled by big tech’s “convening power”.
Paper long abstract
Provenance is commonly approached as a technical matter of traceability, documentation, or metadata (Borgman; Kale et al., 2023), through which origin, authorship, and transformation can be rendered stable and intelligible. Contemporary machine learning systems complicate this assumption. Outputs emerge through layered processes of training, optimization, and probabilistic generation whose relations to prior materials cannot be fully captured by established distinctions among copying, derivation, and independent creation (Amoore, 2020; Guzman & Lewis, 2024; Usher, 2025). Rather than treating provenance as something “to be found”, or its stabilization in AI environments as a technical puzzle “to be solved”, this paper examines provenance as a site of sociotechnical construction and contestation. Paying empirical attention to the efforts of the Coalition for Content Provenance and Authenticity (C2PA), a global initiative for provenance and authenticity standards, we identify a distinct mode of infrastructural governance enabled by big tech’s “convening power” (Van Der Vlist, Helmond, & Ferrari, 2024). Specifically, we demonstrate how AI actors organize provenance, who assembles to stabilize it, and what kinds of power operate through this assembly. In doing so, we show how provenance and authenticity standards are leveraged both to allow big tech to demonstrate commitments to “responsible AI”, thereby preempting regulatory intervention, and to build an infrastructure through which their preexisting dominance is reinforced. Hence, the paper addresses concerns at the core of critical AI scholarship: How is power infrastructurally embedded and exercised in AI contexts? And, more specifically, how do sociotechnological, economic, and legal arrangements reinforce power asymmetry?
A field in formation: What do we mean by ‘critical’ and ‘AI’ in Critical AI Studies?
Session 1