Good Enough Explanations: How Can Local Publics Understand and Explain Civic Predictive Systems?

Author(s)
Gupta, Shubhangi
Abstract
How can the Explainable Artificial Intelligence (XAI) community support public understanding of the spatial workings and effects of civic predictive systems? Civic processes in the urban smart city are increasingly governed by automated predictive systems that use machine learning models. Despite their widespread use in everyday domains such as education, policing, social services, and economic investment, these systems remain invisible and inaccessible to the local publics who bear the burden of their effects. XAI and Artificial Intelligence (AI) transparency researchers are increasingly calling for the development of public-centered AI explanations. However, in the context of civic AI, existing techniques fall short in (1) how they understand the consumers and creators of explanations, (2) how they explain the socio-technical assemblages that give rise to AI systems, (3) how they design interactions to create and deliver explanations, and (4) how they conceptualize explanation goals in relation to public action. This dissertation engages in qualitative, participatory, and design-based research to introduce the concept of ‘good enough explanations’ in response to these challenges. Good enough explanations may not be complete or universal. Instead, as this dissertation formulates, such explanations consist of ongoing processes that allow diverse publics to partially engage with features of predictive systems and assess those systems in relation to their communities. This dissertation (1) theorizes the qualities underlying good enough explanations, (2) develops such explanations with diverse publics, and (3) suggests theories and strategies to guide the development of systems for good enough explaining. Ultimately, this dissertation hopes to serve as a guide for XAI researchers, civic organizations, and policymakers as they work together to engage publics in the democratic oversight, assessment, and regulation of civic AI systems.
Date
2024-07-24
Resource Type
Text
Resource Subtype
Dissertation