Conference Papers, Year: 2020

Identifying the "Right" Level of Explanation in a Given Situation

A method for identifying the optimal level of AI explanation in a given situation

Abstract

We present a framework for defining the "right" level of AI explainability based on technical, legal, and economic considerations. Our approach involves three logical steps. First, define the main contextual factors: who the audience of the explanation is, the operational context, the level of harm the system could cause, and the legal/regulatory framework. This step characterizes the operational and legal needs for explanation, and the corresponding social benefits. Second, examine the technical tools available, including post-hoc approaches (input perturbation, saliency maps, etc.) and hybrid AI approaches. Third, as a function of the first two steps, choose the right levels of global and local explanation outputs, taking into account the costs involved. We identify seven kinds of costs and emphasize that explanations are socially useful only when total social benefits exceed costs.
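The three-step procedure in the abstract can be sketched as a small decision routine. This is an illustrative sketch only: the class names, the cost categories, and the tie-breaking rule below are assumptions for demonstration, not the authors' implementation or their enumeration of the seven cost kinds.

```python
# Hypothetical sketch of the three-step framework described in the abstract.
# All names and the decision rule are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Context:
    """Step 1: contextual factors characterizing the need for explanation."""
    audience: str            # e.g. "regulator", "end user", "developer"
    harm_level: int          # illustrative scale: 0 (none) .. 3 (severe)
    legally_required: bool   # does a legal/regulatory duty to explain apply?

@dataclass
class Candidate:
    """Step 2: a candidate technical tool and its explanation outputs."""
    tool: str                          # e.g. "saliency map", "input perturbation"
    social_benefit: float              # estimated total social benefit
    costs: dict = field(default_factory=dict)  # named costs (the paper lists seven kinds)

    @property
    def total_cost(self) -> float:
        return sum(self.costs.values())

def choose_explanation(ctx: Context, candidates: list[Candidate]) -> Optional[Candidate]:
    """Step 3: pick the candidate with the highest net social benefit.
    Return None when no explanation is worth its cost, unless an
    explanation is legally required, in which case pick the cheapest."""
    viable = [c for c in candidates if c.social_benefit > c.total_cost]
    if viable:
        return max(viable, key=lambda c: c.social_benefit - c.total_cost)
    if ctx.legally_required and candidates:
        return min(candidates, key=lambda c: c.total_cost)
    return None
```

The sketch encodes the abstract's central claim: an explanation is produced only when its total social benefit exceeds its total cost, with the contextual factors of Step 1 (here, a legal duty to explain) able to override the pure cost-benefit comparison.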

Dates and versions

hal-02507316 , version 1 (13-03-2020)

Identifiers

  • HAL Id : hal-02507316 , version 1

Cite

Valérie Beaudouin, Isabelle Bloch, David Bounie, Stéphan Clémençon, Florence d'Alché-Buc, et al. Identifying the "Right" Level of Explanation in a Given Situation. 1st International Workshop on New Foundations for Human-Centered AI (NeHuAI), ECAI 2020, Sep 2020, Santiago de Compostela, Spain. pp. 63-66. ⟨hal-02507316⟩