Identifying the 'Right' Level of Explanation in a Given Situation
Author(s) -
Winston Maxwell,
Valérie Beaudouin,
Isabelle Bloch,
David Bounie,
Stéphan Clémençon,
Florence d'Alché-Buc,
James Eagan,
Pavlo Mozharovskyi,
Jayneel Parekh
Publication year - 2020
Publication title -
SSRN Electronic Journal
Language(s) - English
Resource type - Journals
ISSN - 1556-5068
DOI - 10.2139/ssrn.3604924
Subject(s) - computer science , risk analysis (engineering) , management science , business , economics , political science , law
We present a framework for defining the "right" level of explainability based on technical, legal and economic considerations. Our approach involves three logical steps. First, define the main contextual factors, such as the audience of the explanation, the operational context, the level of harm the system could cause, and the legal/regulatory framework. This step characterizes the operational and legal needs for explanation, and the corresponding social benefits. Second, examine the technical tools available, including post-hoc approaches (input perturbation, saliency maps...) and hybrid AI approaches. Third, as a function of the first two steps, choose the right levels of global and local explanation outputs, taking into account the costs involved. We identify seven kinds of costs and emphasize that explanations are socially useful only when total social benefits exceed costs.
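The abstract's second step mentions post-hoc approaches such as input perturbation. As a minimal illustrative sketch (not the authors' method), a perturbation-based local explanation can be obtained by nudging each input feature of a black-box model and ranking features by how much the output changes; the scoring function `f` below is a hypothetical toy model.

```python
def f(x):
    # Hypothetical "black-box" model: any scoring function f(x) would do here.
    return 3.0 * x[0] - 1.0 * x[1] + 0.5 * x[2]

def perturbation_saliency(f, x, eps=1e-3):
    """Score each input feature by how much a small perturbation of that
    feature changes the model output (a finite-difference sensitivity)."""
    base = f(x)
    saliency = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += eps          # perturb one feature at a time
        saliency.append(abs(f(xp) - base) / eps)
    return saliency

x = [1.0, 2.0, 3.0]
print(perturbation_saliency(f, x))  # roughly [3.0, 1.0, 0.5]: feature 0 matters most
```

The resulting scores form a local explanation for the single input `x`; aggregating such scores over many inputs is one simple route to the global explanations the framework also considers.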
Accelerating Research
Address
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom