Toward Human-centered XAI in Practice: A Survey
Abstract
Human adoption of artificial intelligence (AI) techniques is largely hampered by the increasing complexity and opacity of AI models. Explainable AI (XAI) methods and tools have been developed to bridge this gap between high-performance black-box AI models and human understanding. However, current XAI adoption still lacks "human-centered" guidance for designing solutions that meet the needs of different stakeholders in practice. By reviewing existing research, we first summarize a human-centered demand framework that categorizes stakeholders into five key roles with specific demands, and then extract six commonly used human-centered XAI evaluation measures that help validate the effect of XAI. In addition, we develop a taxonomy of XAI methods for visual computing and analyze their properties. With these human demands and XAI methods in mind, we take medical image diagnosis as an example scenario to present an overview of how extant XAI approaches for visual computing fulfil stakeholders' human-centered demands in practice, and we examine the availability of open-source XAI tools for stakeholders' use. This survey provides guidance for matching diverse human demands with appropriate XAI methods or tools in specific applications, together with a summary of the main challenges and future work toward human-centered XAI in practice.