Abstract | The answer to the question "what is a good explanation for lay users" becomes more challenging within a broader, multi-disciplinary context, such as the one of our call, spanning philosophy, sociology, economics, and computer science. In this context, XAI entails another level of discussion that needs to be addressed: our relationship, as humans, to AI systems in general. Notably, the term 'explain' derives from the Latin verb 'explanare', which means, literally, 'to make level'. Thus, the discussions and scientific endeavours into XAI could entail the notion of ourselves 'making level' with AI systems, which again brings us to the old question of our relationship to and with AI systems, and, more specifically, to our multifaceted cultural notions of AI and human beings, for instance, in terms of master and servant, or who is and will be dominating whom (e.g., 2001: A Space Odyssey's HAL). Arguably, these intertwined cultural perceptions and (often incorrect, albeit popular) ideas regarding AI and its potential ultimately also influence lay persons' perceived needs for explanation when interacting with AI systems. For example, when interacting with a social robot, ideally, we should not have any perceived need for explanation, or only as much or as little as we would have when interacting with any other social, human companion. The fact that we are seeing a machine, however, brings up expectations and impressions formed by popular culture, and thus a need for explanation that may serve, among other things, a need for safety or trust. Therefore, answering the research question "what is a good explanation" is far from obvious. Seeking answers to this research question has been the main incentive for the launch of this research topic. |
---|