When Do People Want an Explanation from a Robot?

Figure: Some of the interaction scenarios used in the study.

Abstract

Explanations are a critical topic in AI and robotics, and their importance for building trust and enabling successful human–robot interactions has been widely recognized. However, it remains an open question when, and in which interaction contexts, users most want an explanation from a robot. In our pre-registered study with 186 participants, we set out to identify a set of scenarios in which users show a strong need for explanations. Participants watched 16 videos portraying seven distinct situation types, ranging from successful human–robot interactions to robot errors and robot inabilities. Afterwards, they indicated whether and how they wished the robot to communicate following the interaction shown in the video. The results yield a set of interactions, grounded in the literature and verified empirically, in which people express a need for an explanation. Moreover, we rank these scenarios by how strongly users consider an explanation necessary and find statistically significant differences between them. Comparing explanations with other possible response types, such as the robot apologizing or asking for help, we find that why-explanations are always among the two highest-rated responses, except when the robot simply acts normally and successfully. This stands in stark contrast to the other response types, which are useful only in a much more restricted set of situations. Lastly, we test whether individual factors, such as a participant's general attitude towards robots, influence response preferences, but find no significant correlations. Our results can guide roboticists in designing more user-centered and transparent interactions and help explainability researchers develop more targeted explanations.

Publication
Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction