Efficient and Trustworthy Social Navigation Via Explicit and Implicit Robot-Human Communication

Yuhang Che, Allison M. Okamura, Dorsa Sadigh

In this paper, we present a planning framework that uses a combination of implicit (robot motion) and explicit (visual/audio/haptic feedback) communication during mobile robot navigation. First, we developed a model that approximates both continuous movements and discrete behavior modes in human navigation, accounting for the effects of implicit and explicit communication on human decision making. The model approximates the human as an optimal agent, with a reward function obtained through inverse reinforcement learning. Second, a planner uses this model to generate communicative actions that maximize the robot's transparency and efficiency. We implemented the planner on a mobile robot, using a wearable haptic device for explicit communication. In a user study of an indoor orthogonal-crossing scenario between a human and a robot, the robot actively communicated its intent to users in order to avoid collisions and facilitate efficient trajectories. Results showed that the planner generated plans that were easier to understand, reduced users' effort, and increased users' trust in the robot, compared to simply performing collision avoidance. The key contribution of this work is the integration and analysis of explicit communication (together with implicit communication) for social navigation.
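The core idea of the abstract — modeling the human as a (noisily) optimal agent under a learned reward, then choosing the communicative action that yields the best expected outcome — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature weights would come from inverse reinforcement learning, whereas here they are hand-picked, and the feature vectors, the softmax (noisily rational) human model, and the `plan_communication` helper are all hypothetical simplifications.

```python
import numpy as np

# Hypothetical reward weights over features (goal progress, collision
# risk, effort). In the paper these would be learned via IRL; the
# values here are illustrative only.
w = np.array([1.0, -2.0, -0.5])

def human_reward(features, w):
    """Linear reward: the human is modeled as optimizing w . phi(s, a)."""
    return w @ features

def predict_human_action(candidate_features, w):
    """Noisily rational human: softmax choice over candidate actions."""
    scores = np.array([human_reward(f, w) for f in candidate_features])
    p = np.exp(scores - scores.max())  # numerically stable softmax
    return p / p.sum()

def plan_communication(comm_options, w):
    """Pick the explicit-communication action whose induced human
    response distribution gives the best expected reward."""
    best, best_val = None, -np.inf
    for comm, candidate_features in comm_options.items():
        probs = predict_human_action(candidate_features, w)
        rewards = np.array([human_reward(f, w) for f in candidate_features])
        val = probs @ rewards
        if val > best_val:
            best, best_val = comm, val
    return best

# Toy orthogonal-crossing scenario: a haptic cue lowers the collision
# risk the human perceives for the "keep going" option.
comm_options = {
    "none":   [np.array([1.0, 0.8, 0.1]), np.array([0.5, 0.1, 0.3])],
    "haptic": [np.array([1.0, 0.1, 0.1]), np.array([0.5, 0.1, 0.3])],
}
print(plan_communication(comm_options, w))  # -> haptic
```

Under these toy numbers, explicit communication ("haptic") wins because it shifts the human's likely choice toward the higher-reward action, which is the qualitative effect the planner in the paper exploits.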
