The Effects of Non-verbal Turn-Taking Cues in an Open-Domain Human-Robot Conversation
Summary
Older conversational systems like ELIZA and modern digital assistants like Alexa rely on a simple turn-taking mechanism based on gaps of silence between turns. However, these question-answering and command-and-action systems differ substantially from open-domain conversational dialogue, where the user has no expectations about the system's capabilities and restrictions, making it harder to establish common ground for turn-taking. A system informed by human turn-taking is therefore needed. The current study proposes to replicate human non-verbal gestures and gaze patterns in social robots. In a between-subject experiment, the proposed system was evaluated on conversational naturalness and social engagement against a no-movement condition and a random head-and-arm-movement condition. Forty-two adults conversed with Pepper the robot, equipped with a Large Language Model (LLM) to enable open-domain conversation. The results show that humanlike non-verbal turn-taking movements can improve objective naturalness, though not on all of its measures. The findings offer a new perspective on human modelling for social robots: until technology advances to the point where robots can authentically mimic human behaviour, the presence of merely human-like behaviour may reduce the effectiveness of that mimicry.