In this paper we provide empirical evidence that using humanlike gaze cues during human-robot handovers can improve the timing and perceived quality of the handover event. Handovers serve as the foundation of many human-robot tasks. Fluent, legible handover interactions require appropriate nonverbal cues to signal handover intent, location, and timing. Inspired by observations of human-human handovers, we implemented gaze behaviors on a PR2 humanoid robot. The robot handed water bottles to a total of 102 naïve subjects while varying its gaze behavior across three conditions: no gaze, gaze designed to elicit shared attention at the handover location, and the shared attention gaze complemented with a turn-taking cue. We compared subjects' perception of and reaction time to the robot-initiated handovers across the three gaze conditions. Results indicate that subjects reach for the offered object significantly earlier when the robot provides a shared attention gaze cue during a handover. We also observed a statistical trend of subjects preferring handovers with turn-taking gaze cues over the other conditions. Our work demonstrates that gaze can play a key role in improving the user experience of human-robot handovers and in making handovers fast and fluent.