Head motion occurs naturally and in synchrony with speech during human dialogue, and may carry paralinguistic information such as intentions, attitudes, and emotions. Natural-looking head motion by a robot is therefore important for smooth human-robot interaction. Based on rules inferred from analyses of the relationship between head motion and dialogue acts, this paper proposes a model for generating head tilting and nodding, and evaluates the model on three types of humanoid robot: a very human-like android, "Geminoid F"; a typical humanoid robot with fewer facial degrees of freedom, "Robovie R2"; and a robot with a 3-axis rotatable neck and movable lips, "Telenoid R2". Analysis of subjective scores shows that the proposed model, which combines head tilting and nodding, generates head motion that is perceived as more natural than nodding alone or than directly mapping people's original motions without gaze information. We also find that an upward motion of the face can give robots that lack a mouth the appearance of speaking. Finally, we conduct an experiment in which participants act as visitors to an information desk attended by robots. The results verify that our generation model performs comparably, in terms of perceived naturalness, to directly mapping people's original motions with gaze information.
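The rule-based generation the abstract describes (mapping dialogue acts to head tilting or nodding) can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the dialogue-act labels, rule table, and motion parameters (`amplitude_deg`, `duration_s`) are all assumptions chosen for the example.

```python
# Hypothetical sketch: rule-based selection of head motion from a dialogue act.
# The act labels and motion parameters below are illustrative assumptions,
# not values taken from the paper.
from dataclasses import dataclass


@dataclass
class HeadMotion:
    kind: str            # "nod", "tilt", or "none"
    amplitude_deg: float  # peak rotation of the motion
    duration_s: float     # time to complete the motion

# Assumed rule table: which dialogue acts trigger which head motion.
DIALOGUE_ACT_RULES = {
    "affirmation": "nod",
    "backchannel": "nod",
    "question": "tilt",
    "thinking": "tilt",
    "statement": "none",
}


def generate_head_motion(dialogue_act: str) -> HeadMotion:
    """Return a head-motion command for the given dialogue act."""
    kind = DIALOGUE_ACT_RULES.get(dialogue_act, "none")
    if kind == "nod":
        return HeadMotion("nod", amplitude_deg=15.0, duration_s=0.4)
    if kind == "tilt":
        return HeadMotion("tilt", amplitude_deg=10.0, duration_s=0.8)
    return HeadMotion("none", amplitude_deg=0.0, duration_s=0.0)
```

A controller would translate the returned `HeadMotion` into joint trajectories for the robot's neck; unknown dialogue acts simply produce no motion.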