We are currently working on a museum guide robot with an emphasis on "friendly" human-robot interaction conveyed through nonverbal behaviors. In this paper, we focus on head gestures during explanations of exhibits. The outline of our research is as follows. First, we examined human head gestures through an experimental, sociological approach, and discovered how human guides coordinate their head movements with their talk when explaining exhibits. Second, we developed a robot system based on these findings. Third, we evaluated the resulting human-robot interaction, again using an experimental, sociological approach, and modified the robot based on the results. Our experimental results suggest that robot head turning may heighten museum visitors' engagement with the robot. Based on these preliminary findings, we describe a museum guide robot that normally works autonomously and, when necessary, can switch to a remote-control mode, operated by a human, to engage in more complex interactions with visitors.