This paper reports on a study of human participants interacting with a robot designed to take part in collaborative conversation. The purpose of the study was to investigate a particular kind of gestural feedback from the human to the robot in these conversations: head nods, which the robot recognized during the interaction. The conversations between human and robot concerned demonstrations of inventions created in a lab. We briefly describe the robot's hardware and architecture, and then focus the paper on a study of the effects of head-nod understanding under three different conditions. We conclude that conversation itself triggers head nods from people in human-robot conversation, and that telling participants that the robot recognizes their nods, as well as having the robot provide gestural feedback of its nod recognition, is effective in eliciting more nods.
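To make the kind of recognition involved concrete, the sketch below shows one simple way a nod could be detected from a head-pose tracker's pitch estimates: a nod appears as an oscillation in pitch with repeated direction reversals of sufficient amplitude. This is an illustrative toy detector under assumed thresholds (`min_swing`, `min_reversals`), not the recognizer actually used in the study.

```python
def detect_nod(pitch_series, min_swing=5.0, min_reversals=2):
    """Return True if a sequence of head-pitch angles (degrees)
    contains a nod-like oscillation: at least `min_reversals`
    direction changes, each swing exceeding `min_swing` degrees.
    Thresholds are illustrative assumptions, not values from the paper."""
    reversals = 0
    direction = 0               # +1 rising, -1 falling, 0 unknown
    extreme = pitch_series[0]   # last turning point (or running extreme)
    for p in pitch_series[1:]:
        delta = p - extreme
        if direction >= 0 and delta <= -min_swing:
            # large downward swing: count a reversal if we were rising
            if direction == 1:
                reversals += 1
            direction, extreme = -1, p
        elif direction <= 0 and delta >= min_swing:
            # large upward swing: count a reversal if we were falling
            if direction == -1:
                reversals += 1
            direction, extreme = 1, p
        else:
            # no reversal yet: keep tracking the current extreme
            if direction == 1:
                extreme = max(extreme, p)
            elif direction == -1:
                extreme = min(extreme, p)
    return reversals >= min_reversals
```

A down-up-down-up pitch trace such as `[0, -6, 1, -6, 0]` registers as a nod, while small wobble or a single sustained downward glance does not; in a real system the input would be a per-frame pitch stream from a vision-based head tracker.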