Human communication involves a number of nonverbal cues that are seemingly unintentional, unconscious, and automatic, both in their production and their perception, and that convey rich information about an individual's emotional state and intentions. One family of such cues is known as "nonverbal leakage." In this paper, we explore whether people can read nonverbal leakage cues, particularly gaze cues, in humanlike robots and draw inferences about the robots' intentions, and whether the physical design of the robot affects these inferences. We designed a gaze cue for Geminoid, a highly humanlike android, and for Robovie, a robot with stylized, abstract humanlike features, that allowed the robots to "leak" information about what they might have in mind. In a controlled laboratory experiment, participants played a guessing game with one of the two robots, and we evaluated how the gaze cue affected their task performance. We found that the gaze cue did, in fact, lead to better performance, from which we infer that the cue prompted attributions of mental states and intentionality. Our results have implications for robot design, particularly for designing expressions of intentionality, and for our understanding of how people respond to human social cues when robots enact them.