Understanding how people interpret robot gestures will aid the design of effective social robots. We examine the generation and interpretation of gestures in a simple social robot capable of head and arm movement in two studies. In the first study, four participants created gestures, each with an intended message and emotion, for 12 scenarios provided to them. In the second study, the resulting gestures were shown to 12 participants, who judged which emotion and message each gesture conveyed. Knowledge of the motivating scenario (context) for each gesture, present or absent, was manipulated as an experimental factor. Context was found to aid message understanding while providing only modest assistance to emotion recognition. Although better than chance, accuracy was relatively low for both emotion recognition (22%) and message understanding (40%). We discuss the results in terms of implied guidelines for designing gestures for social robots.
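The abstract compares recognition accuracy to chance. As a minimal sketch of how such a comparison might be run, the Python example below applies a one-sided binomial test to the reported accuracies; the number of judgments and the chance rates are hypothetical assumptions for illustration, not figures from the paper.

```python
# Hedged sketch: testing whether observed recognition accuracy exceeds chance.
# The judgment count and chance rates below are assumptions, not values
# reported in the paper.
from scipy.stats import binomtest

N_JUDGMENTS = 144        # hypothetical: 12 judges x 12 gestures
CHANCE_EMOTION = 1 / 8   # hypothetical: 8 candidate emotion labels
CHANCE_MESSAGE = 1 / 12  # hypothetical: 12 candidate messages

def exceeds_chance(accuracy: float, chance: float, n: int) -> None:
    """One-sided binomial test of observed accuracy against a chance rate."""
    hits = round(accuracy * n)  # convert accuracy to a count of correct trials
    result = binomtest(hits, n, chance, alternative="greater")
    print(f"{hits}/{n} correct vs chance {chance:.3f}: p = {result.pvalue:.4f}")

exceeds_chance(0.22, CHANCE_EMOTION, N_JUDGMENTS)  # emotion recognition
exceeds_chance(0.40, CHANCE_MESSAGE, N_JUDGMENTS)  # message understanding
```

Under these assumed parameters both accuracies come out reliably above chance, which is consistent with the abstract's claim; the actual test and chance levels used in the study may differ.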