Enhancement of human computer interaction with facial electromyographic sensors
OZCHI '09 Proceedings of the 21st Annual Conference of the Australian Computer-Human Interaction Special Interest Group: Design: Open 24/7
Design of an android robot head for stage performances
Artificial Life and Robotics
ACM Transactions on Applied Perception (TAP)
The illusion of robotic life: principles and practices of animation for robots
HRI '12 Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
This paper explores self-guided learning of realistic facial expressions by a robotic head with 31 degrees of freedom. Facial motor parameters were learned using feedback from real-time facial expression recognition on video. The experiments show that the mapping from servos to expressions was learned in under one hour of training time. We discuss how this work may inform the computational study of how infants learn to make facial expressions.
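The abstract describes a closed loop: propose servo positions, score them with a video-based expression recognizer, and adjust toward higher recognition confidence. The sketch below illustrates that idea with a simple hill-climbing search; the 31-servo count comes from the abstract, but `mock_recognizer`, `learn_expression`, and all parameter values are hypothetical stand-ins, not the paper's actual method or recognizer.

```python
import random

NUM_SERVOS = 31  # the robot head described in the abstract has 31 degrees of freedom

def mock_recognizer(servos, target):
    # Stand-in for the real-time video expression recognizer used as feedback.
    # Confidence rises as the servo vector approaches a hidden "ideal" pose
    # for the target expression (purely illustrative, not the paper's model).
    ideal = [(hash((target, i)) % 100) / 100.0 for i in range(NUM_SERVOS)]
    err = sum((s - t) ** 2 for s, t in zip(servos, ideal)) / NUM_SERVOS
    return max(0.0, 1.0 - err)

def learn_expression(target, iterations=2000, step=0.05, seed=0):
    """Hill-climb servo positions to maximize recognizer confidence."""
    rng = random.Random(seed)
    servos = [rng.random() for _ in range(NUM_SERVOS)]
    best = mock_recognizer(servos, target)
    for _ in range(iterations):
        # Perturb each servo slightly, clamped to its [0, 1] range.
        candidate = [min(1.0, max(0.0, s + rng.uniform(-step, step)))
                     for s in servos]
        score = mock_recognizer(candidate, target)
        if score > best:  # keep the pose only if feedback improved
            servos, best = candidate, score
    return servos, best

if __name__ == "__main__":
    pose, confidence = learn_expression("smile")
    print(f"learned 'smile' pose with recognizer confidence {confidence:.2f}")
```

In the paper's setting the scoring function would be a real expression classifier running on camera frames, so each evaluation carries real-world cost; that motivates the abstract's emphasis on reaching a usable mapping within an hour of training.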