This paper presents a humanoid computer interface (Jeremiah) that extracts moving objects from a video stream and responds by directing the gaze of an animated head toward them. It responds further through changes of expression that reflect the system's emotional state in reaction to stimuli; in this respect its behavior resembles that of a child. The system was originally designed as a robust visual tracker capable of performing accurately and consistently within a real-world visual surveillance arena, and it therefore operates reliably in any environment, indoor or outdoor. Conceived as a public interface to promote computer vision and the public understanding of science (it was exhibited in the British Science Museum), Jeremiah provides a first step toward a new form of intuitive computer interface.
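The core loop the abstract describes — detect motion in a video stream, then turn the head's gaze toward it — can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses simple frame differencing to find a moving region's centroid, and an assumed linear pinhole-style mapping to convert that centroid into pan/tilt angles. The frame size, change threshold, and field-of-view value are all hypothetical parameters.

```python
def moving_centroid(prev, curr, threshold=30):
    """Return the centroid (x, y) of pixels that changed between two
    grayscale frames (given as lists of rows), or None if nothing moved.
    The threshold is an assumed tuning parameter."""
    xs, ys = [], []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def gaze_angles(centroid, width, height, fov_deg=60.0):
    """Map an image-plane centroid to pan/tilt angles in degrees,
    assuming a simple linear mapping across an assumed field of view."""
    cx, cy = centroid
    pan = (cx / (width - 1) - 0.5) * fov_deg    # positive = look right
    tilt = (0.5 - cy / (height - 1)) * fov_deg  # positive = look up
    return pan, tilt

# Tiny example: a 4x4 frame in which one pixel brightens between frames.
prev = [[0] * 4 for _ in range(4)]
curr = [[0] * 4 for _ in range(4)]
curr[1][2] = 255
c = moving_centroid(prev, curr)   # centroid of the changed region
pan, tilt = gaze_angles(c, 4, 4)  # angles steering the animated head
```

A real surveillance-grade tracker would of course replace the raw frame difference with an adaptive background model to stay robust outdoors, but the detect-then-orient structure is the same.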