Active Gaze Tracking for Human-Robot Interaction
ICMI '02 Proceedings of the 4th IEEE International Conference on Multimodal Interfaces
We have incorporated interactive skills into an active gaze tracking system that can identify the object a person is looking at in a cluttered scene. By following the user's 3-D gaze direction and applying a zero-disparity filter, the system determines the object's position. The active vision system also directs its attention to a user by tracking features that exhibit both motion and skin color: a particle filter fuses skin-color cues with motion cues derived from optical flow to locate a hand or a face in the image. The system then uses stereo camera geometry, Kalman filtering, and position and velocity controllers to track the feature in real time. These skills are integrated so that they cooperate to keep the user's face and gaze tracked at all times. Results and video demonstrations provide insights into how active gaze tracking can be used and improved to build human-friendly user interfaces.
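The cue-fusion step described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes the skin-color and motion cues have already been converted into per-pixel likelihood maps (`skin_map`, `motion_map`, both H×W arrays in [0, 1]), uses a random-walk motion model in place of the paper's dynamics, and multiplies the two cues to weight particles, as in a standard bootstrap particle filter:

```python
import numpy as np

def particle_filter_step(particles, skin_map, motion_map, noise_std=3.0, rng=None):
    """One predict/update/resample step of a particle filter fusing a
    skin-colour likelihood map with an optical-flow motion map.
    particles: (N, 2) array of (row, col) pixel coordinates."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = skin_map.shape
    n = len(particles)

    # Predict: random-walk motion model (a simplifying assumption).
    particles = particles + rng.normal(0.0, noise_std, particles.shape)
    particles[:, 0] = np.clip(particles[:, 0], 0, h - 1)
    particles[:, 1] = np.clip(particles[:, 1], 0, w - 1)

    # Update: weight each particle by the product of the two cues,
    # so only regions that are both skin-coloured and moving score highly.
    rows = particles[:, 0].astype(int)
    cols = particles[:, 1].astype(int)
    weights = skin_map[rows, cols] * motion_map[rows, cols] + 1e-12
    weights /= weights.sum()

    # Systematic resampling concentrates particles on likely regions.
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(weights), positions), n - 1)
    particles = particles[idx]

    # State estimate: mean particle position (weights uniform after resampling).
    return particles, particles.mean(axis=0)
```

In a real system the estimate would feed the Kalman filter and the pan/tilt position and velocity controllers; here the filter simply converges on the image region where both cues agree.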