Situated information spaces and spatially aware palmtop computers. Communications of the ACM, special issue on computer augmented environments: back to the real world.
Sensing techniques for mobile interaction. UIST '00: Proceedings of the 13th Annual ACM Symposium on User Interface Software and Technology.
Multiresolution gray-scale and rotation invariant texture classification with local binary patterns. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Peephole displays: pen interaction on spatially aware handheld computers. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
A decision-theoretic generalization of on-line learning and an application to boosting. EuroCOLT '95: Proceedings of the Second European Conference on Computational Learning Theory.
Motion regularization for model-based head tracking. ICPR '96: Proceedings of the International Conference on Pattern Recognition, Volume III.
A discriminative feature space for detecting and recognizing faces. CVPR '04: Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
Mobile camera-based user interaction. ICCV '05: Proceedings of the 2005 International Conference on Computer Vision in Human-Computer Interaction.
Using the user's point of view for interaction on mobile devices. Proceedings of the 23rd French-Speaking Conference on Human-Computer Interaction.
iRotate: automatic screen rotation based on face orientation. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Looking at you: fused gyro and face tracking for viewing large imagery on mobile devices. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
This paper introduces a new face tracking approach for controlling user interfaces on hand-held mobile devices. The proposed method detects the user's face and eyes using local texture features and boosting. An extended Kalman filter then combines local motion features extracted from the face region with the detected eye positions to estimate the 3-D position and orientation of the camera with respect to the face. The estimated camera position serves as input to the spatially aware user interface. Experimental results on real image sequences captured with a camera-equipped mobile phone demonstrate the feasibility of the method.
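The abstract does not give the filter equations, but the fusion step it describes can be illustrated. Below is a minimal sketch of an extended Kalman filter measurement update of this general kind, written in Python with NumPy. It assumes a pinhole camera with a known focal length and tracks only the 3-D position of a point between the eyes; the paper's actual filter additionally estimates orientation and incorporates local motion features. All names and numerical values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical setup (not from the paper): pinhole camera, assumed focal
# length in pixels; state x = 3-D position of a face point in camera frame.
f = 800.0

def h(x):
    """Project the 3-D face point to image coordinates (pinhole model)."""
    X, Y, Z = x
    return np.array([f * X / Z, f * Y / Z])

def H_jac(x):
    """Jacobian of the projection with respect to the state."""
    X, Y, Z = x
    return np.array([[f / Z, 0.0,   -f * X / Z**2],
                     [0.0,   f / Z, -f * Y / Z**2]])

def ekf_predict(x, P, Q):
    """Prediction step with a random-walk motion model (illustrative)."""
    return x, P + Q

def ekf_update(x, P, z, R):
    """One EKF measurement update fusing a detected eye-midpoint position."""
    y = z - h(x)                      # innovation
    H = H_jac(x)
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Example: fuse one eye-midpoint detection (all values are made up).
x = np.array([0.0, 0.0, 400.0])       # initial guess: face 400 mm away
P = np.diag([100.0, 100.0, 2500.0])   # initial state uncertainty
Q = np.diag([4.0, 4.0, 25.0])         # per-frame process noise
R = np.diag([2.0, 2.0])               # pixel noise of the eye detector

z = np.array([12.0, -8.0])            # detected eye midpoint (centered pixels)
x, P = ekf_predict(x, P, Q)
x, P = ekf_update(x, P, z, R)
print(x)                              # refined 3-D face position estimate
```

In a full pipeline of the kind the abstract outlines, each video frame would contribute one predict/update cycle, and the resulting pose estimate would drive the spatially aware interface.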