Design and evaluation of face tracking user interfaces for accessibility
Proceedings of the 2nd annual conference on Research in information technology
Using face and head movements to control a computer can be especially helpful for users who, for various reasons, cannot effectively operate common input devices with their hands. Building such a user interface on vision-based consumer devices makes it readily available and non-intrusive. However, a characteristic problem with such interfaces is accuracy of control: consumer devices capture already small face movements at a resolution that is usually lower than the screen resolution, and the computer vision algorithms that enable the tracking introduce additional noise, adversely affecting usability. This paper describes how the different components of this perceptual user interface contribute to the accuracy problem and presents potential solutions. The interface was implemented in several configurations and statistically evaluated to support the analysis. The configurations include, among other things, the use of 2D and depth images from consumer devices, different input styles, and the use of a Kalman filter.
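The Kalman filter mentioned above is a standard way to suppress tracker jitter before mapping face coordinates to the cursor. The paper does not give its filter parameters, so the following is only an illustrative sketch: a scalar, constant-position Kalman filter applied to one axis of noisy tracked positions, with made-up process and measurement variances.

```python
def kalman_smooth(measurements, process_var=1e-3, meas_var=1e-1):
    """Smooth a 1-D sequence of noisy positions with a scalar Kalman filter.

    Constant-position model: the state is just the position, so the
    predict step only inflates the estimate variance by process_var.
    The variances here are illustrative, not taken from the paper.
    """
    x = float(measurements[0])  # initial state estimate
    p = 1.0                     # initial estimate variance
    smoothed = []
    for z in measurements:
        # Predict: state unchanged, uncertainty grows.
        p += process_var
        # Update: blend prediction with measurement z via the Kalman gain.
        k = p / (p + meas_var)
        x += k * (z - x)
        p *= (1.0 - k)
        smoothed.append(x)
    return smoothed

# Hypothetical horizontal cursor positions (pixels) from a face tracker,
# including one spurious spike at 160.
raw = [100, 104, 98, 103, 99, 101, 160, 102, 100]
smooth = kalman_smooth(raw)
```

With a low process variance relative to the measurement variance, the filter heavily attenuates one-frame outliers like the 160 spike, at the cost of some lag when the user moves deliberately; tuning that trade-off is exactly the usability question the evaluation addresses.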