This paper introduces a novel camera mouse driven by visual face tracking based on a 3D model. As cameras become standard equipment on personal computers (PCs) and processing power increases, visual face tracking has become a feasible approach to hands-free human-machine interaction. Human facial movements can be decomposed into rigid motions, such as rotation and translation, and non-rigid motions, such as opening, closing, and stretching of the mouth. First, we describe our face tracking system, which robustly and accurately retrieves these motion parameters from video in real time [H. Tao, T. Huang, Explanation-based facial motion tracking using a piecewise Bezier volume deformation model, in: Proceedings of IEEE Computer Vision and Pattern Recognition, vol. 1, 1999, pp. 611-617]. The retrieved rigid motion parameters are used to navigate the mouse cursor, while the detection of non-rigid mouth motions triggers mouse events in the operating system. Three mouse control modes are investigated and their usability is compared. Experiments in the Windows XP environment demonstrate the convenience of our camera mouse for hands-free control. This technology can serve as an alternative input option for people with hand and speech disabilities, as well as for future vision-based games and interfaces.
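To make the described pipeline concrete, the following is a minimal Python sketch of how per-frame tracker output could drive a cursor: rigid pose parameters (here yaw and pitch) move the pointer, and a non-rigid mouth-openness parameter triggers click events. The two control-mode functions (an absolute "direct" mapping and a rate-based "joystick" mapping), along with all constants such as screen size, head range, gain, and thresholds, are illustrative assumptions; the paper's three actual control modes and its tracker interface are not specified in the abstract.

```python
# Illustrative sketch: driving a cursor from face-tracking output.
# Assumed inputs per frame: rigid head pose (yaw, pitch, in degrees) and a
# normalized non-rigid mouth-openness value from some face tracker.
# All constants below (screen size, ranges, gain, thresholds) are hypothetical.

SCREEN_W, SCREEN_H = 1920, 1080        # assumed screen resolution
YAW_RANGE, PITCH_RANGE = 30.0, 20.0    # assumed comfortable head range (degrees)

def direct_mode(yaw: float, pitch: float) -> tuple[int, int]:
    """Absolute mapping: head orientation selects a screen position directly."""
    x = (yaw / YAW_RANGE + 1.0) * 0.5 * SCREEN_W
    y = (pitch / PITCH_RANGE + 1.0) * 0.5 * SCREEN_H
    return (int(min(max(x, 0), SCREEN_W - 1)),
            int(min(max(y, 0), SCREEN_H - 1)))

def joystick_mode(x: float, y: float, yaw: float, pitch: float,
                  gain: float = 8.0, dead_zone: float = 3.0) -> tuple[float, float]:
    """Rate control: deflection beyond a dead zone sets per-frame cursor velocity."""
    dx = gain * yaw if abs(yaw) > dead_zone else 0.0
    dy = gain * pitch if abs(pitch) > dead_zone else 0.0
    return (min(max(x + dx, 0), SCREEN_W - 1),
            min(max(y + dy, 0), SCREEN_H - 1))

def mouth_click(openness: float, threshold: float = 0.4) -> bool:
    """Map the non-rigid mouth parameter to a click when it crosses a threshold."""
    return openness > threshold

if __name__ == "__main__":
    # Simulated tracker output: (yaw, pitch, mouth openness) for three frames.
    frames = [(0.0, 0.0, 0.1), (12.0, -5.0, 0.1), (12.0, -5.0, 0.6)]
    cx, cy = SCREEN_W / 2, SCREEN_H / 2
    for yaw, pitch, mouth in frames:
        cx, cy = joystick_mode(cx, cy, yaw, pitch)
        print(f"cursor=({cx:.0f},{cy:.0f}) click={mouth_click(mouth)}")
```

The dead zone in the joystick mapping is one plausible way to keep small involuntary head movements from jittering the cursor, a concern any such hands-free interface must address; the thresholding of mouth openness likewise stands in for whatever event-detection logic the actual system uses.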