Gesture-based interactions are commonly used in mobile and ubiquitous environments. Multimodal interaction techniques use lip gestures to enhance speech recognition or to control mouse movement on the screen. In this paper we extend previous work to explore LUI: lip gestures as an alternative input technique for controlling user interface elements in a ubiquitous environment. In addition to using the lips to control cursor movement, we use lip gestures to control music players and activate menus. A LUI Motion-Action library is also provided to guide future interaction design using lip gestures.
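The motion-to-action mapping described above can be sketched as a small dispatch structure. This is a minimal illustrative sketch, not the paper's actual Motion-Action library: the class name, the motion labels (`open_wide`, `pucker_left`, `smile`), and the bound actions are all assumptions chosen to mirror the interactions the abstract mentions (menu activation, cursor control, music-player control).

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

# Hypothetical sketch of a LUI-style Motion-Action library: lip-motion
# labels (as a lip tracker might emit them) are mapped to UI actions.
# All names here are illustrative assumptions, not the paper's API.

@dataclass
class MotionActionLibrary:
    bindings: Dict[str, Callable[[], str]] = field(default_factory=dict)

    def bind(self, motion: str, action: Callable[[], str]) -> None:
        """Associate a recognized lip motion with a UI action."""
        self.bindings[motion] = action

    def dispatch(self, motion: str) -> str:
        """Invoke the action bound to a detected motion, if any."""
        handler = self.bindings.get(motion)
        return handler() if handler else "ignored"

# Example bindings mirroring the interactions named in the abstract.
lui = MotionActionLibrary()
lui.bind("open_wide", lambda: "menu: activate")
lui.bind("pucker_left", lambda: "cursor: move left")
lui.bind("smile", lambda: "player: play/pause")

print(lui.dispatch("smile"))  # -> player: play/pause
```

Keeping the bindings in a plain dictionary makes the motion vocabulary easy to extend as new lip gestures are added to the recognizer.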