In this paper, we explore and validate the merits of using absolute and relative viewing distance from the screen as complementary input modalities for interactive systems. We motivate the use of viewing distance as a complementary modality by first mapping out its design space and then proposing several new applications that could benefit from it. We demonstrate that both absolute and relative viewing distance can be reliably estimated under controlled circumstances, for both desktop and mobile devices, using low-cost cameras and readily available computer vision algorithms. In our evaluations, we find that viewing distance is a promising complementary input modality that can be reliably estimated using computer vision in environments with constant lighting. For environments with heterogeneous lighting conditions, several challenges remain when designing practical systems. To aid practitioners and researchers, we conclude by highlighting several design implications for future systems.
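To illustrate the kind of estimation the abstract describes, the sketch below shows one common way viewing distance can be derived from a face detector's bounding box under a pinhole camera model. The focal length and average face width here are illustrative assumptions, not values from the paper, and a real system would calibrate them per camera and per user.

```python
# Sketch: estimating absolute and relative viewing distance from the pixel
# width of a detected face, using the pinhole model:
#   distance = focal_length_px * real_face_width / face_width_px
# Both constants below are hypothetical placeholders, not from the paper.

FOCAL_LENGTH_PX = 600.0   # assumed camera focal length, in pixels
FACE_WIDTH_CM = 15.0      # assumed average adult face width, in cm

def absolute_distance_cm(face_width_px: float) -> float:
    """Absolute viewing distance implied by a face bounding-box width."""
    if face_width_px <= 0:
        raise ValueError("face width must be positive")
    return FOCAL_LENGTH_PX * FACE_WIDTH_CM / face_width_px

def relative_distance(face_width_px: float, baseline_width_px: float) -> float:
    """Relative distance as a ratio to a calibrated baseline pose:
    values > 1.0 mean farther than the baseline, < 1.0 mean closer."""
    if face_width_px <= 0 or baseline_width_px <= 0:
        raise ValueError("widths must be positive")
    return baseline_width_px / face_width_px

# With the assumed constants, a face detected at 150 px wide sits at 60 cm,
# and is twice as far away as a 300 px baseline detection.
print(absolute_distance_cm(150.0))      # -> 60.0
print(relative_distance(150.0, 300.0))  # -> 2.0
```

In practice, the face width in pixels would come from a standard detector (e.g. a Haar-cascade or similar off-the-shelf algorithm), and the relative measure is useful precisely because it cancels out the unknown camera and face constants after a one-time baseline calibration.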