Due to the vast amount of available georeferenced information, novel techniques for interacting with such content more intuitively and efficiently are increasingly required. In this paper, we introduce KIBITZER, a lightweight wearable system that enables browsing of urban surroundings for annotated digital information. KIBITZER exploits its user's eye gaze as a natural indicator of attention to identify objects of interest and offers speech and non-speech auditory feedback. It thus provides the user with a "sixth sense" for digital georeferenced information. We describe the system's architecture and interaction technique and outline experiences from first functional trials.
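The core selection idea described in the abstract — treating the user's gaze heading as a pointer and matching it against the bearings of annotated, georeferenced objects — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the flat-plane bearing approximation, the 5° threshold, and all names (`object_of_interest`, `poi`, etc.) are assumptions introduced here.

```python
import math

def bearing(user, poi):
    # Compass-style bearing (degrees) from the user to a POI, treating
    # lat/lon as a flat local plane (a simplification valid only for
    # short urban distances; a real system would use geodetic formulas).
    dlat = poi["lat"] - user["lat"]
    dlon = poi["lon"] - user["lon"]
    return math.degrees(math.atan2(dlon, dlat)) % 360.0

def angular_diff(a, b):
    # Smallest absolute difference between two headings in degrees.
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def object_of_interest(user, gaze_heading, pois, threshold_deg=5.0):
    # Return the annotated POI whose bearing lies closest to the gaze
    # heading, or None if nothing falls within the angular threshold.
    best, best_diff = None, threshold_deg
    for poi in pois:
        d = angular_diff(gaze_heading, bearing(user, poi))
        if d <= best_diff:
            best, best_diff = poi, d
    return best
```

In a wearable setting, `gaze_heading` would be derived from head orientation plus eye-tracker output, and the selected object would trigger the speech or non-speech auditory feedback mentioned above.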