In this work we describe the design, implementation and evaluation of a novel eye tracker for context-awareness and mobile HCI applications. In contrast to common systems using video cameras, this compact device relies on Electrooculography (EOG). It consists of goggles with dry electrodes integrated into the frame and a small pocket-worn component with a DSP for real-time EOG signal processing. The device is intended for wearable, standalone use: it can store data locally for long-term recordings or stream processed EOG signals to a remote device over Bluetooth. We describe how eye gestures can be efficiently recognised from EOG signals for HCI purposes. In an experiment with 11 subjects playing a computer game, we show that 8 eye gestures of varying complexity can be continuously recognised with performance equal to that of a state-of-the-art video-based system. Physical activity introduces artefacts into the EOG signal; we describe how these artefacts can be removed using an adaptive filtering scheme and characterise this approach on a 5-subject dataset. In addition to explicit eye movements for HCI, we discuss how the analysis of unconscious eye movements may eventually allow information on user activity and context to be deduced that is not available with current sensing modalities.
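To illustrate the kind of gesture recognition the abstract describes, the following is a minimal sketch, not the paper's actual algorithm: saccades are detected by thresholding the derivative of the horizontal and vertical EOG channels, mapped to direction symbols, and the resulting symbol sequence is matched against a gesture dictionary. All thresholds, the sampling rate, and the gesture codes here are illustrative assumptions.

```python
import numpy as np

def detect_saccades(h, v, fs=250, thresh=50.0):
    """Detect saccades from horizontal (h) and vertical (v) EOG channels.

    h, v: EOG amplitudes (e.g. in microvolts); thresh: per-sample derivative
    threshold (assumed value). Returns a string of direction symbols, e.g. 'RL'.
    """
    dh, dv = np.diff(h), np.diff(v)
    symbols = []
    i = 0
    while i < len(dh):
        if abs(dh[i]) > thresh or abs(dv[i]) > thresh:
            # classify by the dominant axis at the event onset
            if abs(dh[i]) >= abs(dv[i]):
                symbols.append('R' if dh[i] > 0 else 'L')
            else:
                symbols.append('U' if dv[i] > 0 else 'D')
            i += int(0.1 * fs)  # skip a ~100 ms refractory window
        else:
            i += 1
    return ''.join(symbols)

# Hypothetical gesture dictionary: saccade-direction sequences -> gesture names
GESTURES = {'RL': 'right-left', 'RDLU': 'square', 'DU': 'down-up'}

def recognise(h, v):
    """Map a detected saccade sequence to a gesture name, or None."""
    return GESTURES.get(detect_saccades(h, v))
```

A real system would additionally reject blinks and tolerate noisy or partial sequences, but the saccade-to-symbol-to-gesture pipeline is the core idea.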
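The artefact-removal step can be sketched with a standard LMS adaptive filter; the paper's actual filtering scheme and parameters are not reproduced here. The idea is that a reference signal correlated with the motion artefact but not with the eye movements (for example, an accelerometer channel; an assumption in this sketch) lets the filter estimate and subtract the artefact component.

```python
import numpy as np

def lms_denoise(eog, ref, mu=0.01, order=8):
    """Remove the component of `eog` that is linearly predictable from `ref`.

    ref: artefact reference signal (e.g. accelerometer; assumed setup).
    mu, order: illustrative step size and filter length.
    Returns the cleaned signal; the first `order - 1` samples stay zero.
    """
    w = np.zeros(order)                      # adaptive filter weights
    out = np.zeros(len(eog))
    for n in range(order - 1, len(eog)):
        x = ref[n - order + 1:n + 1][::-1]   # newest reference sample first
        y = w @ x                            # current artefact estimate
        e = eog[n] - y                       # cleaned sample = estimation error
        w += 2 * mu * e * x                  # LMS weight update
        out[n] = e
    return out
```

On synthetic data where the artefact is a scaled copy of the reference, the residual after convergence is substantially smaller than the raw artefact, which is the behaviour an adaptive scheme exploits during physical activity.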