Physical activity has emerged as a novel input modality for so-called active video games. Input devices such as music instruments, dance mats or the Wii accessories allow for novel ways of interaction and a more immersive gaming experience. In this work we describe how eye movements recognised from electrooculographic (EOG) signals can be used for gaming purposes in three different scenarios. In contrast to common video-based systems, EOG can be implemented as a wearable, lightweight system that allows for long-term use with unconstrained simultaneous physical activity. In a stationary computer game we show that eye gestures of varying complexity can be recognised online with performance equal to that of a state-of-the-art video-based system. For pervasive gaming scenarios, we show how eye movements can be recognised in the presence of signal artefacts caused by physical activity such as walking. Finally, we describe possible future context-aware games that exploit unconscious eye movements and show what possibilities this new input modality may open up.
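To illustrate the kind of processing the abstract describes, the following is a minimal, hypothetical sketch of turning EOG signal changes into directional eye-gesture symbols. The channel layout, threshold value, and symbol alphabet are all assumptions for illustration; they are not taken from the paper's actual recognition system.

```python
def detect_saccades(h_deltas, v_deltas, threshold=50.0):
    """Map per-sample EOG amplitude changes (horizontal, vertical,
    e.g. in microvolts) to a string of saccade direction symbols:
    L (left), R (right), U (up), D (down).

    Illustrative only: real EOG processing also needs baseline-drift
    removal and artefact filtering (e.g. during walking)."""
    symbols = []
    for dh, dv in zip(h_deltas, v_deltas):
        # Small changes are treated as fixation or drift and ignored.
        if abs(dh) < threshold and abs(dv) < threshold:
            continue
        # Assign the dominant axis as the saccade direction.
        if abs(dh) >= abs(dv):
            symbols.append('R' if dh > 0 else 'L')
        else:
            symbols.append('U' if dv > 0 else 'D')
    return ''.join(symbols)

# A square-shaped eye gesture: right, down, left, up.
h = [80.0, 0.0, -80.0, 0.0]
v = [0.0, -80.0, 0.0, 80.0]
print(detect_saccades(h, v))  # RDLU
```

A gesture recogniser could then match the resulting symbol string against a library of gesture templates, which is one way stationary-game input of the kind described above could be realised.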