The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS) - Special issue on computer-human interaction
New technological windows into mind: there is more in eyes and brains for human-computer interaction
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
An evaluation of an eye tracker as a device for computer input
CHI '87 Proceedings of the SIGCHI/GI Conference on Human Factors in Computing Systems and Graphics Interface
Manual and gaze input cascaded (MAGIC) pointing
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Evaluation of eye gaze interaction
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Eye gaze patterns in conversations: there is more to conversational agents than meets the eyes
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Chinese input with keyboard and eye-tracking: an anatomical study
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Phidgets: easy development of physical interfaces through physical widgets
Proceedings of the 14th annual ACM symposium on User interface software and technology
Explaining effects of eye gaze on mediated group conversations: amount or synchronization?
CSCW '02 Proceedings of the 2002 ACM conference on Computer supported cooperative work
Eye-R, a glasses-mounted eye motion detection interface
CHI '01 Extended Abstracts on Human Factors in Computing Systems
Eye Tracking Methodology: Theory and Practice
Pointing gesture recognition based on 3D-tracking of face, hands and head orientation
Proceedings of the 5th international conference on Multimodal interfaces
EyeWindows: evaluation of eye-controlled zooming windows for focus selection
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
ViewPointer: lightweight calibration-free eye tracking for ubiquitous handsfree deixis
Proceedings of the 18th annual ACM symposium on User interface software and technology
Visual resonator: interface for interactive cocktail party phenomenon
CHI '06 Extended Abstracts on Human Factors in Computing Systems
OZCHI '06 Proceedings of the 18th Australia conference on Computer-Human Interaction: Design: Activities, Artefacts and Environments
Recognition of hearing needs from body and eye movements to improve hearing instruments
Pervasive'11 Proceedings of the 9th international conference on Pervasive computing
Identification of relevant multimodal cues to enhance context-aware hearing instruments
Proceedings of the 6th International Conference on Body Area Networks
An often-heard complaint about hearing aids is that their amplification of environmental noise makes it difficult for users to focus on one particular speaker. In this paper, we present a new prototype Attentive Hearing Aid (AHA) based on ViewPointer, a wearable calibration-free eye tracker. With AHA, users need only look at the person they are listening to for that speaker's voice to be amplified in their hearing aid. We present a preliminary evaluation of the use of eye input by hearing-impaired users for switching between simultaneous speakers. We compared eye input with manual source selection through pointing and remote control buttons. Results show eye input was 73% faster than selection by pointing and 58% faster than button selection. In terms of recall of the material presented, eye input performed 80% better than traditional hearing aids, 54% better than buttons, and 37% better than pointing. Participants rated eye input highest in the "easiest", "most natural", and "best overall" categories.