Eye-aware applications have existed for a long time, but mostly for very special and restricted target populations. We have designed and are currently implementing an eye-aware application, called iDict, which is a general-purpose translation aid aimed at the mass market. iDict monitors the user's gaze path while he or she is reading text written in a foreign language. When the reader encounters difficulties, iDict steps in and provides assistance with the translation. To accomplish this, the system draws on findings from reading research, a language model, and the user profile. This paper describes the idea behind the iDict application, the design problems encountered, and the key solutions for resolving them.