We discuss a novel type of interface, the intelligent gaze-added interface, and describe the design and evaluation of a sample gaze-added operating-system interface. Gaze-added interfaces, like current gaze-based systems, allow users to execute commands using their eyes. However, while most gaze-based systems replace the functionality of other inputs with that of gaze, gaze-added interfaces simply add gaze functionality that the user can employ if and when desired. Intelligent gaze-added interfaces utilize a probabilistic algorithm and user model to interpret gaze focus and alleviate typical problems with eye-tracking data. We extended a standard WIMP operating-system interface into a new interface, IGO, that incorporates intelligent gaze-added input. In a user study, we found that users quickly adapted to the new interface and utilized gaze effectively both alone and with other inputs.
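The abstract does not detail the probabilistic algorithm, but one plausible reading is a Bayesian combination of a user-model prior over on-screen targets with a likelihood derived from how far each target lies from the noisy gaze fixation. The sketch below illustrates that idea only; all names, the Gaussian noise model, and the `sigma` parameter are illustrative assumptions, not taken from the paper.

```python
import math

def infer_target(fixation, targets, prior, sigma=30.0):
    """Return a posterior distribution over candidate targets given one fixation.

    fixation: (x, y) gaze coordinates in pixels
    targets:  dict mapping target name -> (x, y) center in pixels
    prior:    dict mapping target name -> prior probability (e.g., from a user model)
    sigma:    assumed standard deviation of gaze noise, in pixels (illustrative)
    """
    scores = {}
    for name, (tx, ty) in targets.items():
        # Squared distance from the fixation to the target center.
        d2 = (fixation[0] - tx) ** 2 + (fixation[1] - ty) ** 2
        # Isotropic Gaussian likelihood of observing this fixation for this target.
        likelihood = math.exp(-d2 / (2.0 * sigma ** 2))
        scores[name] = prior.get(name, 0.0) * likelihood
    total = sum(scores.values())
    if total == 0.0:
        return None  # no target is plausible under the prior
    # Normalize to a posterior distribution.
    return {name: s / total for name, s in scores.items()}

# Example: a fixation near "File" with a user model that already favors it.
targets = {"File": (40, 12), "Edit": (90, 12)}
prior = {"File": 0.7, "Edit": 0.3}
posterior = infer_target((45, 15), targets, prior)
```

Weighting by a user-model prior is what would let such an interface resist the jitter and drift typical of eye-tracking data: a stray fixation between two targets resolves toward the one the model expects.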