Young people with severe physical disabilities may benefit greatly from participating in immersive computer games: in-game tasks can be fun, engaging, educational, and socially interactive. For those unable to use traditional input devices such as a mouse and keyboard, however, there is a barrier to interaction that must first be overcome. Eye-gaze interaction is one input method that can potentially achieve the level of interaction these games require. Which gaze interaction technique is appropriate depends on the task being performed, the individual performing it, and the equipment available. To fully realize the benefits of participating in these environments, techniques need to be adapted to the person's abilities. We describe an approach to designing and adapting a gaze interaction technique to support locomotion, a task central to immersive game playing, and evaluate it with a group of young people with cerebral palsy and muscular dystrophy. The results show that, with an adapted interaction technique, participants significantly improve their in-game character control.
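To make the idea of an adaptable gaze locomotion technique concrete, the sketch below models one common scheme: gaze regions on screen are mapped to movement commands, with a per-user dwell threshold that can be tuned to the player's abilities (a longer dwell reduces accidental triggers, the classic "Midas touch" problem). This is an illustrative sketch only; the `GazeLocomotion` class, its region layout, and its parameters are assumptions for exposition, not the technique evaluated in the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeLocomotion:
    """Hypothetical region-plus-dwell gaze locomotion controller.

    Gaze samples arrive as normalized screen coordinates in [0, 1].
    The screen edges map to movement commands; the centre is neutral.
    `dwell_ms` is the adaptation parameter: it can be lengthened for
    users with less stable gaze, or shortened for faster control.
    """
    dwell_ms: float = 400.0     # per-user dwell threshold (the adapted parameter)
    _region: str = "center"     # region the gaze currently rests in
    _elapsed: float = 0.0       # time spent in that region so far

    def region_for(self, x: float, y: float) -> str:
        # Top/bottom bands drive forward/backward, side bands turn.
        if y < 0.25:
            return "forward"
        if y > 0.75:
            return "backward"
        if x < 0.25:
            return "turn_left"
        if x > 0.75:
            return "turn_right"
        return "center"

    def update(self, x: float, y: float, dt_ms: float) -> Optional[str]:
        """Feed one gaze sample; return a command once dwell is satisfied."""
        region = self.region_for(x, y)
        if region != self._region:
            # Gaze moved to a new region: restart the dwell timer.
            self._region, self._elapsed = region, 0.0
            return None
        self._elapsed += dt_ms
        if region != "center" and self._elapsed >= self.dwell_ms:
            self._elapsed = 0.0  # re-arm so the command repeats while dwelling
            return region
        return None
```

In this formulation, adapting the technique to an individual amounts to adjusting `dwell_ms` (and potentially the region boundaries) until the trade-off between responsiveness and accidental activation suits that player.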