Eye trackers have been used as pointing devices for a number of years. Due to inherent limitations in the accuracy of eye gaze, however, interaction is restricted to objects spanning at least one degree of visual angle. Consequently, targets in gaze-based interfaces have sizes and layouts quite different from those of natural settings. To accommodate these accuracy constraints, we developed a multimodal pointing technique that combines eye gaze and speech input. We tested the technique in a user study on pointing at multiple targets. The results suggest that, in terms of the footprint-accuracy tradeoff, pointing performance is best (~93%) for targets subtending 0.85 degrees of visual angle with 0.3-degree gaps between them. User performance thus approaches the limit of practical gaze pointing, and a hands-free user interface whose design resembles today's common interfaces is feasible.
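The core idea of such a technique can be sketched as follows: the eye tracker continuously reports a gaze point, and a recognized speech command triggers selection of the target nearest that point, rejecting the selection if no target lies within the tracker's expected error. This is a minimal illustrative sketch, not the paper's implementation; the `Target` class, `select_target` function, coordinate units (degrees of visual angle), and the one-degree error threshold are all assumptions made for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float  # horizontal position, degrees of visual angle
    y: float  # vertical position, degrees of visual angle

def select_target(gaze_x, gaze_y, targets, max_error_deg=1.0):
    """Called when a speech command (e.g. 'select') is recognized.

    Picks the target nearest the current gaze point, but only if it
    falls within the assumed gaze-tracking error (~1 degree of
    visual angle); otherwise returns None (no selection).
    """
    best, best_dist = None, float("inf")
    for t in targets:
        d = math.hypot(t.x - gaze_x, t.y - gaze_y)
        if d < best_dist:
            best, best_dist = t, d
    return best if best_dist <= max_error_deg else None

# Two targets laid out as in the study's best case: 0.85-degree
# targets with 0.3-degree gaps, i.e. centers 1.15 degrees apart.
targets = [Target("left", 0.0, 0.0), Target("right", 1.15, 0.0)]
print(select_target(0.2, 0.1, targets))   # nearest is "left"
print(select_target(5.0, 5.0, targets))   # too far from any target -> None
```

With center-to-center spacing of 1.15 degrees, a gaze estimate that lands even halfway between neighbors still resolves to the nearer target, which is why the speech trigger (rather than dwell time) is what commits the selection.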