A gaze-responsive self-disclosing display
CHI '90 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Consider a large-format display before the user, bearing a multiplicity of “windows,” like little movies, most of them dynamic and in color. There are upwards of 20 windows, more than a person can ordinarily absorb at once. Some windows come and go, reflecting their nature as direct television linkages into real-time, real-world events. Others are non-real-time: some dynamic, others static but capable of jumping into motion. Such an ensemble of information inputs reflects the managerial world of the top-level executive of the not-too-distant electronic future: a world of brevity, fragmentation, and variety, above all one of an overwhelming onslaught of events. The multiplicity and simultaneity of such a display would ordinarily make coping with it untenable. The intent of the reported research is to introduce order and control through a dynamic, gaze-interactive interface. Making the behavior and reactivity of the “windows” contingent upon measured eye movements, the observer's point of regard, aims both to help the observer cope with the onslaught of events and, at the same time, to enable continuing close contact with that ever-changing ensemble. A simulation of such a world is described and demonstrated in the composite medium of computer, videodisc, and video special effects. Eye-tracking technology, integrated with speech and manual inputs, controls the display's visual dynamics and orchestrates its sound accompaniments. All elements combine to form a testbed for the conception generally, and to explore the associated human factors and stagecraft.
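The core mechanism the abstract describes, making window behavior contingent on the observer's point of regard, can be sketched as a dwell-based hit test: each gaze sample is matched against window rectangles, and a window is treated as "fixated" once the gaze has rested on it for some number of consecutive samples. This is a minimal illustrative sketch, not the paper's implementation; the `Window` class, field names, and the dwell threshold are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class Window:
    # Hypothetical window record: name plus a screen-space rectangle.
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx: int, gy: int) -> bool:
        """True if the gaze point (gx, gy) falls inside this window."""
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def fixated_window(windows, gaze_samples, dwell_threshold=3):
    """Return the window the gaze has dwelt on for at least
    `dwell_threshold` consecutive samples, or None if no such dwell occurs.
    The threshold stands in for a dwell *time*; a real system would use
    the tracker's sample rate to convert milliseconds to sample counts."""
    current, run = None, 0
    for gx, gy in gaze_samples:
        hit = next((w for w in windows if w.contains(gx, gy)), None)
        if hit is not None and hit is current:
            run += 1
        else:
            current, run = hit, 1
        if current is not None and run >= dwell_threshold:
            return current
    return None

# Example: two side-by-side windows, gaze resting on the second one.
windows = [Window("news", 0, 0, 100, 100), Window("stocks", 100, 0, 100, 100)]
print(fixated_window(windows, [(150, 50), (151, 52), (149, 48)]).name)
```

A fixated window found this way could then be enlarged, unmuted, or set into motion, while unattended windows quiet down, which is the kind of gaze-contingent reactivity the testbed explores.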