Memory representations in natural tasks
Journal of Cognitive Neuroscience
Our knowledge of the way that the visual system operates in everyday behavior has, until recently, been very limited. This information is critical not only for understanding visual function, but also for understanding the consequences of various kinds of visual impairment, and for the development of interfaces between human and artificial systems. The development of eye trackers that can be mounted on the head now allows gaze to be monitored without restricting the observer's movements. Observations of natural behavior demonstrate the highly task-specific and directed nature of fixation patterns, and reveal considerable regularity between observers. Eye, head, and hand coordination also shows much greater flexibility and task specificity than previously supposed. Experimental examination of the issues raised by these observations requires complex virtual environments that the experimenter can manipulate at critical points during task performance. Experiments in which we monitored gaze in a simulated driving environment demonstrate that the visibility of task-relevant information depends critically on active search initiated by the observer according to an internally generated schedule, and that this schedule depends on learnt regularities in the environment. In another virtual environment, where observers copied toy models, we showed that observers use regularities in the spatial structure of the scene to control eye-movement targeting. Other experiments in a virtual environment with haptic feedback show that even simple visual properties such as size are not continuously available or processed automatically by the visual system, but are dynamically acquired and discarded according to momentary task demands.
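To make the experimental logic concrete, the sketch below illustrates the kind of gaze-contingent manipulation the abstract describes: a virtual environment is polled at the eye tracker's sample rate, and a scene change is triggered at a critical point defined by where the observer is looking. This is a minimal hypothetical illustration only; all names (get_gaze_sample, TRIGGER_REGION, swap_object) are assumptions for exposition, not the authors' apparatus or any specific tracker's API, and the gaze stream is stubbed with random samples so the script runs standalone.

```python
# Hypothetical gaze-contingent trial loop (illustration only).
# A real setup would replace get_gaze_sample() with eye-tracker reads
# and swap_object() with a change to the rendered virtual scene.

import random
import time

# Screen region whose inspection triggers the manipulation, e.g. a block
# in a model-copying task (x_min, y_min, x_max, y_max, in pixels).
TRIGGER_REGION = (400, 300, 600, 450)

def get_gaze_sample():
    """Stand-in for an eye-tracker read; returns an (x, y) gaze position."""
    return random.uniform(0, 1024), random.uniform(0, 768)

def in_region(gaze, region):
    """True if the gaze sample falls inside the rectangular region."""
    x, y = gaze
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

def swap_object():
    """Placeholder for the scene manipulation applied at the critical point."""
    print("critical point reached: scene manipulated")

def run_trial(duration_s=2.0, sample_hz=250):
    """Poll gaze at the tracker rate; fire the manipulation once,
    the first time gaze enters the trigger region."""
    triggered = False
    for _ in range(int(duration_s * sample_hz)):
        gaze = get_gaze_sample()
        if not triggered and in_region(gaze, TRIGGER_REGION):
            swap_object()
            triggered = True
        time.sleep(1.0 / sample_hz)

if __name__ == "__main__":
    run_trial()
```

The design point is that the manipulation is conditioned on the observer's own sampling behavior rather than on a fixed schedule, which is what lets such experiments test whether task-relevant information is acquired by active, internally scheduled search.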