We propose a new way of analyzing pupil measurements made in conjunction with eye tracking: fixation-aligned pupillary response averaging, in which short windows of continuous pupil measurements are selected based on patterns in eye tracking data, temporally aligned, and averaged together. Such short pupil data epochs can be selected based on fixations on a particular spot or a scan path. The windows of pupil data thus selected are aligned by temporal translation and linear warping to place corresponding parts of the gaze patterns at corresponding times and then averaged together. This approach enables the measurement of quick changes in cognitive load during visual tasks, in which task components occur at unpredictable times but are identifiable via gaze data. We illustrate the method through example analyses of visual search and map reading. We conclude with a discussion of the scope and limitations of this new method.
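The core procedure described above — selecting gaze-defined epochs of pupil data, aligning them by temporal translation and linear warping onto a common time base, and averaging — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the `(start, end)` epoch representation, and the fixed-length resampling grid are all assumptions made for clarity.

```python
import numpy as np

def align_and_average(pupil, times, epochs, n_samples=100):
    """Sketch of fixation-aligned pupillary response averaging.

    pupil    : 1-D array of pupil-diameter samples
    times    : matching 1-D array of sample timestamps (seconds)
    epochs   : list of (start, end) times, one per gaze-defined event
                (e.g. fixations on a particular spot), assumed here
    n_samples: length of the common time base all epochs are warped to

    Each epoch is translated and linearly warped onto a shared grid of
    n_samples points, then all warped epochs are averaged pointwise.
    """
    warped = []
    for start, end in epochs:
        # Lay an n_samples grid over this epoch; because every epoch
        # maps onto the same grid, epochs of different durations are
        # linearly warped into temporal register with one another.
        grid = np.linspace(start, end, n_samples)
        # Resample the continuous pupil trace onto the grid
        # (linear interpolation between recorded samples).
        warped.append(np.interp(grid, times, pupil))
    # Pointwise average across all aligned epochs.
    return np.mean(np.stack(warped), axis=0)
```

In practice one would subtract a pre-event baseline from each epoch before averaging and require a minimum number of epochs for a stable mean; those refinements are omitted here for brevity.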