Gaze data contains valuable information about a user's cognitive processes during the execution of a task. To use this information, e.g., for studying user strategies or for designing new gaze-based interaction techniques for HCI, the gaze data must be aligned with the task the user is executing. In this paper we propose a novel framework, based on the theory of Markov Decision Processes, for putting gaze data into context, allowing gaze position and movement to be interpreted automatically with respect to the task performed by the user. The model can be used for both offline and online analysis of gaze data. We evaluate the proposed model on an indirect object manipulation task and demonstrate how it can be used for intention recognition and/or for detecting a mismatch in the user's mental model.
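To illustrate the general idea (not the paper's actual model), the sketch below shows how gaze fixations could be aligned with task state online via a recursive Bayesian belief update over a small Markov model of task phases. All names, states, and probabilities here are invented for illustration; the paper's MDP-based framework is more general than this simple forward-filtering example.

```python
import numpy as np

# Hypothetical task model: states are task phases, observations are
# gaze regions (which scene object the current fixation falls on).
STATES = ["acquire", "place"]      # assumed task phases (illustrative)
OBS = {"object": 0, "target": 1}   # assumed gaze regions (illustrative)

# Assumed phase-to-phase transition probabilities (rows sum to 1)
T = np.array([[0.8, 0.2],
              [0.1, 0.9]])

# Assumed likelihood of fixating each region in each phase
E = np.array([[0.9, 0.1],   # while acquiring, gaze is mostly on the object
              [0.2, 0.8]])  # while placing, gaze is mostly on the target

def infer_phase(fixations, prior=(0.5, 0.5)):
    """Forward-algorithm belief update: returns P(phase | fixation sequence)."""
    belief = np.array(prior, dtype=float)
    for fix in fixations:
        belief = (T.T @ belief) * E[:, OBS[fix]]  # predict, then weight by gaze
        belief /= belief.sum()                    # normalize to a distribution
    return belief

belief = infer_phase(["object", "object", "target", "target"])
print(STATES[int(belief.argmax())])  # most likely current task phase
```

Run online, a belief like this supports intention recognition (the most probable phase) and can flag a mental-model mismatch when the observed fixations are persistently improbable under every phase.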