In recent years, there has been substantial research exploring how AI can contribute to Human-Computer Interaction by enabling an interface to understand a user's needs and act accordingly. Understanding user needs is especially challenging when it involves assessing high-level mental states that are not easily inferred from interface actions. In this paper, we present our results on using eye-tracking data to model such mental states during interaction with adaptive educational software, and we discuss the implications of our research for Intelligent User Interfaces.