We study the feasibility of the following idea: could a system learn to infer the relevance of real-world objects from the user's natural eye movements, given a set of training data the user produced by clicking a "relevance" button during a learning session? If so, the combination of eye tracking and machine learning would provide a basis for "natural" interaction with the system simply by looking around, which would be very useful in mobile proactive setups. We measured users' eye movements while they explored an artificial art gallery, and they labeled the relevant paintings by clicking a button while looking at them. The results show that a Gaussian process classifier with a time-series kernel on the eye movements within an object predicts whether that object is relevant with better accuracy than dwell-time thresholding and random guessing.
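The comparison described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses synthetic per-object gaze summaries (total dwell time and fixation count are assumed features) and an RBF kernel over those summaries as a stand-in for the time-series kernel applied to raw eye-movement sequences, with a median-dwell threshold as the baseline.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Synthetic per-object gaze features: [total dwell time (s), fixation count].
# Assumption: relevant objects attract longer and more frequent fixations.
n = 200
relevant = rng.integers(0, 2, n)
dwell = rng.gamma(2.0, 0.3 + 0.4 * relevant)      # longer dwell if relevant
fixations = rng.poisson(2 + 3 * relevant)          # more fixations if relevant
X = np.column_stack([dwell, fixations])
y = relevant

X_train, X_test = X[:150], X[150:]
y_train, y_test = y[:150], y[150:]

# Gaussian process classifier; the RBF kernel on summary features stands in
# for the paper's time-series kernel on eye-movement trajectories.
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(1.0), random_state=0)
gpc.fit(X_train, y_train)
gp_acc = gpc.score(X_test, y_test)

# Baseline: dwell-time thresholding at the training-set median dwell time.
threshold = np.median(X_train[:, 0])
baseline_acc = np.mean((X_test[:, 0] > threshold) == y_test)

print(f"GP accuracy: {gp_acc:.2f}, dwell-threshold baseline: {baseline_acc:.2f}")
```

On data where relevance also shapes fixation patterns, not just total dwell, the classifier can exploit information that a single dwell threshold cannot, which mirrors the comparison reported in the abstract.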