Learning relevance from natural eye movements in pervasive interfaces

  • Authors:
  • Melih Kandemir; Samuel Kaski

  • Affiliations:
  • Aalto University School of Science, Espoo, Finland (both authors)

  • Venue:
  • Proceedings of the 14th ACM international conference on Multimodal interaction
  • Year:
  • 2012

Abstract

We study the feasibility of the following idea: could a system learn to infer the relevance of real-world objects from the user's natural eye movements, given a set of training data the user produced by clicking a "relevance" button during a learning session? If the answer is yes, the combination of eye tracking and machine learning would provide a basis for "natural" interaction with the system simply by looking around, which would be very useful in mobile proactive setups. We measured the eye movements of users while they explored an artificial art gallery; they labeled the relevant paintings by clicking a button while looking at them. The results show that a Gaussian process classifier, accompanied by a time series kernel on the eye movements within an object, predicts whether that object is relevant with better accuracy than dwell-time thresholding and random guessing.
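The setup described in the abstract can be sketched in code. The snippet below is a simplified illustration, not the authors' method: it stands in for the paper's time series kernel with an RBF kernel on fixation-duration sequences resampled to a fixed length, approximates GP classification with GP regression on ±1 labels (predicting by the sign of the posterior mean), and compares against the dwell-time-thresholding baseline. All data here is synthetic and all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(seq, n=20):
    """Linearly interpolate a variable-length sequence to fixed length
    (a stand-in for a proper time series kernel over gaze sequences)."""
    idx = np.linspace(0, len(seq) - 1, n)
    return np.interp(idx, np.arange(len(seq)), seq)

def rbf_kernel(X, Y, gamma=0.5):
    # squared-distance RBF kernel between rows of X and Y
    d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def make_seq(relevant):
    """Hypothetical per-object fixation-duration sequence (seconds);
    relevant objects tend to attract longer fixations."""
    n = rng.integers(5, 15)
    mean_dur = 0.35 if relevant else 0.20
    return rng.gamma(2.0, mean_dur / 2.0, size=n)

labels = rng.integers(0, 2, size=80)
seqs = [make_seq(r) for r in labels]
X = np.stack([resample(s) for s in seqs])
y = 2.0 * labels - 1.0  # +/-1 targets for the GP regression surrogate

tr, te = np.arange(60), np.arange(60, 80)

# GP-regression surrogate for the classifier: posterior mean on test points
K = rbf_kernel(X[tr], X[tr])
alpha = np.linalg.solve(K + 0.1 * np.eye(len(tr)), y[tr])  # noise var 0.1
pred = np.sign(rbf_kernel(X[te], X[tr]) @ alpha)
gp_acc = (pred == y[te]).mean()

# Dwell-time baseline: threshold total gaze duration at the training median
dwell = np.array([s.sum() for s in seqs])
thr = np.median(dwell[tr])
base_acc = ((dwell[te] > thr) == (labels[te] == 1)).mean()

print(f"kernel classifier accuracy: {gp_acc:.2f}")
print(f"dwell-time baseline accuracy: {base_acc:.2f}")
```

On synthetic data like this, both methods beat chance; the paper's finding is that on real gaze data the kernel classifier outperforms the dwell-time baseline.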