Learning relevant eye movement feature spaces across users

  • Authors: Zakria Hussain; Kitsuchart Pasupa; John Shawe-Taylor
  • Affiliations: University College London; University of Southampton; University College London
  • Venue: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications
  • Year: 2010


Abstract

In this paper we predict the relevance of images based on a low-dimensional feature space found using several users' eye movements. Each user is given an image-based search task, during which their eye movements are recorded using a Tobii eye tracker. The users also provide us with explicit feedback regarding the relevance of images. We demonstrate that by using a greedy Nyström algorithm on the eye movement features of different users, we can find a suitable low-dimensional feature space for learning. We validate the suitability of this feature space by projecting the eye movement features of a new user into this space, training an online learning algorithm using these features, and showing that the number of mistakes (regret over time) made in predicting relevant images is lower than when using the original eye movement features. We also plot Recall-Precision and ROC curves, and use a sign test to verify the statistical significance of our results.
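To make the pipeline described above concrete, the following is a minimal sketch, not the authors' implementation. It approximates the greedy Nyström step with pivoted partial Cholesky on a kernel matrix (a standard greedy column-selection criterion; the paper's exact selection rule may differ), uses the resulting low-rank factor as the learned feature space, and counts mistakes with a simple online perceptron. Function names, the residual-diagonal selection rule, and the perceptron choice are all illustrative assumptions.

```python
import numpy as np

def greedy_nystrom(K, k, tol=1e-10):
    """Greedily pick up to k columns of the PSD kernel matrix K (n x n)
    and return a factor G with K ~= G @ G.T (rank <= k).
    Selection rule (assumption): largest residual diagonal entry,
    i.e. pivoted partial Cholesky."""
    n = K.shape[0]
    residual = np.diag(K).astype(float).copy()
    G = np.zeros((n, k))
    chosen = []
    for j in range(k):
        i = int(np.argmax(residual))
        if residual[i] < tol:          # nothing left to explain
            return G[:, :j], chosen
        chosen.append(i)
        # New column of the Cholesky factor, orthogonal to earlier ones.
        col = (K[:, i] - G[:, :j] @ G[i, :j]) / np.sqrt(residual[i])
        G[:, j] = col
        residual = np.maximum(residual - col ** 2, 0.0)
    return G, chosen

def perceptron_mistakes(X, y):
    """One online pass of a perceptron over (X, y) with y in {-1, +1};
    returns the cumulative number of prediction mistakes."""
    w = np.zeros(X.shape[1])
    mistakes = 0
    for x, t in zip(X, y):
        pred = 1 if w @ x >= 0 else -1
        if pred != t:
            mistakes += 1
            w += t * x                 # perceptron update on a mistake
    return mistakes
```

Usage: build `K` from the eye-movement features of the training users, call `greedy_nystrom(K, k)` to get the low-dimensional representation, embed a new user's features into the same space, then feed them to `perceptron_mistakes` and compare the mistake count against running the perceptron on the original features.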