Guiding eye movements for better communication and augmented vision

  • Authors:
  • Erhardt Barth, Michael Dorr, Martin Böhme, Karl Gegenfurtner, Thomas Martinetz

  • Affiliations:
  • Institute for Neuro- and Bioinformatics, University of Lübeck, Lübeck, Germany (Barth, Dorr, Böhme, Martinetz)
  • Allgemeine Psychologie, Justus-Liebig-University, Gießen, Germany (Gegenfurtner)

  • Venue:
  • PIT'06 Proceedings of the 2006 international tutorial and research conference on Perception and Interactive Technologies
  • Year:
  • 2006


Abstract

This paper briefly summarises our results on gaze guidance so as to complement the demonstrations that we plan to present at the workshop. Our goal is to integrate gaze into visual communication systems by measuring and guiding eye movements. Our strategy is to predict a set of about ten salient locations and then change the probability that each of these candidates will be attended: for one candidate the probability is increased, for the others it is decreased. To increase saliency, our current implementation overlays red dots on a natural-scene movie so briefly that they are hardly perceived consciously. To decrease the probability, for example, we locally reduce the temporal frequency content of the movie. We present preliminary results showing that the three steps of this strategy are feasible. The long-term goal is to find the optimal real-time video transformation that minimises the difference between the actual and the desired eye movements without being obtrusive. Applications lie in the areas of vision-based communication, augmented vision, and learning.
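
The sketch below is not the authors' implementation; it is a minimal illustration, under assumed parameters, of how one per-frame guidance step could look once a saliency model has supplied the candidate locations. The function name `guide_gaze_frame`, the dot and patch radii, and the blending factor are all hypothetical; increasing saliency is modelled as a brief red-dot overlay at the target candidate, and decreasing it as a local temporal low-pass (blending the patch toward the previous frame) at the other candidates.

```python
import numpy as np

def guide_gaze_frame(frame, prev_frame, candidates, target_idx,
                     dot_radius=4, patch_radius=24, temporal_blend=0.6):
    """Illustrative sketch of one per-frame gaze-guidance step.

    frame, prev_frame : H x W x 3 float arrays in [0, 1] (current and
                        previous frames of the natural-scene movie).
    candidates        : list of (y, x) salient locations predicted by
                        some saliency model (not implemented here).
    target_idx        : index of the candidate whose probability of
                        being attended should be increased.
    Returns the modified frame.
    """
    out = frame.copy()
    h, w, _ = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]

    for i, (cy, cx) in enumerate(candidates):
        dist2 = (yy - cy) ** 2 + (xx - cx) ** 2
        if i == target_idx:
            # Increase saliency: overlay a small red dot (in the real
            # system this is shown only very briefly, so it is hardly
            # perceived consciously).
            dot = dist2 <= dot_radius ** 2
            out[dot] = [1.0, 0.0, 0.0]
        else:
            # Decrease saliency: reduce local temporal frequency content
            # by blending the patch toward the previous frame.
            patch = dist2 <= patch_radius ** 2
            out[patch] = (temporal_blend * prev_frame[patch]
                          + (1.0 - temporal_blend) * frame[patch])
    return out
```

In a real-time system such a step would run inside the video pipeline, with the dot overlay gated to a few frames around the moment the guidance decision is made; the hard part, as the abstract notes, is choosing the transformation so that it steers gaze without being noticed.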