Eye-tracking dynamic scenes with humans and animals

  • Authors:
  • Ljiljana Skrba; Ian O'Connell; Carol O'Sullivan


  • Venue:
  • Proceedings of the 5th symposium on Applied perception in graphics and visualization
  • Year:
  • 2008


Abstract

In our research, we are interested in simulating realistic quadrupeds [2008]. Previous eye-tracking results have shown that faces are particularly salient in static images of animals and humans [2005; 2004]. To explore whether similar eye-movement patterns occur in dynamic scenes depicting animals, we displayed multiple 4-second (56-frame) grey-scale video clips of farm animals (goat, horse, sheep) walking and trotting. Using an EyeLink II eye-tracker, we recorded the eye movements of 7 participants, who were instructed to view the clips with a view to subsequently answering questions about the movements. As human and animal motions have been shown to activate different areas of the brain in children [2003], we also showed the participants an equal number of videos of humans walking and running. Figure 2 shows several frames from three of the video clips, with the eye fixations of one participant overlaid. It depicts the eye-movement pattern typical of most of the videos: participants first looked at the animal's head, then along the torso, finishing at the hips.