3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker

  • Authors:
  • Susan M. Munn; Jeff B. Pelz

  • Affiliations:
  • Rochester Institute of Technology, Rochester, NY (both authors)

  • Venue:
  • Proceedings of the 2008 Symposium on Eye Tracking Research & Applications
  • Year:
  • 2008


Abstract

To study an observer's eye movements during realistic tasks, the observer should be free to move naturally throughout our three-dimensional world. A technique is therefore proposed to determine an observer's point-of-regard (POR), as well as his or her motion through a scene, in three dimensions with minimal user input. This requires robust feature tracking and calibration of the scene camera in order to determine the camera's 3D position and orientation in the world. With this information, calibrated 2D PORs can be triangulated to 3D positions in the world; the scale of the world coordinate system is recovered by supplying the distance between two known points in the scene. Information about scene-camera movement and tracked features is also used to obtain the observer's position and head orientation for every video frame. The final observer motion -- including the observer's positions and head orientations -- and PORs are expressed in 3D world coordinates. The result is knowledge not only of eye movements but of head movements as well, allowing evaluation of how an observer combines head and eye movements to perform a visual task. Additionally, this 3D information opens the door to many more options for visualizing eye-tracking results.
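The two geometric steps named in the abstract -- triangulating a calibrated 2D POR from two known scene-camera poses, and fixing the world scale from a known distance -- can be sketched as below. This is a generic linear (DLT) two-view triangulation with NumPy, not the authors' exact implementation; the function names and matrices are illustrative.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two views.

    P1, P2: 3x4 camera projection matrices (world -> image) for two
            scene-camera poses. x1, x2: the 2D observation (e.g., a
            calibrated POR) in each view. Returns the 3D world point.
    """
    # Each observation contributes two linear constraints A @ X = 0
    # on the homogeneous 3D point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The null vector of A (last right-singular vector) is X.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize

def world_scale(p_a, p_b, known_distance):
    """Scale factor that maps the up-to-scale reconstruction to metric
    units, given the true distance between two reconstructed points."""
    return known_distance / np.linalg.norm(np.asarray(p_a) - np.asarray(p_b))

# Illustrative usage: two poses related by a pure sideways translation.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = triangulate_dlt(P1, P2, (0.1, 0.2), (0.0, 0.2))  # ~[1, 2, 10]
```

In practice the projection matrices would come from the tracked features and scene-camera calibration described in the paper, and `world_scale` would be applied to all reconstructed PORs and observer positions.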