3D from looking: using wearable gaze tracking for hands-free and feedback-free object modelling

  • Authors:
  • Teesid Leelasawassuk; Walterio W. Mayol-Cuevas

  • Affiliations:
  • University of Bristol, Bristol, United Kingdom; University of Bristol, Bristol, United Kingdom

  • Venue:
  • Proceedings of the 2013 International Symposium on Wearable Computers
  • Year:
  • 2013

Abstract

This paper presents a method for estimating the 3D shape of an object being observed using wearable gaze tracking. Starting from a sparse environment map generated by a simultaneous localization and mapping (SLAM) algorithm, we use the gaze direction positioned in 3D to extract a model of the object under observation. By letting the user look at the object of interest, and without any feedback, the method determines 3D points-of-regard by back-projecting the user's gaze rays into the map. These 3D points-of-regard are then used as seed points for segmenting the object from captured images, and the resulting silhouettes are used to estimate the 3D shape of the object. We explore methods for removing outlier gaze points that result from the user saccading to non-object points, and for reducing the error in the shape estimation. Exploiting gaze information in this way enables the wearer of a gaze tracker to perform tasks as complex as object modelling in a hands-free and even feedback-free manner.
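
To make the gaze back-projection step concrete, the following is a minimal sketch (not the authors' implementation): a gaze ray from the estimated camera pose is intersected with the sparse SLAM point cloud, and the map point nearest to the ray is taken as the 3D point-of-regard; a crude median-distance filter stands in for the saccade outlier removal described above. All function names, thresholds, and the choice of nearest-map-point intersection are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's code): estimating a 3D
# point-of-regard by intersecting a gaze ray with a sparse SLAM point map.
import numpy as np

def point_of_regard(camera_center, gaze_dir, map_points, max_perp_dist=0.05):
    """Return the map point closest to the gaze ray, or None.

    camera_center : (3,) camera position in world coordinates (from SLAM).
    gaze_dir      : (3,) gaze direction in world coordinates.
    map_points    : (N, 3) sparse 3D map points from SLAM.
    max_perp_dist : reject points farther than this from the ray (metres).
    """
    d = gaze_dir / np.linalg.norm(gaze_dir)
    rel = map_points - camera_center                      # camera-to-point vectors
    t = rel @ d                                           # distance along the ray
    in_front = t > 0                                      # ignore points behind the camera
    perp = np.linalg.norm(rel - np.outer(t, d), axis=1)   # perpendicular distance to ray
    candidates = in_front & (perp < max_perp_dist)
    if not np.any(candidates):
        return None
    # Take the candidate nearest to the ray as the 3D point-of-regard.
    best = np.where(candidates)[0][np.argmin(perp[candidates])]
    return map_points[best]

def filter_saccade_outliers(points, max_dev=0.10):
    """Crude stand-in for outlier removal: drop points-of-regard far from
    the median fixation location (points is an (M, 3) array, max_dev in metres)."""
    med = np.median(points, axis=0)
    keep = np.linalg.norm(points - med, axis=1) < max_dev
    return points[keep]
```

In use, `point_of_regard` would be called once per gaze sample with the current SLAM camera pose, and the accumulated points would be passed through `filter_saccade_outliers` before seeding the image segmentation; the actual paper may use different intersection and outlier-rejection criteria.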