A facial feature tracker for human-computer interaction based on 3D Time-Of-Flight cameras

  • Authors:
  • Martin Böhme; Martin Haker; Thomas Martinetz; Erhardt Barth

  • Affiliations:
  • Institute for Neuro- and Bioinformatics, University of Lübeck, Ratzeburger Allee 160, D-23538 Lübeck, Germany (all authors)

  • Venue:
  • International Journal of Intelligent Systems Technologies and Applications
  • Year:
  • 2008

Abstract

We describe a facial feature tracker based on the combined range and amplitude data provided by a 3D Time-Of-Flight camera. We use this tracker to implement a head mouse, an alternative input device for people who have limited use of their hands. The facial feature tracker is based on geometric features that are related to the intrinsic dimensionality of multidimensional signals. We show how the position of the nose in the image can be determined robustly using a very simple bounding-box classifier, trained on a set of labelled sample images. Despite its simplicity, the classifier generalises well to subjects that it was not trained on. An important result is that the combination of range and amplitude data dramatically improves robustness compared to a single type of data. The tracker runs in real time at around 30 frames per second. We demonstrate its potential as an input device by using it to control Dasher, an alternative text input tool.
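The abstract itself contains no code; the sketch below is only an illustration of the kind of bounding-box classifier it describes. The class name BoundingBoxClassifier, the margin parameter, and the assumption that each candidate pixel is represented by a feature vector stacking range- and amplitude-derived geometric features are illustrative choices, not the authors' implementation.

```python
import numpy as np

class BoundingBoxClassifier:
    """Minimal bounding-box classifier (illustrative sketch).

    A candidate is accepted if every component of its feature vector lies
    inside the axis-aligned box spanned by the positive training examples,
    optionally widened by a relative margin.
    """

    def __init__(self, margin=0.0):
        self.margin = margin   # fraction of the feature range used to widen the box
        self.lower = None
        self.upper = None

    def fit(self, positive_features):
        # positive_features: (n_samples, n_features) array of feature vectors
        # computed at hand-labelled nose positions in the training images.
        positive_features = np.asarray(positive_features, dtype=float)
        lo = positive_features.min(axis=0)
        hi = positive_features.max(axis=0)
        span = hi - lo
        self.lower = lo - self.margin * span
        self.upper = hi + self.margin * span
        return self

    def predict(self, features):
        # features: (n_candidates, n_features) array of feature vectors,
        # e.g. range- and amplitude-based features stacked per pixel.
        # Returns a boolean mask of candidates falling inside the box.
        features = np.asarray(features, dtype=float)
        return np.all((features >= self.lower) & (features <= self.upper), axis=1)
```

In use, one would compute the feature vector at every candidate pixel of a frame, keep the pixels whose vectors fall inside the box, and take a representative of that set (for example the candidate closest to the box centre) as the nose position for the head-mouse cursor.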