Face as mouse through visual face tracking

  • Authors:
  • Jilin Tu; Hai Tao; Thomas Huang

  • Affiliations:
  • Electrical and Computer Engineering Department, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA; Department of Computer Engineering, University of California at Santa Cruz, Santa Cruz, CA 95064, USA; Electrical and Computer Engineering Department, University of Illinois at Urbana-Champaign, Urbana, IL 61801, USA

  • Venue:
  • Computer Vision and Image Understanding
  • Year:
  • 2007


Abstract

This paper introduces a novel camera mouse driven by visual face tracking based on a 3D model. As cameras become standard equipment on personal computers (PCs) and computation speeds increase, achieving human-machine interaction through visual face tracking becomes a feasible approach to hands-free control. Human facial movements can be decomposed into rigid motions, such as rotation and translation, and non-rigid motions, such as opening, closing, and stretching of the mouth. First, we describe our face tracking system, which can robustly and accurately retrieve these motion parameters from video in real time [H. Tao, T. Huang, Explanation-based facial motion tracking using a piecewise Bezier volume deformation model, in: Proceedings of IEEE Computer Vision and Pattern Recognition, vol. 1, 1999, pp. 611-617]. The retrieved rigid motion parameters are used to navigate the mouse cursor, while the detection of non-rigid mouth motions triggers mouse events in the operating system. Three mouse control modes are investigated and their usability is compared. Experiments in the Windows XP environment verify the convenience of our camera mouse for hands-free control. This technology can serve as an alternative input option for people with hand and speech disabilities, as well as for future vision-based games and interfaces.
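The control mapping the abstract describes — rigid head-pose parameters steering the cursor and a non-rigid mouth parameter triggering clicks — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, gains, and the mouth-openness threshold are assumptions chosen for clarity.

```python
def pose_to_cursor(yaw, pitch, screen_w=1920, screen_h=1080,
                   yaw_range=30.0, pitch_range=20.0):
    """Map rigid head rotation (degrees) to absolute screen coordinates.

    The neutral pose (0, 0) lands at the screen centre; the full
    working ranges (yaw_range, pitch_range) span the screen edges,
    and the result is clamped to screen bounds. All ranges here are
    illustrative assumptions, not values from the paper.
    """
    x = screen_w / 2 + (yaw / yaw_range) * (screen_w / 2)
    y = screen_h / 2 - (pitch / pitch_range) * (screen_h / 2)
    x = max(0, min(screen_w - 1, int(x)))
    y = max(0, min(screen_h - 1, int(y)))
    return x, y


def mouth_event(openness, threshold=0.5):
    """Translate a normalized mouth-openness value (0..1) into a
    mouse event, mirroring how a non-rigid motion parameter could
    trigger clicks. The 0.5 threshold is a hypothetical choice."""
    return "click" if openness > threshold else None
```

In a full system, the outputs would be fed to the operating system's cursor and event APIs each frame; mapping relative head motion to cursor velocity instead of absolute position is the kind of alternative the paper's three control modes compare.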