Tracking hand and finger movements for behaviour analysis

  • Authors:
  • Enrica Dente, Anil Anthony Bharath, Jeffrey Ng, Aldert Vrij, Samantha Mann, Anthony Bull

  • Affiliations:
  • Enrica Dente, Anil Anthony Bharath, Jeffrey Ng, Anthony Bull: Faculty of Engineering, Department of Bioengineering, Imperial College London, Vision Research Group, London, United Kingdom; Aldert Vrij, Samantha Mann: University of Portsmouth, United Kingdom

  • Venue:
  • Pattern Recognition Letters - Special issue on vision for crime detection and prevention
  • Year:
  • 2006

Abstract

In this paper, we describe ongoing work into methods for the automated tracking of hand and finger movements in interview situations. The aim of this work is to aid visual behaviour analysis in studies of deception detection. Existing techniques for tracking hand and finger movements are reviewed to place current and future work into context. Posterior probability maps of skin tone, based on Parzen colour space probability density estimates, are used for initial hand segmentation. Blob features are then used to produce a markup of hand-states. A complex wavelet decomposition, coupled to weightings provided by the posterior probability map, is applied to detect small hand and finger movements. We discuss our hand tracking algorithm based on blob feature extraction and the results obtained from motion and orientation parameters in a "high-stakes experiment" designed around a real-life situation. We suggest a role for kinematic models of upper body, limb and finger motion in future work.
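The segmentation stage described in the abstract, a per-pixel posterior probability of skin tone derived from Parzen (kernel) density estimates in colour space, can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes Gaussian Parzen kernels over normalised (r, g) chromaticity and labelled skin/non-skin training pixels, and the function names, bandwidth and prior are hypothetical choices introduced here for clarity.

```python
# Sketch of a Parzen-window skin-colour posterior map (illustrative only).
# `skin_samples`, `nonskin_samples`, `bandwidth` and `prior_skin` are
# assumptions for this example, not values from the paper.
import numpy as np

def parzen_density(samples, points, bandwidth=0.02):
    """Parzen (kernel) density estimate at `points` from training `samples`.

    samples: (N, d) training colour vectors (e.g. normalised r, g).
    points:  (M, d) colour vectors at which to evaluate the density.
    """
    n, d = samples.shape
    diffs = points[:, None, :] - samples[None, :, :]          # (M, N, d)
    sq_dist = np.sum(diffs ** 2, axis=-1)                     # (M, N)
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)        # Gaussian kernel normaliser
    kernels = np.exp(-0.5 * sq_dist / bandwidth ** 2) / norm
    return kernels.mean(axis=1)                               # average over kernels -> (M,)

def skin_posterior_map(image, skin_samples, nonskin_samples, prior_skin=0.3):
    """Per-pixel posterior P(skin | colour) via Bayes' rule on Parzen densities.

    For large images one would in practice evaluate the densities on a
    subsampled colour grid or a lookup table rather than per pixel.
    """
    h, w, _ = image.shape
    rgb = image.reshape(-1, 3).astype(np.float64) + 1e-6
    # Normalised chromaticity (r, g) discards most of the brightness variation.
    chroma = rgb[:, :2] / rgb.sum(axis=1, keepdims=True)
    p_skin = parzen_density(skin_samples, chroma)
    p_nonskin = parzen_density(nonskin_samples, chroma)
    evidence = prior_skin * p_skin + (1.0 - prior_skin) * p_nonskin
    posterior = prior_skin * p_skin / np.maximum(evidence, 1e-12)
    return posterior.reshape(h, w)
```

Thresholding such a posterior map yields candidate hand regions; their connected components (blobs) could then supply area, centroid and orientation features of the kind the abstract uses for hand-state markup and tracking, and the same map could provide the per-pixel weightings coupled to a complex wavelet decomposition for detecting small movements.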