Real-time hands, face and facial features detection and tracking: Application to cognitive rehabilitation tests monitoring

  • Authors:
  • D. González-Ortega; F. J. Díaz-Pernas; M. Martínez-Zarzuela; M. Antón-Rodríguez; J. F. Díez-Higuera; D. Boto-Giralda

  • Affiliations:
  • (all authors) Department of Signal Theory, Communications and Telematics Engineering, Telecommunications Engineering School, University of Valladolid, Campus Miguel Delibes, Valladolid 47011, Spain

  • Venue:
  • Journal of Network and Computer Applications
  • Year:
  • 2010

Abstract

In this paper, a marker-free computer vision system for monitoring cognitive rehabilitation tests is presented. The system monitors and analyzes the correct and incorrect performance of a set of psychomotor exercises in which a hand has to touch a facial feature. The monitoring requires the detection and tracking of different human body parts. Detection of the face, eyes, nose, and hands is achieved with a set of classifiers built independently with the AdaBoost algorithm. Comparisons with other detection approaches, regarding performance and applicability to the monitoring system, are presented. Face and hand tracking is accomplished through the CAMShift algorithm, using independent, adaptive two-dimensional histograms of the chromaticity components of the TSL color space for the pixels inside these three regions. The TSL color space was selected after a study of five color spaces with regard to skin color characterization. The system is easily implemented with a consumer-grade computer and a camera, works with unconstrained background and illumination, and runs at more than 23 frames per second. In tests, the system achieved a successful monitoring rate of 97.62%. The automation of the monitoring of human body part motion, its analysis in relation to the psychomotor exercise indicated to the patient, and the storage of the results of a set of exercises free the rehabilitation experts from such demanding tasks. The vision-based system is applicable, with minor changes, to other human-computer interface tasks.
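The tracking stage described above relies on two-dimensional histograms of TSL chromaticity (the tint and saturation components). As a rough illustration, the sketch below converts RGB pixels to TSL using the common Terrillon–Akamatsu definition and builds the kind of T–S histogram that CAMShift would back-project; the abstract does not give the authors' exact TSL variant or histogram parameters, so the formulas, the 32×32 bin count, and the randomly generated "skin region" are all assumptions for illustration only.

```python
import numpy as np

def tsl_from_rgb(rgb):
    """Convert RGB values to the TSL (Tint, Saturation, Lightness) space.

    Follows the widely used Terrillon-Akamatsu definition; the paper's
    exact variant is not specified in the abstract (assumption).
    """
    rgb = np.asarray(rgb, dtype=np.float64)
    total = rgb.sum(axis=-1)
    safe_total = np.where(total == 0, 1.0, total)  # avoid divide-by-zero on black
    # Normalized chromaticities, shifted so that gray maps to the origin
    r = rgb[..., 0] / safe_total - 1.0 / 3.0
    g = rgb[..., 1] / safe_total - 1.0 / 3.0
    # Tint: angle of (r, g) around the gray point, mapped into [0, 1]
    g_safe = np.where(g == 0, 1.0, g)
    theta = np.arctan(r / g_safe) / (2.0 * np.pi)
    t = np.where(g > 0, theta + 0.25, np.where(g < 0, theta + 0.75, 0.0))
    # Saturation: radial distance from gray, normalized to [0, 1]
    s = np.sqrt(9.0 / 5.0 * (r ** 2 + g ** 2))
    # Lightness: standard luma weights
    l = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return np.stack([t, s, l], axis=-1)

# 2D tint-saturation histogram over a hypothetical detected skin region
# (random pixels stand in for a real face/hand crop):
region = np.random.default_rng(0).integers(0, 256, size=(32, 32, 3))
tsl = tsl_from_rgb(region)
hist, _, _ = np.histogram2d(tsl[..., 0].ravel(), tsl[..., 1].ravel(),
                            bins=32, range=[[0.0, 1.0], [0.0, 1.0]])
```

Ignoring lightness and histogramming only T and S is what makes the model tolerant of the unconstrained illumination mentioned in the abstract; in a full tracker the histogram would be recomputed adaptively from each tracked region and back-projected for CAMShift.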