Tracking HoG Descriptors for Gesture Recognition

  • Authors:
  • Mohamed Bécha Kaaniche; François Bremond

  • Venue:
  • AVSS '09 Proceedings of the 2009 Sixth IEEE International Conference on Advanced Video and Signal Based Surveillance
  • Year:
  • 2009

Abstract

We introduce a new HoG (Histogram of Oriented Gradients) tracker for gesture recognition. Our main contribution is to build HoG trajectory descriptors, representing local motion, which are used for gesture recognition. First, we select for each individual in the scene a set of corner points that determine textured regions where 2D HoG descriptors are computed. Second, we track these 2D HoG descriptors in order to build temporal HoG descriptors; lost descriptors are replaced by newly detected ones. Finally, we extract the local motion descriptors to learn a set of given gestures offline. A new video can then be classified according to the gesture occurring in it. Results show that the tracker performs well compared to the KLT tracker [1]. The generated local motion descriptors are validated through gesture learning and classification on the KTH action database [2].
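The core idea in the abstract, computing a per-frame HoG around a tracked interest point and concatenating the per-frame histograms over time into a trajectory descriptor, can be sketched as follows. This is an illustrative NumPy reconstruction under stated assumptions, not the authors' implementation: the function names `hog_descriptor` and `temporal_hog` are hypothetical, and the binning choices (unsigned gradient orientations in [0, π), a single cell per patch, L1 normalization) are assumptions made for brevity.

```python
import numpy as np

def hog_descriptor(patch, n_bins=8):
    """Single-cell HoG of a grayscale patch: a magnitude-weighted
    histogram of unsigned gradient orientations (assumed scheme)."""
    gy, gx = np.gradient(patch.astype(float))       # image gradients
    mag = np.hypot(gx, gy)                          # gradient magnitude
    ang = np.mod(np.arctan2(gy, gx), np.pi)         # orientation in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    s = hist.sum()
    return hist / s if s > 0 else hist              # L1-normalized

def temporal_hog(patches, n_bins=8):
    """Temporal HoG descriptor: per-frame HoGs of one tracked patch,
    concatenated along the trajectory (local motion descriptor)."""
    return np.concatenate([hog_descriptor(p, n_bins) for p in patches])
```

In the full system each descriptor would be anchored at a detected corner point and re-localized frame to frame; here the tracking step is elided and `temporal_hog` simply receives the sequence of patches extracted along one trajectory.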