Online human gesture recognition from motion data streams

  • Authors:
  • Xin Zhao; Xue Li; Chaoyi Pang; Xiaofeng Zhu; Quan Z. Sheng

  • Affiliations:
  • The University of Queensland, Brisbane, Australia; The University of Queensland, Brisbane, Australia; CSIRO, Brisbane, Australia; Guangxi Normal University, Guilin, China; The University of Adelaide, Adelaide, Australia

  • Venue:
  • Proceedings of the 21st ACM international conference on Multimedia
  • Year:
  • 2013

Abstract

Online human gesture recognition has a wide range of applications in computer vision, especially in human-computer interaction. The recent introduction of cost-effective depth cameras has sparked a new wave of research on body-movement gesture recognition. However, two major challenges remain: i) how to continuously recognize gestures from unsegmented streams, and ii) how to differentiate varying styles of the same gesture from other types of gestures. In this paper, we address both problems with a new, effective, and efficient feature extraction method that uses a dynamic matching approach to construct a feature vector for each frame, increasing sensitivity to features that distinguish different gestures while decreasing sensitivity to variations among gestures of the same class. Our comprehensive experiments on the MSRC-12 Kinect Gesture and MSR-Action3D datasets demonstrate superior performance over state-of-the-art approaches.
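
The abstract does not give implementation details of the dynamic matching step, so the following is only a minimal sketch of one plausible reading: a per-frame feature vector built from dynamic-time-warping scores between a sliding window of the unsegmented stream and a set of exemplar gesture templates. All names (`dtw_distance`, `frame_feature_vector`, the window size `W`, and the toy data) are hypothetical and not taken from the paper.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic-time-warping distance between two joint-coordinate sequences."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def frame_feature_vector(window, templates):
    """Build a per-frame feature vector: one matching score per gesture template.

    `window` is the most recent W frames of the unsegmented stream (each frame a
    flattened array of joint coordinates); `templates` is a list of exemplar
    gesture sequences. Both are hypothetical inputs for this sketch.
    """
    return np.array([dtw_distance(window, t) for t in templates])

# Toy usage: 20 joints x 3 coordinates per frame, sliding window over the stream.
rng = np.random.default_rng(0)
templates = [rng.standard_normal((30, 60)) for _ in range(5)]  # 5 exemplar gestures
stream = rng.standard_normal((200, 60))                        # unsegmented frame stream
W = 30
for t in range(W, len(stream)):
    feat = frame_feature_vector(stream[t - W:t], templates)
    # `feat` would then feed a per-frame classifier for online recognition.
```

In such a scheme, scores against templates of different gesture classes make the vector discriminative across classes, while warping absorbs speed and style variation within a class; whether the paper uses DTW specifically or another dynamic matching formulation is not stated in this abstract.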