An interaction system using mixed hand gestures

  • Authors:
  • Zhong Yang;Yi Li;Yang Zheng;Weidong Chen;Xiaoxiang Zheng

  • Affiliations:
  • Zhejiang University, Hangzhou, Zhejiang, China (all authors)

  • Venue:
  • Proceedings of the 10th Asia Pacific Conference on Computer Human Interaction
  • Year:
  • 2012


Abstract

This paper presents a mixed hand gesture interaction system for virtual environments, where "mixed" means that static and dynamic hand gestures are combined for both navigation and object manipulation. First, a simple average background model together with skin color is used to segment the hand region. A state-based spotting algorithm then automatically distinguishes the two types of hand gestures. Static gestures are classified quickly with a voting-based method, while dynamic gestures are recognized with a hidden Markov model (HMM). Because HMM training requires consistency among the feature sequences produced by feature extraction, a data alignment algorithm is proposed. With the mixed hand gesture system, users can issue complex operating commands in a natural way. The experimental results demonstrate that the proposed methods are effective and accurate.
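
The abstract does not give implementation details. As an illustration only, the following is a minimal sketch of how hand segmentation from an average background model combined with a skin-color threshold might look; the use of OpenCV, the YCrCb skin-color bounds, and all parameter values are assumptions, not taken from the paper.

```python
import cv2
import numpy as np

# Hypothetical parameters -- the paper does not specify these values.
BG_ALPHA = 0.02                      # learning rate of the running-average background
DIFF_THRESH = 25                     # foreground threshold on absolute difference
SKIN_LOW = np.array([0, 133, 77])    # commonly used YCrCb skin-color bounds
SKIN_HIGH = np.array([255, 173, 127])

def update_background(frame, background):
    """Maintain a simple running-average background model."""
    cv2.accumulateWeighted(frame.astype(np.float32), background, BG_ALPHA)
    return background

def segment_hand(frame, background):
    """Combine background subtraction with a skin-color mask."""
    # Foreground: pixels that differ enough from the averaged background.
    diff = cv2.absdiff(frame, cv2.convertScaleAbs(background))
    fg_mask = (cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY) > DIFF_THRESH).astype(np.uint8) * 255

    # Skin color: threshold in YCrCb space.
    ycrcb = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)

    # Hand region: pixels that are both foreground and skin-colored.
    return cv2.bitwise_and(fg_mask, skin_mask)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    background = frame.astype(np.float32)
    while ok:
        background = update_background(frame, background)
        cv2.imshow("hand", segment_hand(frame, background))
        if cv2.waitKey(1) == 27:   # Esc to quit
            break
        ok, frame = cap.read()
    cap.release()
```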
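
Likewise, the abstract only names the two recognizers. A minimal sketch of voting-based static classification and per-class HMM scoring for dynamic gestures, assuming frame-wise static-gesture labels and Gaussian HMMs from the hmmlearn package (an assumption, not the authors' implementation), could look like:

```python
from collections import Counter
import numpy as np
from hmmlearn import hmm

def classify_static(frame_labels):
    """Voting-based static gesture classification:
    the label predicted for the most frames wins."""
    return Counter(frame_labels).most_common(1)[0][0]

def train_dynamic_models(sequences_by_gesture, n_states=5):
    """Train one Gaussian HMM per dynamic gesture class.
    sequences_by_gesture: {gesture_name: [np.ndarray of shape (T_i, D), ...]}"""
    models = {}
    for name, seqs in sequences_by_gesture.items():
        X = np.concatenate(seqs)              # stacked feature frames
        lengths = [len(s) for s in seqs]      # per-sequence lengths
        model = hmm.GaussianHMM(n_components=n_states,
                                covariance_type="diag", n_iter=50)
        model.fit(X, lengths)
        models[name] = model
    return models

def classify_dynamic(models, sequence):
    """Recognize a dynamic gesture as the HMM with the highest log-likelihood."""
    return max(models, key=lambda name: models[name].score(sequence))
```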