Upper body gesture recognition for human-robot interaction

  • Authors:
  • Chi-Min Oh; Md. Zahidul Islam; Jun-Sung Lee; Chil-Woo Lee; In-So Kweon

  • Affiliations:
  • Chonnam National University, Korea; Korea Advanced Institute of Science and Technology, Korea

  • Venue:
  • HCII'11 Proceedings of the 14th international conference on Human-computer interaction: interaction techniques and environments - Volume Part II
  • Year:
  • 2011

Abstract

This paper proposes a vision-based human-robot interaction system for a mobile robot platform. The mobile robot first finds a person who wants to interact with it. Once it finds a subject, the robot stops in front of him or her and then interprets his or her upper body gestures. We represent each gesture as a sequence of body poses, and the robot recognizes four upper body gestures: "Idle", "I love you", "Hello left", and "Hello right". A key pose-based particle filter determines the pose sequence, where the key poses are sparsely collected from the pose space. A Pictorial Structure-based upper body model represents the key poses, which are used to build an efficient proposal distribution for the particle filtering; thus, particles are drawn from the key pose-based proposal distribution for effective prediction of the upper body pose. The Viterbi algorithm then estimates the gesture probabilities with a hidden Markov model. Experimental results show the robustness of our upper body tracking and gesture recognition system.
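
As a rough illustration of the key pose-based proposal distribution described in the abstract, the Python sketch below draws particles from a mixture that diffuses around the previous pose estimate and jumps toward nearby key poses. Everything here is hypothetical: the pose representation (4-D joint-angle vectors), the key poses, the function name propose_particles, the noise scales, and the 50/50 mixing ratio are illustrative assumptions, not the paper's actual Pictorial Structure model or proposal.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed: each pose is a 4-D joint-angle vector (e.g. shoulder/elbow
    # angles for both arms); key poses sparsely sample the pose space.
    key_poses = np.array([
        [0.0, 0.0, 0.0, 0.0],   # "Idle"-like pose
        [1.2, 0.5, 1.2, 0.5],   # both arms raised ("I love you"-like)
        [1.2, 0.5, 0.0, 0.0],   # left arm raised ("Hello left"-like)
        [0.0, 0.0, 1.2, 0.5],   # right arm raised ("Hello right"-like)
    ])

    def propose_particles(prev_pose, n_particles=100, sigma_key=0.2, sigma_dyn=0.05):
        """Sample particles from a mixture: diffusion around the previous
        pose, plus jumps toward key poses weighted by their proximity."""
        # Mixture weights favour key poses close to the previous estimate.
        d = np.linalg.norm(key_poses - prev_pose, axis=1)
        w = np.exp(-d**2 / (2 * sigma_key**2))
        w /= w.sum()

        particles = np.empty((n_particles, prev_pose.size))
        for i in range(n_particles):
            if rng.random() < 0.5:  # assumed 50/50 mixing ratio
                k = rng.choice(len(key_poses), p=w)
                particles[i] = key_poses[k] + rng.normal(0, sigma_dyn, prev_pose.size)
            else:
                particles[i] = prev_pose + rng.normal(0, sigma_dyn, prev_pose.size)
        return particles

    particles = propose_particles(np.array([1.0, 0.4, 0.1, 0.0]))
    print(particles.shape)  # (100, 4)

Sampling near key poses lets the filter recover quickly when the arms move abruptly between gesture phases, which is the motivation the abstract gives for building the proposal from key poses rather than from dynamics alone.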
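The final step, decoding gestures from the tracked pose sequence with a hidden Markov model and the Viterbi algorithm, can be sketched as follows. This is a generic log-space Viterbi, not the paper's trained models: the 3-state topology, the transition and emission matrices, and the one-HMM-per-gesture scoring scheme (score the observed key-pose sequence under each gesture's HMM and pick the best) are plausible placeholders.

    import numpy as np

    def viterbi_log_prob(obs, log_pi, log_A, log_B):
        """Log-probability of the best hidden-state path (Viterbi).
        obs: sequence of key-pose indices emitted by the tracker."""
        delta = log_pi + log_B[:, obs[0]]
        for o in obs[1:]:
            # Best predecessor for each state, then emit the observation.
            delta = np.max(delta[:, None] + log_A, axis=0) + log_B[:, o]
        return delta.max()

    # Assumed toy model: 3 hidden states (start/peak/end of a gesture),
    # observations are indices of 4 key poses. Values are illustrative.
    pi = np.log(np.array([0.8, 0.1, 0.1]))
    A  = np.log(np.array([[0.7, 0.2, 0.1],
                          [0.1, 0.7, 0.2],
                          [0.1, 0.2, 0.7]]))
    B_hello_left = np.log(np.array([[0.7, 0.1, 0.1, 0.1],   # state 0: mostly "Idle"
                                    [0.1, 0.1, 0.7, 0.1],   # state 1: left arm up
                                    [0.7, 0.1, 0.1, 0.1]])) # state 2: back to "Idle"

    obs = [0, 0, 2, 2, 2, 0]  # key-pose indices observed over time
    score = viterbi_log_prob(obs, pi, A, B_hello_left)
    print(f"log P(best path) under 'Hello left' HMM: {score:.2f}")

In this reading, each of the four gestures gets its own HMM, and the gesture whose model yields the highest Viterbi score for the observed key-pose sequence is reported as the recognized gesture.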