Automatic Gesture Recognition for Intelligent Human-Robot Interaction

  • Authors: Seong-Whan Lee
  • Affiliation: Korea University
  • Venue: FGR '06: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition
  • Year: 2006

Abstract

An intelligent robot requires natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. However, automatic recognition of whole-body gestures is required for natural HRI. This is a challenging problem because describing and modeling meaningful gesture patterns from whole-body motion is a complex task. This paper presents a new method for simultaneously spotting and recognizing whole-body key gestures on a mobile robot. Because our method runs alongside other HRI components such as speech recognition and face recognition, both execution speed and recognition performance must be considered. For efficient and natural operation, we use several techniques at each step of gesture recognition: learning and extracting articulated joint information, representing a gesture as a sequence of clusters, and spotting and recognizing gestures with hidden Markov models (HMMs). In addition, we constructed a large gesture database with which we verified our method. As a result, our method has been successfully integrated into and operated on a mobile robot.
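The abstract gives no implementation details, but the two core steps it names, representing a gesture as a sequence of clusters and spotting/recognizing it with an HMM, are commonly realized roughly as in the sketch below. Everything here is an illustrative assumption rather than the authors' code: the k-means codebook over joint-angle frames, the discrete-HMM forward scoring, the sliding-window spotting scheme, the filler (non-gesture) model used as a rejection baseline, and all parameter values.

```python
# Minimal sketch of a cluster-sequence + HMM gesture-spotting pipeline.
# NOT the paper's implementation; codebook size, state counts, window
# length, and the filler-model spotting rule are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def quantize(frames, n_clusters=8, seed=0):
    """Vector-quantize joint-angle frames (T x D) into cluster labels,
    turning a continuous motion stream into a discrete symbol sequence."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    return km.fit_predict(frames), km

def forward_log_likelihood(obs, pi, A, B):
    """log P(obs | HMM) via the scaled forward algorithm.
    pi: (S,) initial probs, A: (S, S) transitions, B: (S, K) emissions."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        ll += np.log(s)
        alpha /= s
    return ll

def spot_gestures(labels, gesture_hmms, filler_hmm, window=30):
    """Slide a fixed window over the label stream and report a key gesture
    whenever some gesture model outscores the filler (non-gesture) model.
    gesture_hmms: {name: (pi, A, B)}; filler_hmm: (pi, A, B)."""
    hits = []
    for t in range(len(labels) - window + 1):
        seg = labels[t:t + window]
        base = forward_log_likelihood(seg, *filler_hmm)
        scores = {g: forward_log_likelihood(seg, *m)
                  for g, m in gesture_hmms.items()}
        best = max(scores, key=scores.get)
        if scores[best] > base:
            hits.append((t, best))
    return hits
```

In use, one HMM would be trained per key gesture on quantized label sequences, plus one filler model on non-gesture motion; comparing against the filler model rather than a fixed score threshold is one common way to reject the unconstrained movements that dominate a continuous whole-body stream, though the abstract does not state which rejection scheme the paper adopts.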