Vision-based arm gesture recognition for a long-range human-robot interaction

  • Authors:
  • Dohyung Kim, Jaeyeon Lee, Ho-Sub Yoon, Jaehong Kim, Joochan Sohn

  • Affiliations:
  • Electronics and Telecommunications Research Institute, Daejeon, Korea (all authors)

  • Venue:
  • The Journal of Supercomputing
  • Year:
  • 2013

Abstract

This paper proposes a vision-based human arm gesture recognition method for human-robot interaction, particularly at a long distance where speech information is not available. We define four meaningful arm gestures for long-range interaction. The proposed method recognizes the defined gestures using only low-resolution 320×240 pixel images captured by a single camera at a long distance, approximately five meters from the camera. In addition, the system differentiates the target gestures from the users' normal everyday actions without imposing any constraints on the user. For human detection at a long distance, the proposed approach combines results from mean-shift color tracking, short- and long-range face detection, and omega-shape detection. The system then detects arm blocks using a background subtraction method with a background updating module and recognizes the target gestures based on the region, periodic motion, and shape of the arm blocks. In experiments on a large, realistic database, a recognition rate of 97.235% is achieved, which is a sufficiently practical level for various pervasive and ubiquitous applications based on human gestures.
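
The following is a minimal sketch of the detection pipeline outlined in the abstract, written with OpenCV in Python. It is not the authors' implementation: the Haar cascade file, histogram channels, learning rate, blob-area threshold, and camera index are illustrative assumptions, and the omega-shape detector and the gesture classifier (region, periodic motion, and shape analysis) are only indicated by placeholders.

    # Sketch: face detection seeds a color model, mean-shift tracking follows the
    # person, and background subtraction with continuous model updating yields
    # candidate arm blobs. All parameters below are assumptions, not the paper's.
    import cv2

    # Stock Haar cascade as a stand-in for the paper's short-/long-range face detectors.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    # Background subtractor; a nonzero learning rate approximates the
    # "background updating module" mentioned in the abstract.
    bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

    track_window = None   # current mean-shift window (x, y, w, h)
    roi_hist = None       # hue histogram of the tracked region

    cap = cv2.VideoCapture(0)  # assumed camera index
    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

        # 1) Human detection: a face detection seeds/refreshes the color model...
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            track_window = (x, y, w, h)
            roi = hsv[y:y + h, x:x + w]
            roi_hist = cv2.calcHist([roi], [0], None, [180], [0, 180])
            cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
        elif roi_hist is not None and track_window is not None:
            # ...and mean-shift color tracking keeps following the person when
            # face detection fails at long range.
            back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
            _, track_window = cv2.meanShift(back_proj, track_window, term_crit)

        # 2) Arm-block detection: background subtraction with model updating.
        fg_mask = bg_subtractor.apply(frame, learningRate=0.005)
        fg_mask = cv2.medianBlur(fg_mask, 5)
        contours, _ = cv2.findContours(fg_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        arm_blobs = [c for c in contours if cv2.contourArea(c) > 200]

        # 3) Gesture recognition from the region, periodic motion, and shape of
        #    the arm blobs would follow here; it is omitted in this sketch.

    cap.release()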