Real-time gesture recognition for the high-level teleoperation interface of a mobile manipulator

  • Authors:
  • Yerbolat Khassanov, Nursultan Imanberdiyev, Huseyin Atakan Varol

  • Affiliation:
  • Nazarbayev University, Astana, Kazakhstan (all authors)

  • Venue:
  • Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction
  • Year:
  • 2014


Abstract

This paper describes an inertial motion capture-based arm gesture recognition system for the high-level control of a mobile manipulator. The user's left-arm kinematic data are acquired in real time by an inertial motion capture system (Xsens MVN) and processed to extract supervisory user-interface commands such as "Manipulator On/Off", "Base On/Off", and "Operation Pause/Resume" for a mobile manipulator system (KUKA youBot). Principal Component Analysis and Linear Discriminant Analysis are employed for dimension reduction and classification of the user's kinematic data, respectively. The classification accuracy for the six-class gesture recognition problem is 95.6 percent. To increase the reliability of the gesture recognition framework in real-time operation, a consensus voting scheme over the last ten classification results is implemented. During the five-minute teleoperation experiment, a total of 25 high-level commands were recognized correctly by the consensus-voting-enhanced gesture recognizer. The experimental subject stated that the user interface was easy to learn and did not require extensive mental effort to operate.
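The consensus voting step described above can be sketched in a few lines. The abstract says the scheme involves the last ten classification results but does not specify the agreement rule, so this minimal Python sketch assumes unanimous agreement is required before a command is emitted; the `make_consensus_voter` helper and its parameters are illustrative, not the authors' implementation.

```python
from collections import Counter, deque

def make_consensus_voter(window=10, threshold=10):
    """Return a stateful function that takes one per-frame class label at a
    time and emits a command only when at least `threshold` of the last
    `window` labels agree; otherwise it returns None.

    The paper's scheme uses the last ten results; the unanimity threshold
    here is an assumption for illustration."""
    history = deque(maxlen=window)

    def vote(label):
        history.append(label)
        if len(history) < window:
            return None  # not enough classification history yet
        top_label, count = Counter(history).most_common(1)[0]
        return top_label if count >= threshold else None

    return vote
```

For example, ten consecutive "Operation Pause" classifications would trigger the command, while a window containing even one disagreeing frame would not, which is how the scheme suppresses spurious single-frame misclassifications during real-time teleoperation.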