This paper describes an inertial motion capture-based arm gesture recognition system for the high-level control of a mobile manipulator. Kinematic data from the user's left arm are acquired in real time by an inertial motion capture system (Xsens MVN) and processed to extract supervisory user interface commands such as "Manipulator On/Off", "Base On/Off" and "Operation Pause/Resume" for a mobile manipulator system (KUKA youBot). Principal Component Analysis and Linear Discriminant Analysis are employed for dimension reduction and classification of the user kinematic data, respectively. The classification accuracy for the six-class gesture recognition problem is 95.6 percent. To increase the reliability of the gesture recognition framework in real-time operation, a consensus voting scheme over the last ten classification results is implemented. During a five-minute teleoperation experiment, all 25 high-level commands issued were recognized correctly by the consensus-voting-enhanced gesture recognizer. The experimental subject stated that the user interface was easy to learn and did not require extensive mental effort to operate.
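The consensus voting step can be illustrated with a minimal sketch. The class name, the agreement threshold, and the per-frame label interface below are assumptions for illustration; the abstract only states that the last ten classification results are polled, not how many must agree before a command fires.

```python
from collections import Counter, deque


class ConsensusVoter:
    """Majority vote over a sliding window of recent classifier outputs.

    The window size of 10 follows the paper; the agreement threshold
    is an assumed parameter, not taken from the paper.
    """

    def __init__(self, window=10, threshold=8):
        self.buffer = deque(maxlen=window)  # keeps only the last `window` labels
        self.threshold = threshold

    def update(self, label):
        """Record one per-frame classification result.

        Returns the winning command label once at least `threshold` of
        the buffered results agree, otherwise None (no command issued).
        """
        self.buffer.append(label)
        top_label, count = Counter(self.buffer).most_common(1)[0]
        return top_label if count >= self.threshold else None
```

In use, isolated misclassifications are absorbed by the window: a spurious frame label never reaches the threshold, while a sustained gesture does after a short latency of a few frames.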