Gesture-based control of highly articulated biomechatronic systems

  • Authors:
  • Zhiqiang Luo;I-Ming Chen;Shusong Xing;Henry Been-Lirn Duh

  • Affiliations:
  • Nanyang Technological University, Singapore;Nanyang Technological University, Singapore;School of Software, Nankai University, China;Nanyang Technological University, Singapore

  • Venue:
  • Proceedings of the 1st ACM SIGCHI/SIGART conference on Human-robot interaction
  • Year:
  • 2006

Abstract

A robotic puppet is developed for studying motion generation and control of highly articulated biomimetic mechatronic systems using anatomical human motion data in real time. The system is controlled by a pair of data gloves that track the actions of the human fingers. With primitives designed in a multilayered motion synthesis structure, the puppet can perform complex human-like actions. Continuous full-body movements are produced on the robotic puppet by combining and sequencing actions on different body parts using the temporal and spatial information provided by the data gloves. The human operator participates in the interactive design of the coordination and timing of the puppet's body movements in a natural and intuitive manner. The motion-generation methods demonstrated on the robotic puppet may be applied to interactive media, entertainment, and biomedical engineering.
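The gesture-to-motion pipeline the abstract describes can be illustrated with a minimal sketch: glove gestures select motion primitives for individual body parts, and a sequencer concatenates them in time to form a continuous movement. All names here (the primitive library, the gesture map, `sequence_motion`) are hypothetical illustrations, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# Hypothetical motion primitive: a named, timed action for one body part.
@dataclass
class Primitive:
    name: str
    body_part: str
    duration: float  # seconds

# Illustrative primitive library (names invented for this sketch).
PRIMITIVES: Dict[str, Primitive] = {
    "wave": Primitive("wave", "right_arm", 1.5),
    "step": Primitive("step", "legs", 1.0),
    "nod":  Primitive("nod",  "head", 0.5),
}

# Hypothetical glove mapping: each finger flexion pattern picks a primitive.
GESTURE_MAP: Dict[str, str] = {
    "index_flex":  "wave",
    "middle_flex": "step",
    "thumb_flex":  "nod",
}

def sequence_motion(gestures: List[str]) -> List[Tuple[float, str, str]]:
    """Turn a stream of glove gestures into a timed schedule of
    (start_time, body_part, primitive_name) entries, sequencing the
    selected primitives one after another."""
    schedule: List[Tuple[float, str, str]] = []
    t = 0.0
    for g in gestures:
        p = PRIMITIVES[GESTURE_MAP[g]]
        schedule.append((t, p.body_part, p.name))
        t += p.duration  # next primitive starts when this one ends
    return schedule

print(sequence_motion(["index_flex", "thumb_flex"]))
# → [(0.0, 'right_arm', 'wave'), (1.5, 'head', 'nod')]
```

In the actual system, the temporal and spatial information from the gloves would drive such a scheduler continuously rather than from a fixed gesture list.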