Sensory integration with articulated motion on a humanoid robot

  • Authors:
  • J. Rojas; R. A. Peters, II

  • Affiliations:
  • Center for Intelligent Systems, Vanderbilt University, Nashville, TN, USA (both authors)

  • Venue:
  • Applied Bionics and Biomechanics
  • Year:
  • 2005

Abstract

This paper describes the integration of articulated motion with auditory and visual sensory information that enables a humanoid robot to achieve certain reflex actions that mimic those of people. Reflexes such as reach-and-grasp behavior enable the robot to learn, through experience, its own state and that of the world. A humanoid robot with binaural audio input, stereo vision, and pneumatic arms and hands exhibited tightly coupled sensory-motor behaviors in four demonstrations. Successive demonstrations increased in complexity to show that the reflexive sensory-motor behaviors can be combined to perform progressively more difficult tasks. The humanoid robot executed these tasks effectively and established the groundwork for the further development of hardware and software systems, sensory-motor vector-space representations, and coupling with higher-level cognition.
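
To give a concrete flavor of the kind of tightly coupled audio-motor reflex the abstract describes, the Python sketch below estimates the azimuth of a sound source from the interaural time difference between two microphone signals and then reflexively pans a head joint toward it with a simple proportional step. This is purely illustrative and is not the authors' implementation; the function names, the microphone spacing, and the control gain are all assumptions.

    import numpy as np

    def locate_sound(left_mic, right_mic, fs=44100.0,
                     mic_spacing=0.2, speed_of_sound=343.0):
        """Estimate source azimuth (radians) from the interaural time
        difference, found as the peak of the cross-correlation.
        Negative azimuth means the source is to the left."""
        corr = np.correlate(left_mic, right_mic, mode="full")
        lag = np.argmax(corr) - (len(right_mic) - 1)   # delay in samples
        itd = lag / fs                                 # delay in seconds
        # Clamp so measurement noise cannot push arcsin out of [-1, 1].
        s = np.clip(itd * speed_of_sound / mic_spacing, -1.0, 1.0)
        return float(np.arcsin(s))

    def reflex_step(azimuth, head_pan, gain=0.5):
        """One reflexive orienting step: move the head pan joint a
        fraction of the way toward the estimated sound direction."""
        return head_pan + gain * (azimuth - head_pan)

    if __name__ == "__main__":
        # Synthetic test: the same click reaches the right microphone
        # 20 samples late, so the source lies to the left of the head.
        click = np.zeros(1024)
        click[100] = 1.0
        delayed = np.roll(click, 20)
        az = locate_sound(click, delayed)
        pan = 0.0
        for _ in range(5):                             # iterate the reflex
            pan = reflex_step(az, pan)
        print(f"azimuth: {az:.3f} rad, head pan after 5 steps: {pan:.3f} rad")

The proportional step mirrors the reflexive character of the behaviors described above: each sensory estimate directly drives a small motor correction, with no intervening planning stage.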