Auditory learning: a developmental method

  • Authors:
  • Yilu Zhang, Juyang Weng, Wey-Shiuan Hwang

  • Affiliations:
  • Res. Center, Gen. Motors Corp., Warren, MI, USA

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2005

Abstract

Motivated by the human autonomous development process from infancy to adulthood, we have built a robot that develops its cognitive and behavioral skills through real-time interactions with the environment. We call such a robot a developmental robot. In this paper, we present the theory and the architecture for implementing a developmental robot and discuss the related techniques that address an array of challenging technical issues. As an application, experimental results on a real robot, the Self-organizing, Autonomous, Incremental Learner (SAIL), are presented with emphasis on its auditory perception and audition-related action generation. In particular, the SAIL robot performs auditory learning from unsegmented and unlabeled speech streams without any prior knowledge about the auditory signals, such as the designated language or phoneme models. Nor are the actions that the robot is expected to perform available before learning starts. SAIL learns the auditory commands and the desired actions from physical contact with the environment, including its trainers.
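
The learning setting sketched in the abstract (no segmentation, no labels, no pre-defined action set, training through physical interaction with a trainer) can be illustrated with a minimal toy example of incremental, prototype-based sound-to-action association. This is only an illustration of that general idea, not the SAIL architecture described in the paper; the class name `IncrementalAuditoryLearner`, its methods `perceive` and `teach`, the thresholds, and the use of random vectors in place of real audio features are all hypothetical choices made for this sketch.

```python
import numpy as np

class IncrementalAuditoryLearner:
    """Toy online learner that associates audio feature windows with actions.

    Prototypes are added or nudged one example at a time, so there is no
    batch training, no segmentation step, and no phoneme model. Purely
    illustrative; not the method of Zhang, Weng, and Hwang.
    """

    def __init__(self, match_threshold=0.8, learning_rate=0.2):
        self.prototypes = []            # list of (feature_vector, action_id)
        self.match_threshold = match_threshold
        self.learning_rate = learning_rate

    def _similarity(self, x, p):
        # Cosine similarity between an incoming window and a stored prototype.
        return float(np.dot(x, p) / (np.linalg.norm(x) * np.linalg.norm(p) + 1e-9))

    def perceive(self, features):
        """Return the action of the best-matching prototype, or None."""
        if not self.prototypes:
            return None
        sims = [self._similarity(features, p) for p, _ in self.prototypes]
        best = int(np.argmax(sims))
        return self.prototypes[best][1] if sims[best] >= self.match_threshold else None

    def teach(self, features, action_id):
        """Incorporate one trainer-supplied example without retraining from scratch."""
        for i, (p, a) in enumerate(self.prototypes):
            if a == action_id and self._similarity(features, p) >= self.match_threshold:
                # Nudge the matching prototype toward the new observation.
                self.prototypes[i] = (p + self.learning_rate * (features - p), a)
                return
        self.prototypes.append((np.asarray(features, dtype=float), action_id))


# Toy usage: random 64-dimensional vectors stand in for real audio features.
rng = np.random.default_rng(0)
learner = IncrementalAuditoryLearner()
go_sound, stop_sound = rng.normal(size=64), rng.normal(size=64)
learner.teach(go_sound, action_id=1)     # trainer pairs a sound with an action
learner.teach(stop_sound, action_id=2)
print(learner.perceive(go_sound + 0.05 * rng.normal(size=64)))   # expected: 1
```

The design point this toy example mirrors is that every training signal arrives one interaction at a time and is absorbed immediately, rather than collected into a labeled corpus and used for offline training.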