Modeling tool-body assimilation using second-order recurrent neural network

  • Authors:
  • Shun Nishide;Tatsuhiro Nakagawa;Tetsuya Ogata;Jun Tani;Toru Takahashi;Hiroshi G. Okuno

  • Affiliations:
  • Shun Nishide: Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University, Kyoto, Japan
  • Tatsuhiro Nakagawa: Graduate School of Information Science, Nara Institute of Science and Technology, Nara, Japan
  • Tetsuya Ogata: Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University, Kyoto, Japan
  • Jun Tani: Brain Science Institute, RIKEN, Saitama, Japan
  • Toru Takahashi: Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University, Kyoto, Japan
  • Hiroshi G. Okuno: Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University, Kyoto, Japan

  • Venue:
  • IROS'09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009

Abstract

Tool-body assimilation is one of the intelligent abilities of humans. Through trial and experience, humans are capable of using tools as if they were part of their own bodies. This paper presents a method that applies a robot's active sensing experience to create a tool-body assimilation model. The model is composed of a feature extraction module, a dynamics learning module, and a tool recognition module. A Self-Organizing Map (SOM) is used as the feature extraction module to extract object features from raw images. A Multiple Time-scales Recurrent Neural Network (MTRNN) is used as the dynamics learning module. Parametric Bias (PB) nodes are attached to the weights of the MTRNN as a second-order network that modulates the behavior of the MTRNN according to the tool. The generalization capability of neural networks gives the model the ability to deal with unknown tools. Experiments were performed with HRP-2 using no tool and I-shaped, T-shaped, and L-shaped tools. The distribution of PB values shows that the model has learned that the robot's dynamic properties change when it holds a tool. The experimental results show that the tool-body assimilation model can be applied to unknown objects to generate goal-oriented motions.
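The following is a minimal sketch, not the authors' implementation, of how the architecture described in the abstract might be composed: a toy SOM maps a raw image to a feature vector, which drives a simplified recurrent context network with fast and slow units whose input weights are gated multiplicatively by PB values (the "second-order" connection). All dimensions, time constants, and the leaky-integration update are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class SOM:
    """Toy Self-Organizing Map: maps a raw image vector to node activations,
    used here as the object-feature vector (assumed node count and input size)."""
    def __init__(self, n_nodes, dim):
        self.w = rng.normal(size=(n_nodes, dim))

    def features(self, x):
        d = np.linalg.norm(self.w - x, axis=1)   # distance to each SOM node
        a = np.exp(-d)                            # soft activation
        return a / a.sum()

class PBMTRNN:
    """Simplified MTRNN-style network with fast/slow context units.
    PB values gate the feature-to-context weights multiplicatively,
    illustrating the second-order PB connection described in the abstract."""
    def __init__(self, n_feat, n_fast=10, n_slow=4, n_pb=2):
        self.tau = np.concatenate([np.full(n_fast, 2.0),    # fast time constant
                                   np.full(n_slow, 20.0)])  # slow time constant
        n_ctx = n_fast + n_slow
        self.W_in  = rng.normal(scale=0.1, size=(n_ctx, n_feat))
        self.W_rec = rng.normal(scale=0.1, size=(n_ctx, n_ctx))
        self.W_pb  = rng.normal(scale=0.1, size=(n_ctx, n_feat, n_pb))

    def step(self, u, feat, pb):
        # PB modulates the effective input weights (second-order term)
        W_eff = self.W_in + self.W_pb @ pb
        du = -u + W_eff @ feat + self.W_rec @ np.tanh(u)
        return u + du / self.tau                  # leaky integration per unit

som = SOM(n_nodes=16, dim=64)
net = PBMTRNN(n_feat=16)
u = np.zeros(14)                                  # 10 fast + 4 slow context units
pb = np.array([0.3, -0.1])                        # e.g. PB values inferred per tool
image = rng.normal(size=64)                       # stand-in for a raw image
for _ in range(5):
    u = net.step(u, som.features(image), pb)
print(u[:3])
```

In this sketch, changing the PB vector changes the effective input weights and hence the predicted dynamics, which is the mechanism by which a single network could exhibit different behavior for different tools.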