Determining correspondences between sensory and motor signals

  • Authors:
Kento Nishibori; Jinji Chen; Yoshinori Takeuchi; Tetsuya Matsumoto; Hiroaki Kudo; Noboru Ohnishi

  • Affiliations:
Graduate School of Information Science, Nagoya University, Furo-cho, Chikusa-ku, Japan (all authors)

  • Venue:
  • PCM'04 Proceedings of the 5th Pacific Rim conference on Advances in Multimedia Information Processing - Volume Part I
  • Year:
  • 2004


Abstract

A human being understands the environment by integrating information obtained through the senses of sight, hearing, and touch. To integrate information across different senses, a human must find the correspondence between events observed by those senses. From the external world we obtain image and sound signals through sight and hearing as afferent signals; from the internal world we obtain copies of efferent signals (commands to the motor system). In this paper, we propose a method for relating multiple audio-visual events to an efferent signal (a motor command to the hand) according to general laws, without object-specific knowledge. As correspondence cues, we use the Gestalt grouping laws: the simultaneity of sound onsets and changes in movement, and the similarity of repetition between sound and movement. We conducted experiments in a real environment and obtained satisfactory results demonstrating the effectiveness of the proposed method.
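
The two cues named in the abstract are essentially temporal scoring rules, so a minimal, hypothetical sketch may help make them concrete. The code below is not the authors' implementation: every function name, threshold, and tolerance is an assumption. It scores candidate sound events against the change points of a motor signal by (a) counting near-coincident onsets (simultaneity) and (b) comparing repetition periods (similarity of repetition).

```python
import numpy as np

def sound_onsets(signal, fs, threshold=0.2, min_gap=0.1):
    """Toy onset detector (all values assumed): times where the
    smoothed amplitude envelope rises past a fixed threshold."""
    env = np.abs(signal)
    win = max(1, int(0.01 * fs))                      # ~10 ms smoothing window
    env = np.convolve(env, np.ones(win) / win, mode="same")
    rising = (env[1:] >= threshold) & (env[:-1] < threshold)
    times = (np.flatnonzero(rising) + 1) / fs
    kept = []                                         # enforce a minimum gap
    for t in times:
        if not kept or t - kept[-1] >= min_gap:
            kept.append(t)
    return np.asarray(kept)

def simultaneity_score(onsets, motion_changes, tol=0.05):
    """Simultaneity cue: fraction of sound onsets that fall within
    `tol` seconds of some change point of the movement signal."""
    if len(onsets) == 0 or len(motion_changes) == 0:
        return 0.0
    hits = sum(np.min(np.abs(motion_changes - t)) <= tol for t in onsets)
    return hits / len(onsets)

def repetition_similarity(onsets, motion_changes):
    """Repetition cue: how closely the mean inter-event interval of
    the sound matches that of the movement (1.0 = identical period)."""
    if len(onsets) < 2 or len(motion_changes) < 2:
        return 0.0
    ta, tm = np.mean(np.diff(onsets)), np.mean(np.diff(motion_changes))
    return min(ta, tm) / max(ta, tm)

# Example: decide which of two observed sound events corresponds to the
# hand's motor command, given precomputed event times in seconds.
motor_changes = np.array([0.50, 1.00, 1.50, 2.00])
candidates = {
    "event A": np.array([0.51, 1.01, 1.49, 2.02]),   # in step with the hand
    "event B": np.array([0.20, 0.90, 1.70]),         # unrelated rhythm
}
for name, ev in candidates.items():
    score = simultaneity_score(ev, motor_changes) \
            + repetition_similarity(ev, motor_changes)
    print(name, round(score, 3))                     # event A scores highest
```

Under this sketch, the event with the highest combined score would be taken as the audio-visual event corresponding to the efferent signal; how the two cues are weighted, and how onsets are detected (here a toy envelope-threshold detector), are free design choices not specified by the abstract.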