Learning spontaneous nonverbal behavior using a three layers hierarchy

  • Authors:
  • Yasser F. O. Mohammad; Toyoaki Nishida

  • Affiliations:
  • Assiut University, Department of Electrical Engineering, Assiut, Egypt; Kyoto University, Intelligence Science and Technology Department, Kyoto, Japan

  • Venue:
  • ACS'10: Proceedings of the 10th WSEAS International Conference on Applied Computer Science
  • Year:
  • 2010

Abstract

Designing robots that can interact with humans using natural nonverbal behavior is an active area of research in HRI. Recently, the authors proposed a two-stage process that enables a robot to learn spontaneous interaction protocols using unsupervised learning techniques. One problem with that method was that it generates a large number of processes that must run in parallel. Another was that the Interaction Structure Learning (ISL) algorithm, used in the second stage of the development process, works only on single-dimensional data, which limits the possible combinations of interaction primitives that can be used. In this paper we alleviate both problems by modifying the interaction protocol representation and the algorithm used in the second stage of development. The proposed representation uses processes that cut across interaction roles and require lower computational power. The proposed learning algorithm utilizes Granger causality in a novel way to discover rules with variable delays in multidimensional data. The algorithm was evaluated against the original ISL on gaze control during listening and is shown to provide the same level of performance while requiring fewer computational resources.
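The intuition behind a Granger-style test with variable delays can be illustrated with a toy sketch: a signal x "Granger-causes" y at delay d if the past value x[t-d] reduces the prediction error of y[t] beyond what y's own past achieves. The helper below is a hypothetical illustration of this idea only; the function names, the simple lag-1 autoregressive baseline, and the variance-reduction score are our assumptions, not the paper's actual algorithm.

```python
import numpy as np

def _rss(A, Y):
    """Residual sum of squares of a least-squares fit of Y on columns of A."""
    beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
    r = Y - A @ beta
    return float(r @ r)

def best_delay(x, y, max_delay=5):
    """Score each candidate delay d by how much adding x[t-d] improves a
    baseline autoregressive prediction of y[t] from y[t-1].

    Returns (best delay, {delay: relative variance reduction}).
    Hypothetical sketch of a per-delay Granger-style test, not the
    paper's learning algorithm.
    """
    n = len(y)
    scores = {}
    for d in range(1, max_delay + 1):
        Y = y[d:]                                    # targets y[t], t = d..n-1
        base = np.column_stack([np.ones(n - d),      # intercept
                                y[d - 1:n - 1]])     # y[t-1]
        full = np.column_stack([base, x[:n - d]])    # add candidate x[t-d]
        rss_r, rss_f = _rss(base, Y), _rss(full, Y)
        scores[d] = (rss_r - rss_f) / rss_r if rss_r > 0 else 0.0
    return max(scores, key=scores.get), scores
```

On synthetic data where y[t] is driven by x[t-3] plus noise, the score peaks sharply at delay 3 while unrelated delays stay near zero, which is the signature such a test looks for when mining rules from multidimensional interaction logs.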