Evaluating crossmodal awareness of daily-partner robot to user's behaviors with gaze and utterance detection

  • Authors:
  • Tomoko Yonezawa; Hirotake Yamazoe; Akira Utsumi; Shinji Abe

  • Affiliations:
  • ATR IRC Lab., Kyoto, Japan (all authors)

  • Venue:
  • Proceedings of the 3rd ACM International Workshop on Context-Awareness for Self-Managing Systems
  • Year:
  • 2009

Abstract

This paper proposes a daily-partner robot that is aware of the user's situation and behavior through gaze and utterance detection. For appropriate and familiar anthropomorphic interaction, the robot should wait for a suitable moment to speak to the user, adapting to her/his situation while she/he is performing a task or thinking. Accordingly, the proposed robot i) estimates the user's context by detecting her/his gaze and utterances, such as the target of the user's speech, ii) silently signals its need to speak (i.e., without making an utterance) through gaze turns toward the user and joint attention, taking advantage of their attention-drawing effect, and iii) delivers its message once the user talks to the robot. The results of experiments combining subjects' daily tasks with and without the above steps show that the robot's crossmodal-aware behaviors are important for respectful communication: the silent behaviors convey the robot's intention to speak and draw the user's attention without disturbing the user's ongoing task.
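The three-step behavior described above can be read as a simple interaction state machine. The following is a minimal, hypothetical sketch of that flow, not the authors' implementation: the observations `gaze_on_robot`, `utterance_to_robot`, and `user_busy` are assumed boolean outputs standing in for the paper's gaze and utterance detectors, and the returned strings are placeholders for motor and speech commands.

```python
from enum import Enum, auto
from typing import Optional


class RobotState(Enum):
    """Phases mirroring the paper's three-step behavior."""
    OBSERVING = auto()      # (i) estimate the user's context from gaze and utterance
    SILENT_APPEAL = auto()  # (ii) silent gaze turns / joint attention toward the user
    DELIVERING = auto()     # (iii) speak the message once the user addresses the robot


class CrossmodalAwarePartner:
    """Minimal sketch of the crossmodal-aware interaction flow (hypothetical)."""

    def __init__(self, pending_message: str) -> None:
        self.state = RobotState.OBSERVING
        self.pending_message = pending_message

    def step(self, gaze_on_robot: bool, utterance_to_robot: bool,
             user_busy: bool) -> Optional[str]:
        if self.state is RobotState.OBSERVING:
            # (i) Wait until the user no longer appears occupied before appealing.
            if self.pending_message and not user_busy:
                self.state = RobotState.SILENT_APPEAL
            return None

        if self.state is RobotState.SILENT_APPEAL:
            # (ii) Show the intention to speak without uttering anything.
            if gaze_on_robot and utterance_to_robot:
                self.state = RobotState.DELIVERING
            return "gaze_turn_toward_user"  # motor command placeholder, no speech

        if self.state is RobotState.DELIVERING:
            # (iii) The user has addressed the robot; deliver the message.
            message, self.pending_message = self.pending_message, ""
            self.state = RobotState.OBSERVING
            return f"say: {message}"

        return None


if __name__ == "__main__":
    robot = CrossmodalAwarePartner("You have a meeting at three.")
    print(robot.step(False, False, True))   # None: user busy, keep observing
    print(robot.step(False, False, False))  # None: user free, switch to silent appeal
    print(robot.step(False, False, False))  # gaze_turn_toward_user: silent appeal
    print(robot.step(True, True, False))    # gaze_turn_toward_user: user now addresses robot
    print(robot.step(True, True, False))    # say: You have a meeting at three.
```

The key design point reflected here is that the transition from (ii) to (iii) is driven entirely by the user: the robot never interrupts with speech, it only waits for gaze and an utterance directed at it.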