Overlay what Humanoid Robot Perceives and Thinks to the Real-world by Mixed Reality System

  • Authors:
  • Kazuhiko Kobayashi
  • Koichi Nishiwaki
  • Shinji Uchiyama
  • Hiroyuki Yamamoto
  • Satoshi Kagami
  • Takeo Kanade

  • Affiliations:
  • Human Machine Perception Lab, Canon Inc., Japan. e-mail: kobayashi.kazuhiko@canon.co.jp
  • Digital Human Research Center, AIST, Japan. e-mail: k.nishiwaki@aist.go.jp
  • Human Machine Perception Lab, Canon Inc., Japan. e-mail: uchiyama.shinji@canon.co.jp
  • Human Machine Perception Lab, Canon Inc., Japan. e-mail: yamamoto.hiroyuki125@canon.co.jp
  • Digital Human Research Center, AIST, Japan. e-mail: s.kagami@aist.go.jp
  • Digital Human Research Center, AIST, Japan. e-mail: t.kanade@aist.go.jp

  • Venue:
  • ISMAR '07 Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2007


Abstract

One of the difficulties in developing a humanoid robot is that intermediate results, such as how the robot perceives its environment and how it plans its motion path, are hard to observe online in the physical environment. What developers can see is only the robot's behavior. Therefore, they usually investigate logged data afterwards to analyze how well each component worked, or to determine which component failed in the overall system. In this paper, we present a novel environment for robot development, in which intermediate results of the system are overlaid on physical space using Mixed Reality technology. Real-time observation lets developers see intuitively in what situation specific intermediate results are generated, and understand how the results of one component affect the overall system. This makes development efficient and precise. The environment also provides a human-robot interface that shows the robot's internal state intuitively, not only during development but also in operation.