Human three-dimensional modeling based on intelligent sensor fusion for a tele-operated mobile robot

  • Authors:
  • Naoyuki Kubota;Masashi Satomi;Kazuhiko Taniguchi;Yasutsugu Nogawa

  • Affiliations:
  • Dept. of System Design, Tokyo Metropolitan University, Hino, Tokyo, Japan and SORST, Japan Science and Technology Agency;Dept. of System Design, Tokyo Metropolitan University, Hino, Tokyo, Japan;Kinden Corporation, Kyoto R&D Center;Kinden Corporation, Kyoto R&D Center

  • Venue:
  • KES'07/WIRN'07: Proceedings of the 11th International Conference on Knowledge-Based Intelligent Information and Engineering Systems (KES 2007) and the XVII Italian Workshop on Neural Networks, Part III
  • Year:
  • 2007


Abstract

In this paper, we discuss robot vision for perceiving humans and the environment around a mobile robot. We developed a tele-operated mobile robot with a pan-tilt mechanism carrying a camera and a laser range finder (LRF). The camera provides color information, while the LRF provides the distance from the robot to surrounding objects. We propose a sensor fusion method that extracts a human from the measured data by integrating these two outputs based on the concept of synthesis. Finally, we present experimental results of the proposed method.
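The abstract does not give implementation details of the fusion step. A minimal sketch of the general idea, combining a color cue from the camera with a range cue from the LRF to label and cluster "human" beams, is shown below; all thresholds, data shapes, and function names are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical camera + LRF fusion sketch for human extraction.
# Assumes each LRF beam is aligned with one image column; the color
# target and all thresholds are made-up values for illustration.

def color_match(pixel, target=(200, 120, 90), tol=60):
    """True if an RGB pixel is within a tolerance of a target color."""
    return all(abs(p - t) <= tol for p, t in zip(pixel, target))

def fuse(lrf_ranges, image_columns, max_range=3.0, min_hits=2):
    """Label a beam as 'human' only if the LRF range is close enough
    AND the aligned image column has enough color-matching pixels."""
    labels = []
    for r, col in zip(lrf_ranges, image_columns):
        hits = sum(color_match(px) for px in col)
        labels.append(r < max_range and hits >= min_hits)
    return labels

def cluster(labels):
    """Group consecutive human-labeled beams into (start, end) segments."""
    segments, start = [], None
    for i, flag in enumerate(labels):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            segments.append((start, i - 1))
            start = None
    if start is not None:
        segments.append((start, len(labels) - 1))
    return segments

# Toy data: 6 beams; only beams 2-3 are both near and skin-colored.
ranges = [5.0, 4.8, 1.2, 1.3, 4.9, 5.1]
cols = [[(0, 0, 255)] * 3,
        [(0, 255, 0)] * 3,
        [(205, 115, 95)] * 3,
        [(198, 125, 88)] * 3,
        [(0, 0, 255)] * 3,
        [(30, 30, 30)] * 3]
print(cluster(fuse(ranges, cols)))  # → [(2, 3)]
```

The key point of the fusion is the conjunction: neither cue alone is reliable (a wall can be skin-colored, a chair can be close), but requiring both simultaneously rejects most false positives.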