A probabilistic multimodal sensor aggregation scheme applied for a mobile robot

  • Authors:
  • Erik Schaffernicht;Christian Martin;Andrea Scheidig;Horst-Michael Gross

  • Affiliation:
  • Department of Neuroinformatics and Cognitive Robotics, Ilmenau Technical University (all authors)

  • Venue:
  • KI'05 Proceedings of the 28th annual German conference on Advances in Artificial Intelligence
  • Year:
  • 2005

Abstract

For human-robot interaction with a real mobile robot, stable methods for people detection and tracking are fundamental features of the system and require information from different sensory systems. In this paper, we discuss a new approach for integrating several sensor modalities, and we present a multimodal people detection and tracking system and its application using the different sensory systems of our mobile interaction robot Horos operating in a real office environment. These sensors include a laser range finder, a sonar system, and a fisheye-based omnidirectional camera. For each of these sensory channels, a separate Gaussian probability distribution is generated to model the belief of observing a person. These probability distributions are then combined using a flexible probabilistic aggregation scheme. The main advantages of this approach are the simple integration of further sensory channels, even with different update frequencies, and its usability in real-world environments. Finally, promising experimental results achieved in a real office environment are presented.
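
To illustrate the idea of combining per-sensor Gaussian person hypotheses, the following is a minimal sketch, not the paper's actual aggregation scheme. It assumes each sensor (laser, camera, sonar) contributes a 2-D Gaussian belief over the person's position in robot-centric coordinates and fuses them with an inverse-covariance-weighted product of Gaussians; the names, covariance values, and fusion rule are illustrative assumptions, and the paper's scheme is more flexible (e.g., it handles sensors with different update frequencies).

```python
# Sketch: fusing per-sensor person hypotheses, each a 2-D Gaussian (x, y).
# The product-of-Gaussians rule below is an assumption for illustration only.
import numpy as np

class GaussianHypothesis:
    """Belief that a person is at `mean` with 2x2 uncertainty `cov`."""
    def __init__(self, mean, cov):
        self.mean = np.asarray(mean, dtype=float)
        self.cov = np.asarray(cov, dtype=float)

def fuse(hypotheses):
    """Combine per-sensor Gaussians into a single Gaussian estimate."""
    info = np.zeros((2, 2))   # summed information (inverse covariance) matrices
    info_mean = np.zeros(2)   # summed information vectors
    for h in hypotheses:
        inv_cov = np.linalg.inv(h.cov)
        info += inv_cov
        info_mean += inv_cov @ h.mean
    fused_cov = np.linalg.inv(info)
    fused_mean = fused_cov @ info_mean
    return GaussianHypothesis(fused_mean, fused_cov)

# Hypothetical readings: laser is precise, camera and sonar are coarser.
laser  = GaussianHypothesis([1.9, 0.4], [[0.02, 0.0], [0.0, 0.05]])
camera = GaussianHypothesis([2.1, 0.5], [[0.30, 0.0], [0.0, 0.10]])
sonar  = GaussianHypothesis([1.7, 0.3], [[0.40, 0.0], [0.0, 0.40]])

person = fuse([laser, camera, sonar])
print("fused position:", person.mean)
print("fused covariance:\n", person.cov)
```

In this sketch, a sensor that updates less frequently could simply be left out of the list passed to `fuse` until a fresh measurement arrives, which hints at why a probabilistic formulation makes it easy to add or drop sensory channels.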