A flexible framework for multisensor data fusion using data stream management technologies

  • Authors: André Bolles
  • Affiliations: University of Oldenburg
  • Venue: Proceedings of the 2009 EDBT/ICDT Workshops
  • Year: 2009

Abstract

Many applications use sensors to capture an image of the real world that is needed for automated processes. For example, future driver assistance systems will rely on dynamic information about the car's environment, the car's state, and the driver's state. Since no single sensor can capture all of the required information, different sensors such as radar, video, and eye trackers are used. Typically, some sensors provide redundant information about the same real-world entity, while others measure different things. Thus, information from different sensors must be fused to obtain a consistent image of the real world. In most sensor fusion systems the sensor configuration is known a priori, and the fusion algorithms are tailored to it. Adapting such a system to process sensor readings from a different sensor configuration is therefore difficult or even impossible. Since sensor equipment and environmental requirements differ and change frequently during the development of automotive applications, a new approach to adapting sensor fusion systems is needed. Hence, in this work a framework for sensor fusion systems is developed that allows fusion mechanisms to be adapted flexibly. Because of the real-time requirements of automotive applications and the flexibility of query processing technologies, data stream management technology is used to develop a flexible framework for multisensor data fusion.
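The fusion of redundant readings mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm; it shows inverse-variance weighting, a standard building block for combining noisy estimates of the same quantity from different sensors. The function name and the (value, variance) data layout are assumptions made for illustration.

```python
def fuse(estimates):
    """Fuse redundant (value, variance) estimates of one real-world
    quantity by inverse-variance weighting: more precise sensors
    (smaller variance) get proportionally more weight."""
    inv = [1.0 / var for _, var in estimates]      # weight = 1 / variance
    total = sum(inv)
    value = sum(v * w for (v, _), w in zip(estimates, inv)) / total
    return value, 1.0 / total                       # fused value and variance

# e.g. a radar estimate (10.0, variance 4.0) fused with a more precise
# video estimate (12.0, variance 1.0) is pulled toward the video reading:
fused_value, fused_var = fuse([(10.0, 4.0), (12.0, 1.0)])
```

In a stream-based setting such as the one the paper proposes, an operator like this would run continuously over windows of time-aligned readings rather than over a fixed list.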