Visualization of Spatial Sensor Data in the Context of Automotive Environment Perception Systems

  • Authors:
  • Marcus Tönnis; Rudi Lindl; Leonhard Walchshäusl; Gudrun Klinker

  • Affiliations:
  • Technische Universität München, Fakultät für Informatik, Boltzmannstraße 3, 85748 Garching b. München, Germany. Tel/Fax: +49 89 289 17083 / -17059, e-mail: toennis@in. ...
  • BMW Forschung & Technik GmbH, Hanauer Straße 46, 80992 München. Tel.: +49 (0) 89 382 -13243 / -13242, Fax: +49 (0) 89 382 44988, e-mail: Rudi.Lindl@bmw.de
  • BMW Forschung & Technik GmbH, Hanauer Straße 46, 80992 München. Tel.: +49 (0) 89 382 -13243 / -13242, Fax: +49 (0) 89 382 44988, e-mail: Leonhard.Walchshaeusl@bmw.de
  • Technische Universität München, Fakultät für Informatik, Boltzmannstraße 3, 85748 Garching b. München, Germany. Tel/Fax: +49 89 289 18215 / -17059, e-mail: klinker@in. ...

  • Venue:
  • ISMAR '07 Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality
  • Year:
  • 2007

Abstract

Spatial sensor systems in cars are becoming increasingly important. They form the foundation of future safety systems, such as automatic emergency braking, and of interactive driver assistance systems. We have developed a system that visualizes such spatial sensor data. Two environments are supported: a laboratory setup for offline experience and a car setup that enables live experience of spatially aligned laser scanner and video data in real traffic. In both setups we have used two visualization devices, a video see-through TFT flat panel and an optical see-through head-mounted display (HMD). For the laboratory setup, a back-projection table has been integrated as well. To present data in correct spatial alignment, we have installed tracking systems in both environments. We have developed visualization schemes for spatial sensor data and for geometric models that outline recognized objects. We report on our system and discuss experiences from the development and realization phases. The system is not intended to be used as a component of real driver assistance systems; rather, it bridges the gap between Human Machine Interface (HMI) designers and sensing engineers during the development phase. Furthermore, it can serve both as a debugging tool for the realization of environment perception systems and as an experimental platform for the design of presentation schemes for upcoming driver assistance systems.
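The "correct spatial alignment" the abstract refers to amounts to chaining rigid-body transforms: laser scanner points measured in the sensor's frame are mapped through the car frame into the frame of the tracked display (TFT panel or HMD). The paper does not give this computation; the following is a minimal sketch under standard assumptions (4x4 homogeneous transforms, a tracking system that delivers the display pose relative to the car), with all function and variable names illustrative rather than taken from the paper.

```python
import numpy as np

def pose_matrix(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def sensor_to_display(points_sensor, T_car_sensor, T_car_display):
    """Map Nx3 points from the sensor frame into the display frame.

    T_car_sensor  : sensor -> car transform (from extrinsic calibration)
    T_car_display : display -> car transform (from the tracking system)
    """
    n = points_sensor.shape[0]
    # Lift points to homogeneous coordinates.
    homo = np.hstack([points_sensor, np.ones((n, 1))])
    # Compose: sensor -> car, then car -> display (inverse of display -> car).
    T_display_sensor = np.linalg.inv(T_car_display) @ T_car_sensor
    mapped = (T_display_sensor @ homo.T).T
    return mapped[:, :3]
```

In the live car setup, `T_car_display` would be re-read from the tracker every frame, while `T_car_sensor` stays fixed after calibration; the resulting display-frame points can then be rendered by any graphics pipeline.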