Data fusion with a multisensor system for damage control and situational awareness

  • Authors:
  • Christian P. Minor; Kevin J. Johnson; Susan L. Rose-Pehrsson; Jeffrey C. Owrutsky; Stephen C. Wales; Daniel A. Steinhurst; Daniel T. Gottuk

  • Affiliations:
  • Nova Research, Inc., Alexandria, VA, USA; US Naval Research Laboratory, Washington, DC, USA; US Naval Research Laboratory, Washington, DC, USA; US Naval Research Laboratory, Washington, DC, USA; US Naval Research Laboratory, Washington, DC, USA; Nova Research, Inc., Alexandria, VA, USA; Hughes Associates, Inc., Baltimore, MD, USA

  • Venue:
  • AVSS '07 Proceedings of the 2007 IEEE Conference on Advanced Video and Signal Based Surveillance
  • Year:
  • 2007

Abstract

The U.S. Naval Research Laboratory has developed an affordable, multisensor, real-time detection system for damage control and situational awareness, called “Volume Sensor.” The system provides standoff identification of events within a space (e.g., flaming and smoldering fires, pipe ruptures, and gas releases) aboard U.S. Navy vessels. A data fusion approach was used to integrate spectral sensors, acoustic sensors, and video image detection algorithms. Bayesian-based decision algorithms improved event detection rates while reducing false positives. Full-scale testing demonstrated that the prototype Volume Sensor performed as well as or better than commercial video image detection and point-detection systems in critical quality metrics for fire detection, while also providing additional situational awareness. The design framework developed for Volume Sensor can serve as a template for the integration of heterogeneous sensors into networks for a variety of real-time sensing and situational awareness applications.
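The abstract describes fusing evidence from spectral, acoustic, and video sensors with Bayesian decision algorithms. The paper does not give its exact formulation; the following is a minimal sketch of one common approach, combining per-sensor likelihood ratios under a conditional-independence assumption. All function names, priors, and likelihood-ratio values here are illustrative, not taken from the paper.

```python
def fuse_posterior(prior: float, likelihood_ratios: list[float]) -> float:
    """Combine per-sensor likelihood ratios P(obs | event) / P(obs | no event)
    with a prior probability via Bayes' rule, assuming the sensor modalities
    are conditionally independent given the event state."""
    # Convert the prior probability to odds, multiply in each sensor's
    # likelihood ratio, then convert back to a probability.
    odds = prior / (1.0 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Hypothetical example: a low prior for "flaming fire," with three sensor
# modalities each reporting evidence favoring the event (ratios > 1).
posterior = fuse_posterior(prior=0.01, likelihood_ratios=[8.0, 5.0, 12.0])
```

In this setup a single weak detection barely moves the posterior, but concurrent agreement across modalities drives it up sharply, which is one way fusion can raise detection rates while suppressing single-sensor false positives.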