A case study with design of experiments: Performance evaluation methodology for Level 1 distributed data fusion processes

  • Authors:
  • Kedar Sambhoos, Christopher Bowman, James Llinas

  • Affiliations:
  • CUBRC, 4455 Genesee Street, Buffalo, NY, USA; Data Fusion & Neural Networks, 1643 Hemlock Way, Broomfield, CO, USA; Center for Multisource Information Fusion, State University of New York at Buffalo, Buffalo, NY, USA

  • Venue:
  • Information Fusion
  • Year:
  • 2011

Abstract

The emphasis of this paper is the design of a performance evaluation methodology for Level 1 distributed data fusion processes. To date, little empirical research has been done to define a rigorous, technically fair, yet affordable method for evaluating the performance of data fusion processes. Within this methodology, the Performance Evaluation process is treated as a new and distinct fusion process in its own right. We address the distributed Level 1 fusion problem and give quantitative insight into the interdependencies and consistency measures between distributed fusion measures of performance. Building on our prior research, the proposed performance evaluation methodology is based upon the Dual Node Network Data Fusion & Resource Management Architecture. Our case study involves track picture consistency across multiple airborne platforms and sensors for what we label Tier 0, Tier 1, and Tier 2 Level 1 fusion (i.e., entity or object assessment). The highlight of the paper is a proposed overarching performance evaluation methodology for distributed Level 1 fusion that is meticulous, accounts for the complexities of the "Track-to-Truth" association problem, and permits the effects and interactions of various independent variables (factors) to be analyzed. This research also analyzes the measures of performance by setting up a Design of Experiments.
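The abstract refers to the "Track-to-Truth" association problem that underlies Level 1 fusion performance evaluation. The sketch below is not the authors' method; it is a minimal, hedged illustration of one common way to perform track-to-truth assignment (optimal one-to-one assignment with a distance gate) and to derive a few example measures of performance from it. The function name, gate parameter, and MOP names are illustrative assumptions, not terms from the paper.

```python
"""Minimal track-to-truth association sketch (illustrative only)."""
import numpy as np
from scipy.optimize import linear_sum_assignment


def associate_tracks_to_truth(tracks, truths, gate=5.0):
    """Assign each fused track to at most one truth object.

    tracks, truths : (N, 2) and (M, 2) arrays of x-y positions.
    gate           : maximum distance (same units) for a valid pairing.
    Returns a list of (track_idx, truth_idx) pairs and simple example MOPs.
    """
    # Cost matrix: Euclidean distance between every track/truth pair.
    cost = np.linalg.norm(tracks[:, None, :] - truths[None, :, :], axis=2)

    # Optimal one-to-one assignment (Hungarian / linear sum assignment).
    rows, cols = linear_sum_assignment(cost)

    # Keep only assignments that fall inside the association gate.
    pairs = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]

    assigned_tracks = {r for r, _ in pairs}
    mops = {
        "track_completeness": len(pairs) / max(len(truths), 1),
        "false_track_ratio": (len(tracks) - len(assigned_tracks)) / max(len(tracks), 1),
        "mean_position_error": float(np.mean([cost[r, c] for r, c in pairs])) if pairs else float("nan"),
    }
    return pairs, mops


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    truths = rng.uniform(0, 100, size=(5, 2))
    tracks = truths[:4] + rng.normal(0, 1.0, size=(4, 2))   # four tracks near truth
    tracks = np.vstack([tracks, [[200.0, 200.0]]])          # one false track
    pairs, mops = associate_tracks_to_truth(tracks, truths)
    print(pairs, mops)
```

In a Design of Experiments setting such as the one the abstract describes, MOPs like these would be the response variables, computed across factor combinations (e.g., sensor mix, platform count) so that main effects and interactions can be analyzed.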