Nonparametric information fusion for motion estimation

  • Authors:
  • Dorin Comaniciu

  • Affiliations:
  • Real-Time Vision and Modeling Department, Siemens Corporate Research, Princeton, NJ

  • Venue:
  • CVPR '03: Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2003

Abstract

The problem of information fusion appears in many forms in vision. Tasks such as motion estimation, multimodal registration, tracking, and robot localization often require the synergy of estimates coming from multiple sources. Most fusion algorithms, however, assume a single source model and are not robust to outliers. If the data to be fused follow different underlying models, the traditional algorithms produce poor estimates. We present in this paper a nonparametric approach to information fusion called Variable-Bandwidth Density-based Fusion (VBDF). The fusion estimator is computed as the location of the most significant mode of a density function which takes into account the uncertainty of the estimates to be fused. A novel mode detection scheme is presented, which relies on variable-bandwidth mean shift computed at multiple scales. We show that the proposed estimator is consistent and conservative, while naturally handling outliers in the data and multiple source models. The new theory is tested for the task of multiple motion estimation. Numerous experiments validate the theory and provide very competitive results.
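
The abstract only summarizes the procedure; as a rough illustration of the idea, the Python sketch below fuses a set of 2-D estimates with associated covariances by running variable-bandwidth mean shift coarse-to-fine over a decreasing scale sequence and returning the mode it converges to. The function names (vbdf_fuse, mean_shift_mode), the Gaussian kernel, and the particular scale schedule are assumptions made for illustration, not the authors' implementation.

    # Illustrative sketch of variable-bandwidth mean-shift fusion (names and
    # scale schedule are assumptions, not the paper's implementation).
    import numpy as np

    def mean_shift_mode(x0, estimates, bandwidths, n_iter=50, tol=1e-8):
        """Variable-bandwidth mean shift from x0 with Gaussian kernels.

        estimates:  (N, d) array of source estimates x_i
        bandwidths: (N, d, d) array of bandwidth matrices H_i
        """
        x = x0.copy()
        for _ in range(n_iter):
            H_inv = np.linalg.inv(bandwidths)            # (N, d, d)
            diffs = estimates - x                        # (N, d)
            # Mahalanobis distances and kernel weights per source
            d2 = np.einsum('ni,nij,nj->n', diffs, H_inv, diffs)
            w = np.exp(-0.5 * d2) / np.sqrt(np.linalg.det(bandwidths))
            # Mean-shift update: weighted harmonic mean of bandwidths
            Hw = np.linalg.inv(np.einsum('n,nij->ij', w, H_inv))
            x_new = Hw @ np.einsum('n,nij,nj->i', w, H_inv, estimates)
            if np.linalg.norm(x_new - x) < tol:
                return x_new
            x = x_new
        return x

    def vbdf_fuse(estimates, covariances, scales=(8.0, 4.0, 2.0, 1.0, 0.0)):
        """Coarse-to-fine mode tracking: start from the sample mean with
        heavily smoothed bandwidths, then shrink the extra smoothing and
        re-converge at each scale (H_i = C_i + s^2 I is an assumption)."""
        d = estimates.shape[1]
        x = estimates.mean(axis=0)
        for s in scales:
            bandwidths = covariances + (s ** 2) * np.eye(d)
            x = mean_shift_mode(x, estimates, bandwidths)
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Two clusters of 2-D motion estimates; the larger cluster should win.
        inliers = rng.normal([1.0, 0.5], 0.05, size=(8, 2))
        outliers = rng.normal([-2.0, 3.0], 0.05, size=(3, 2))
        estimates = np.vstack([inliers, outliers])
        covariances = np.stack([0.05 ** 2 * np.eye(2)] * len(estimates))
        print(vbdf_fuse(estimates, covariances))  # expected near [1.0, 0.5]

Under these assumptions the fused estimate lands near the dominant cluster and is unaffected by the three outlying estimates, which is the robustness behavior the abstract describes.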