Divergence Based Feature Selection for Multimodal Class Densities

  • Authors:
  • Jana Novovicová, Pavel Pudil, Josef Kittler


  • Venue:
  • IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year:
  • 1996

Quantified Score

Hi-index 0.15
Abstract

A new feature selection procedure is presented, based on the Kullback J-divergence between two class-conditional density functions, each approximated by a finite mixture of parameterized densities of a special type. The procedure is especially suitable for multimodal data. Apart from finding a feature subset of any cardinality without involving any search procedure, it simultaneously yields a pseudo-Bayes decision rule. Its performance is tested on real data.
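To give a flavor of divergence-based feature ranking, the sketch below scores each feature by the symmetric Kullback J-divergence between the two class-conditional densities. As a simplifying assumption it models each feature with a single univariate Gaussian per class (closed-form J-divergence), not the finite mixtures of the paper; function names are illustrative, not from the original.

```python
import numpy as np

def gaussian_j_divergence(x0, x1):
    """Symmetric Kullback J-divergence, KL(p||q) + KL(q||p),
    between two univariate Gaussians fitted to samples x0 and x1."""
    m0, v0 = x0.mean(), x0.var() + 1e-12  # small floor avoids division by zero
    m1, v1 = x1.mean(), x1.var() + 1e-12
    d2 = (m0 - m1) ** 2
    return (v0 + d2) / (2.0 * v1) + (v1 + d2) / (2.0 * v0) - 1.0

def select_features(X, y, k):
    """Rank features by per-feature J-divergence between classes 0 and 1
    and return the indices of the k highest-scoring features."""
    scores = np.array([gaussian_j_divergence(X[y == 0, j], X[y == 1, j])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Illustration: only feature 0 separates the two classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.repeat([0, 1], 100)
X[y == 1, 0] += 3.0
print(select_features(X, y, 2))  # feature 0 ranks first
```

Because each feature is scored independently, a subset of any cardinality falls out of a single ranking with no combinatorial search, mirroring the search-free property claimed in the abstract (though the paper's mixture model captures multimodal densities that a single Gaussian cannot).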