Consider the problem of discriminating two Gaussian signals using only a finite number of linear observables. Choosing the set of n observables that minimizes the error probability P_e is a difficult problem. Because H, the Hellinger integral, and H^2 form an upper and a lower bound on P_e, we minimize H instead. We find that the set of observables minimizing H is a set of coefficients of the simultaneously orthogonal expansions of the two signals. The same set of observables also maximizes the Hájek J-divergence.
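The abstract does not state the constants in the Hellinger bounds; one common form, for two equally likely hypotheses, is H^2/4 <= P_e <= H/2. The sketch below checks that form in the simplest case of two unit-variance scalar Gaussians, where both H and the Bayes error P_e have closed forms. The specific means, variance, and helper names here are illustrative assumptions, not taken from the paper.

```python
import math

def hellinger_integral(mu0, mu1, sigma):
    # Closed form of H = integral of sqrt(p0 * p1) dx for two Gaussian
    # densities with means mu0, mu1 and a common standard deviation sigma.
    return math.exp(-((mu1 - mu0) ** 2) / (8.0 * sigma ** 2))

def bayes_error(mu0, mu1, sigma):
    # Bayes error for equal priors: P_e = Phi(-|mu1 - mu0| / (2 sigma)),
    # where Phi is the standard normal CDF, written here via math.erf.
    d = abs(mu1 - mu0) / (2.0 * sigma)
    return 0.5 * (1.0 + math.erf(-d / math.sqrt(2.0)))

# Illustrative parameters (assumed, not from the paper).
mu0, mu1, sigma = 0.0, 1.0, 1.0
H = hellinger_integral(mu0, mu1, sigma)
Pe = bayes_error(mu0, mu1, sigma)

# Check the hedged bound H^2/4 <= P_e <= H/2 for this example.
assert H ** 2 / 4.0 <= Pe <= H / 2.0
print(f"H = {H:.4f}, P_e = {Pe:.4f}")
```

Because H is available in closed form while P_e generally is not, minimizing H over candidate observable sets is the tractable surrogate the abstract describes.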