References
Small Sample Size Effects in Statistical Pattern Recognition: Recommendations for Practitioners. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Elements of Information Theory.
Input Feature Selection by Mutual Information Based on Parzen Window. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Pattern Classification (2nd Edition).
On the Choice of Smoothing Parameters for Parzen Estimators of Probability Density Functions. IEEE Transactions on Computers.
Gait feature subset selection by mutual information. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans.
Asymptotic theory of greedy approximations to minimal k-point random graphs. IEEE Transactions on Information Theory.
Feature Selection for Gender Classification. IbPRIA'11: Proceedings of the 5th Iberian Conference on Pattern Recognition and Image Analysis.
Feature subset selection applied to model-free gait recognition. Image and Vision Computing.
Guo and Nixon proposed a feature-selection method based on maximizing I(x; Y), the multidimensional mutual information between the feature vector x and the class variable Y. Because computing I(x; Y) exactly is difficult in practice, they used an approximation of I(x; Y) as the selection criterion. We show that this criterion arises from approximating the joint probability distributions in I(x; Y) by second-order product distributions. We remark on the limitations of this approximation and discuss computationally attractive alternatives for computing I(x; Y).
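For concreteness, here is a minimal sketch of the quantities involved on discrete data: a plug-in estimate of the multidimensional I(x; Y) obtained by encoding each feature vector as a single symbol (feasible only for a handful of features, which is what motivates low-order approximations), alongside an illustrative mRMR-style second-order surrogate. The function names and the particular surrogate are our own illustration under these assumptions, not Guo and Nixon's exact criterion.

```python
import numpy as np
from itertools import combinations

def mutual_information(x, y):
    """Plug-in estimate of I(x; y) in nats for two discrete 1-D arrays."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                py = np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def joint_mi(X, y):
    """Plug-in estimate of the multidimensional I(x; Y): encode each row
    of X as one discrete symbol.  The joint alphabet grows exponentially
    with the number of features, so this is only feasible in very low
    dimensions -- hence the appeal of product-distribution approximations."""
    _, codes = np.unique(np.asarray(X), axis=0, return_inverse=True)
    return mutual_information(codes, y)

def second_order_criterion(X, y):
    """Illustrative second-order surrogate (mRMR-style; an assumption,
    not Guo and Nixon's exact criterion): per-feature relevance minus
    pairwise feature redundancy."""
    X = np.asarray(X)
    d = X.shape[1]
    relevance = sum(mutual_information(X[:, i], y) for i in range(d))
    redundancy = sum(mutual_information(X[:, i], X[:, j])
                     for i, j in combinations(range(d), 2))
    return relevance - redundancy
```

On an XOR-labelled dataset, for example, `joint_mi` correctly reports one bit of information while any criterion built from first- and second-order marginal terms reports zero, illustrating the kind of limitation such approximations can have.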