This paper proposes a novel criterion for estimating the redundancy of selected feature sets in multi-dimensional pattern classification. An effective feature selection process typically maximizes the relevance of features to each class while minimizing the redundancy among the selected features. Unlike relevance, which can be measured directly by mutual information, redundancy is difficult to estimate because its dynamic range varies with the characteristics of the features and classes. Building on a conceptual diagram of the relationship between candidate features, selected features, and class variables, this paper proposes a new criterion that computes the amount of redundancy accurately. Specifically, the redundancy term is estimated by the conditional mutual information between selected and candidate features given each class variable, which avoids the cumbersome normalization step required by conventional algorithms. The proposed criterion is implemented in a speech/music discrimination system to evaluate classification performance. Experimental results obtained by varying the number of selected features verify that the proposed method achieves higher classification accuracy than conventional algorithms.
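To make the criterion concrete, the following is a minimal sketch of a greedy selection loop in which each candidate feature is scored by its relevance to the class minus a redundancy term computed as class-conditional mutual information with the already-selected features, i.e. score(f) = I(f; C) − Σ_{s∈S} I(f; s | C). This is an illustrative reconstruction under simplifying assumptions (discrete-valued features, plug-in probability estimates from counts), not the authors' implementation; all function names and the toy data are hypothetical.

```python
# Hedged sketch: greedy feature selection with a class-conditional
# redundancy term, assuming discrete features and plug-in estimates.
import math
from collections import Counter

def mutual_information(x, y):
    """I(X; Y) in nats for two equal-length discrete sequences."""
    n = len(x)
    pxy = Counter(zip(x, y))          # joint counts
    px, py = Counter(x), Counter(y)   # marginal counts
    return sum((c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def conditional_mutual_information(x, y, z):
    """I(X; Y | Z) = sum_z p(z) * I(X; Y | Z=z) for discrete sequences."""
    n = len(z)
    cmi = 0.0
    for zv, cz in Counter(z).items():
        idx = [i for i in range(n) if z[i] == zv]
        xs = [x[i] for i in idx]
        ys = [y[i] for i in idx]
        cmi += (cz / n) * mutual_information(xs, ys)
    return cmi

def greedy_select(features, labels, k):
    """Pick k features maximizing relevance minus class-conditional redundancy."""
    selected = []
    candidates = list(features)
    while candidates and len(selected) < k:
        def score(name):
            rel = mutual_information(features[name], labels)
            red = sum(conditional_mutual_information(features[name],
                                                     features[s], labels)
                      for s in selected)
            return rel - red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

Note that because the redundancy is conditioned on the class, it needs no extra normalization: each I(f; s | C) is already expressed in the same information units as the relevance term.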