Feature elimination aims to reduce the size of the input feature set while retaining the class-discriminatory information needed for classification. This paper surveys feature-selection approaches for classification problems and proposes a new feature-selection algorithm based on mutual information (MI) from information theory. Rather than computing the MI between a single input feature and the class, the proposed algorithm computes the MI between combinations of input features and the class, for both continuous-valued and discrete-valued features. Three experiments evaluate the proposed algorithm, and comparisons with previously published classification algorithms indicate that it is robust, stable and efficient.
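To illustrate the core idea (not the paper's exact estimator), the sketch below computes the MI between a *combination* of discrete features and the class by counting joint frequencies. The XOR example shows why this matters: each feature alone carries zero MI about the class, while the pair together is fully informative. Continuous-valued features would additionally require discretization or density estimation, which this sketch omits.

```python
import numpy as np
from collections import Counter

def mutual_information(features, labels):
    """Estimate I(X_S; Y) for a set of discrete features via joint frequency counts.

    features: 2-D array, one row per sample, columns are the selected feature subset.
    labels:   1-D array of class labels.
    """
    n = len(labels)
    xy = Counter(zip(map(tuple, features), labels))  # joint counts of (feature tuple, class)
    x = Counter(map(tuple, features))                # marginal counts of feature tuples
    y = Counter(labels)                              # marginal counts of classes
    mi = 0.0
    for (xv, yv), c in xy.items():
        # p(x,y) / (p(x) p(y)) simplifies to c * n / (count_x * count_y)
        mi += (c / n) * np.log2(c * n / (x[xv] * y[yv]))
    return mi

# XOR: neither feature alone predicts the class, but their combination does.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 25)
y = X[:, 0] ^ X[:, 1]
mi_single = mutual_information(X[:, [0]], y)  # ≈ 0 bits
mi_pair = mutual_information(X, y)            # ≈ 1 bit
```

A selection procedure built on this would score candidate feature subsets by their joint MI with the class, rather than summing per-feature MI scores, at the cost of needing more samples to estimate the higher-dimensional joint distribution reliably.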