Variable Selection: A Statistical Dependence Perspective
ICMLA '10 Proceedings of the 2010 Ninth International Conference on Machine Learning and Applications
Feature selection is a topic of growing interest, mainly due to the increasing amount of available information, and is an essential task in many machine learning problems with high-dimensional data. Selecting a subset of relevant features helps to reduce the complexity of the problem and to build robust learning models. This work presents an adaptation of a recent quadratic programming feature selection technique that identifies redundancy and relevance in the data in a single step. Our approach introduces a non-probabilistic relevance measure based on Minimum Spanning Trees. Three real datasets were used to assess the performance of the adaptation. The results are encouraging and reflect the utility of feature selection algorithms.
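To illustrate the kind of formulation the abstract refers to, below is a minimal sketch of generic quadratic programming feature selection: feature weights x are found by minimizing (1 - alpha) * x'Qx - alpha * f'x over the simplex, where Q captures pairwise redundancy and f per-feature relevance. This is an assumption-laden illustration of the standard QPFS objective, not the paper's method; in particular, the MST-based relevance measure the authors propose is not reproduced here, and the toy Q, f, and alpha values are invented for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

def qpfs(Q, f, alpha=0.5):
    """Sketch of quadratic programming feature selection.

    Minimizes (1 - alpha) * x'Qx - alpha * f'x subject to
    x >= 0 and sum(x) = 1, trading off redundancy (Q) against
    relevance (f) in a single optimization.
    """
    n = len(f)
    objective = lambda x: (1.0 - alpha) * (x @ Q @ x) - alpha * (f @ x)
    result = minimize(
        objective,
        np.full(n, 1.0 / n),                 # start from uniform weights
        bounds=[(0.0, 1.0)] * n,             # x >= 0
        constraints=[{"type": "eq",          # weights sum to one
                      "fun": lambda x: x.sum() - 1.0}],
        method="SLSQP",
    )
    return result.x  # larger weight = more useful feature

# Toy data (hypothetical): features 0 and 1 are mutually redundant,
# feature 2 is independent; all three are equally relevant.
Q = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
f = np.array([1.0, 1.0, 1.0])
w = qpfs(Q, f)
```

With equal relevance, the redundancy term dominates, so the independent feature receives the largest weight and the two correlated features share the remainder.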