Identifying a subset of features that preserves classification accuracy is a problem of growing importance, because of the increasing size and dimensionality of real-world data sets. We propose a new feature selection method, named Quadratic Programming Feature Selection (QPFS), that reduces the task to a quadratic optimization problem. In order to limit the computational complexity of solving the optimization problem, QPFS uses the Nyström method for approximate matrix diagonalization. QPFS is thus capable of dealing with very large data sets, for which the use of other methods is computationally expensive. In experiments with small and medium data sets, the QPFS method leads to classification accuracy similar to that of other successful techniques. For large data sets, QPFS is superior in terms of computational efficiency.