TAKES: a fast method to select features in the kernel space
Proceedings of the 20th ACM international conference on Information and knowledge management
We address the problem of feature selection in a kernel space: selecting the most discriminative and informative features for classification and data analysis. The problem is difficult because the dimension of a kernel space may be infinite, and little prior work has addressed feature selection there. To solve it, we first derive a basis set in the kernel space. Using this basis set, we then extend margin-based feature selection algorithms, which have proven effective even when many features are dependent. The selected features form a subspace of the kernel space, in which different state-of-the-art classification algorithms can be applied. We conduct extensive experiments on real and simulated data to compare our proposed method with four baseline algorithms. Both theoretical analysis and experimental results validate the effectiveness of the proposed method.
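The abstract does not spell out the construction, but the pipeline it describes — build an explicit finite basis of the kernel space, then run a margin-based feature-weighting algorithm over the resulting coordinates — can be sketched as follows. This is an illustrative sketch only: the RBF kernel, the eigendecomposition-based empirical kernel map used as the basis, and the simple RELIEF-style weighting are stand-in choices, not the authors' actual algorithm.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) Gram matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_basis_features(X, gamma=1.0, tol=1e-10):
    # Explicit coordinates of the training points in a finite basis of the
    # kernel space, obtained from an eigendecomposition of the Gram matrix
    # (an empirical kernel map; one possible choice of basis).
    K = rbf_kernel(X, X, gamma)
    w, V = np.linalg.eigh(K)
    keep = w > tol                      # drop numerically null directions
    return K @ V[:, keep] / np.sqrt(w[keep])

def relief_weights(F, y, n_iter=100, rng=None):
    # Basic RELIEF-style margin weighting over the basis coordinates:
    # features that separate nearest "miss" from nearest "hit" gain weight.
    rng = np.random.default_rng(rng)
    n, d = F.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dist = np.abs(F - F[i]).sum(1)  # L1 distances to sample i
        dist[i] = np.inf                # exclude the sample itself
        same = y == y[i]
        same[i] = False
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        w += np.abs(F[i] - F[miss]) - np.abs(F[i] - F[hit])
    return w

# Toy data: two Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

F = kernel_basis_features(X, gamma=0.5)   # coordinates in the kernel-space basis
w = relief_weights(F, y, rng=0)           # per-feature margin weights
top = np.argsort(w)[::-1][:5]             # most discriminative basis features
```

Keeping only the `top`-ranked coordinates yields a low-dimensional subspace of the kernel space on which an ordinary classifier can then be trained, which is the role the selected features play in the paper.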