In this paper, we study the problem of feature extraction for pattern classification applications. RELIEF is widely regarded as one of the best-performing algorithms for assessing the quality of features for pattern classification. Its extension, local feature extraction (LFE), was proposed recently and shown to outperform RELIEF. In this paper, we extend LFE to the nonlinear case and develop a new algorithm called kernel LFE (KLFE). Compared with other feature extraction algorithms, KLFE enjoys attractive properties such as low computational complexity and a high probability of identifying relevant features, because KLFE is a nonlinear wrapper feature extraction method that reduces to solving a simple convex optimization problem. Experimental results show that KLFE outperforms the existing algorithms.
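Since LFE and KLFE build on RELIEF-style feature weighting, a minimal sketch of that underlying idea may be useful. This is an illustrative simplification, not the paper's algorithm: the function name, the use of L1 distances, and the full pass over all samples are assumptions made here for clarity.

```python
import numpy as np

def relief_weights(X, y):
    """Simplified RELIEF-style feature weighting (illustrative sketch).

    For each sample, find its nearest "hit" (same class) and nearest
    "miss" (different class), then reward features that separate the
    miss and penalize features that separate the hit.
    """
    n, d = X.shape
    w = np.zeros(d)
    for i in range(n):
        diff = np.abs(X - X[i])        # per-feature distances to sample i
        dist = diff.sum(axis=1)        # L1 distance to sample i
        dist[i] = np.inf               # exclude the sample itself
        same = (y == y[i])
        hit = np.argmin(np.where(same, dist, np.inf))   # nearest same-class
        miss = np.argmin(np.where(~same, dist, np.inf)) # nearest other-class
        w += diff[miss] - diff[hit]
    return w / n

# Toy usage: feature 0 separates the classes, feature 1 is noise,
# so feature 0 should receive the larger weight.
X = np.array([[0.0, 0.5], [0.1, 0.4], [1.0, 0.45], [1.1, 0.55]])
y = np.array([0, 0, 1, 1])
weights = relief_weights(X, y)
```

LFE extends this local hit/miss idea from feature weighting to feature extraction, and KLFE applies it in a kernel-induced feature space to capture nonlinear structure.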