The success of many learning algorithms hinges on the reliable selection or construction of a highly predictive set of features. Kernel-based feature weighting bridges the gap between feature extraction and subset selection. This paper presents a rigorous derivation of the Kernel-Relief algorithm and evaluates it against other state-of-the-art techniques. To keep the method practical, an online sparsification procedure is incorporated into the basis-construction process by treating the kernel bases as a causal series. The resulting sparse Kernel-Relief algorithm not only produces nonlinear features with extremely sparse kernel expansions but also reduces computational complexity significantly.
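To make the online sparsification step concrete, below is a minimal Python sketch of one common way to build a sparse kernel basis from a stream of samples: an approximate-linear-dependence (ALD) admission test in the spirit of kernel recursive least squares. A sample joins the dictionary only if it cannot be represented, up to a tolerance, as a linear combination of the bases already retained. The class name `OnlineSparsifier`, the RBF kernel choice, and the threshold `nu` are illustrative assumptions, not the paper's actual interface or criterion.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Gaussian (RBF) kernel between two sample vectors (assumed choice).
    return np.exp(-gamma * np.sum((x - y) ** 2))

class OnlineSparsifier:
    """Sketch of ALD-style online sparsification: admit a streaming
    sample as a new kernel basis only if its projection residual onto
    the span of the current dictionary exceeds the threshold `nu`."""

    def __init__(self, kernel=rbf, nu=1e-3):
        self.kernel = kernel
        self.nu = nu            # admission threshold on the squared residual
        self.dictionary = []    # retained basis samples
        self.K_inv = None       # inverse Gram matrix of the dictionary

    def observe(self, x):
        # First sample always seeds the dictionary.
        if not self.dictionary:
            self.dictionary.append(x)
            self.K_inv = np.array([[1.0 / self.kernel(x, x)]])
            return True
        k_vec = np.array([self.kernel(d, x) for d in self.dictionary])
        k_xx = self.kernel(x, x)
        a = self.K_inv @ k_vec        # best coefficients over current bases
        delta = k_xx - k_vec @ a      # squared residual of the projection
        if delta <= self.nu:
            return False              # nearly linearly dependent: skip x
        # Schur-complement rank-one update of the inverse Gram matrix.
        m = len(self.dictionary)
        K_inv_new = np.zeros((m + 1, m + 1))
        K_inv_new[:m, :m] = self.K_inv + np.outer(a, a) / delta
        K_inv_new[:m, m] = -a / delta
        K_inv_new[m, :m] = -a / delta
        K_inv_new[m, m] = 1.0 / delta
        self.K_inv = K_inv_new
        self.dictionary.append(x)
        return True

# Usage: feed a stream of samples and count how many become kernel bases.
rng = np.random.default_rng(0)
sparsifier = OnlineSparsifier(nu=1e-2)
stream = rng.normal(size=(200, 5))
kept = sum(sparsifier.observe(x) for x in stream)
print(f"retained {kept} of {len(stream)} samples as kernel bases")
```

Processing samples in arrival order is what the "causal series" assumption buys: each admission decision depends only on bases seen so far, so the Gram-matrix inverse can be maintained incrementally instead of being refactored from scratch, which is the source of the computational savings the abstract refers to.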