We introduce an embedded method that selects relevant features during classifier construction by penalizing each feature's use in the dual formulation of support vector machines (SVM). This approach, called kernel-penalized SVM (KP-SVM), optimizes the shape of an anisotropic RBF kernel, eliminating features with low relevance for the classifier. Additionally, KP-SVM employs an explicit stopping condition, avoiding the elimination of features that would degrade the classifier's performance. We performed experiments on four real-world benchmark problems, comparing our approach with well-known feature selection techniques. KP-SVM outperformed the alternative approaches and consistently identified fewer relevant features.
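To make the core idea concrete, the sketch below shows an anisotropic RBF kernel in which each feature has its own width parameter, together with a simple thresholding step that removes low-relevance features. This is an illustrative assumption-laden sketch, not the authors' KP-SVM algorithm: the function names, the threshold, and the pruning rule are hypothetical, and the actual method learns the widths by penalized optimization of the SVM dual with an explicit stopping condition.

```python
import numpy as np

def anisotropic_rbf(X1, X2, gamma):
    """Anisotropic RBF kernel: k(x, z) = exp(-sum_j gamma_j * (x_j - z_j)^2).
    Each feature j has its own width gamma[j]; a feature with gamma_j = 0
    contributes nothing to the kernel, i.e. it is effectively deselected."""
    diff = X1[:, None, :] - X2[None, :, :]            # shape (n1, n2, d)
    return np.exp(-np.einsum("ijd,d->ij", diff ** 2, gamma))

def prune_features(gamma, threshold=1e-3):
    """Illustrative elimination step (hypothetical rule): zero out widths
    below a threshold, removing those features' influence on the kernel."""
    gamma = gamma.copy()
    gamma[gamma < threshold] = 0.0
    return gamma
```

In this sketch, setting a feature's width to zero plays the role of feature elimination: the kernel, and hence the resulting SVM decision function, becomes invariant to that feature.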