Microarray data are typically characterized by a large number of genes and a relatively small number of samples. These characteristics pose challenges for both sample classification and relevant gene selection. The support vector machine (SVM) is a widely used classification technique, and previous studies have demonstrated its superior classification performance on microarray data. However, a major limitation is that the SVM cannot perform automatic gene selection. To overcome this limitation, we propose the hybrid huberized support vector machine (HHSVM). The HHSVM uses the huberized hinge loss function together with the elastic-net penalty, which yields two major benefits: (1) automatic gene selection, and (2) the grouping effect, whereby highly correlated genes tend to be selected or removed together. We also develop an efficient algorithm that computes the entire regularized solution path for the HHSVM. We have applied our method to real microarray data and achieved promising results.
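To make the two building blocks concrete, the following is a minimal sketch of the HHSVM objective: a huberized hinge loss (here in one common piecewise parameterization with an illustrative smoothing parameter `delta`) plus an elastic-net penalty whose L1 term drives gene selection and whose L2 term produces the grouping effect. The function names, `delta` value, and penalty weights are assumptions for illustration; the paper's exact parameterization and the solution-path algorithm are not reproduced here.

```python
import numpy as np

def huberized_hinge(t, delta=2.0):
    """Huberized hinge loss on the margin t = y * f(x); a quadratic
    smoothing of the hinge (delta is an illustrative bending parameter):
      0                        if t > 1
      (1 - t)**2 / (2*delta)   if 1 - delta < t <= 1
      1 - t - delta/2          if t <= 1 - delta
    """
    t = np.asarray(t, dtype=float)
    return np.where(
        t > 1.0,
        0.0,
        np.where(t > 1.0 - delta, (1.0 - t) ** 2 / (2.0 * delta),
                 1.0 - t - delta / 2.0),
    )

def elastic_net_penalty(beta, lam1, lam2):
    """Elastic-net penalty: the L1 (lasso) term zeroes out coefficients,
    giving automatic gene selection; the L2 (ridge) term encourages
    correlated genes to enter or leave the model together."""
    beta = np.asarray(beta, dtype=float)
    return lam1 * np.abs(beta).sum() + 0.5 * lam2 * (beta ** 2).sum()

def hhsvm_objective(beta, beta0, X, y, lam1, lam2):
    """Regularized empirical risk: mean huberized hinge loss over the
    samples plus the elastic-net penalty on the gene coefficients."""
    margins = y * (X @ beta + beta0)
    return huberized_hinge(margins).mean() + elastic_net_penalty(beta, lam1, lam2)
```

The solution-path algorithm mentioned in the abstract traces the minimizers of this objective as the regularization parameter varies; the loss being piecewise quadratic is what makes that path piecewise linear and efficiently computable.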