The usual dimensionality reduction technique in supervised learning is linear discriminant analysis (LDA), but it suffers from the singularity (undersampled) problem. On the other hand, a regular support vector machine (SVM) separates the data along only a single direction of maximum margin, so its classification accuracy may not be good enough. In this letter, a recursive SVM (RSVM) is presented that obtains several orthogonal directions, each of which best separates the data with maximum margin. Theoretical analysis shows that a complete orthogonal basis can be derived in the feature subspace spanned by the training samples, and that in linearly separable cases the margin decreases along the recursive components. The result is a new dimensionality reduction technique based on multilevel maximum-margin components, and in turn a classifier of high accuracy. Experiments on synthetic and several real data sets show that RSVM with multilevel maximum-margin features performs efficient dimensionality reduction and outperforms the regular SVM on binary classification problems.
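The recursion described above can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes a standard linear SVM fit at each step, keeps the normalized weight vector as a maximum-margin direction, and then deflates the data onto the orthogonal complement so each successive direction is orthogonal to the previous ones. The function name `recursive_svm_directions` and all parameter choices are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def recursive_svm_directions(X, y, n_components=2):
    """Sketch of recursive SVM feature extraction: repeatedly fit a
    linear SVM, record its weight direction, and deflate the data so
    the next direction is orthogonal to all previous ones."""
    Xr = X.astype(float).copy()
    directions = []
    for _ in range(n_components):
        clf = SVC(kernel="linear", C=1.0).fit(Xr, y)
        w = clf.coef_.ravel()
        # Gram-Schmidt step: keep w strictly orthogonal to earlier directions.
        for d in directions:
            w = w - (w @ d) * d
        w = w / np.linalg.norm(w)
        directions.append(w)
        # Deflate: remove each sample's component along w before refitting.
        Xr = Xr - np.outer(Xr @ w, w)
    return np.array(directions)  # shape (n_components, n_features)

# Toy binary problem: only features 0 and 1 carry class information.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
W = recursive_svm_directions(X, y, n_components=2)
Z = X @ W.T  # reduced two-dimensional representation of the data
```

Projecting onto `W` yields the multilevel maximum-margin features the abstract refers to; a downstream classifier can then be trained on `Z` instead of the full-dimensional `X`.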