The Support Vector Machine (SVM) is an effective classifier, but a vital shortcoming is that it requires substantial computation on large-scale learning tasks. Sample selection is a feasible strategy for overcoming this problem. In order to reduce the number of training samples without sacrificing recognition accuracy, this paper presents a novel sample selection approach named Kernel Subclass Convex Hull (KSCH), which tries to select the boundary samples of each class convex hull. The idea is derived from the geometrical interpretation of the SVM: geometrically, constructing an SVM can be converted into the problem of computing the nearest points between two convex hulls, so the convex hull of each class effectively determines the SVM separating plane. Since the convex hull of a set is constructed entirely from its boundary samples, training an SVM on the boundary samples of each class is equivalent to training it on all samples. Based on this idea, KSCH iteratively selects boundary samples of each class convex hull in the high-dimensional feature space induced by the kernel trick. The convex hull of the chosen set is called the subclass convex hull. As the size of the chosen set grows, each subclass convex hull rapidly approximates the corresponding class convex hull, so the samples selected by our method efficiently represent the original training set and support SVM classification. Experimental results on the MIT-CBCL and UMIST face databases show that KSCH can select fewer, higher-quality samples while maintaining the recognition accuracy of the SVM.
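The abstract does not give the KSCH procedure in detail, but the core idea — iteratively picking samples that lie on the boundary of a class convex hull in the kernel-induced feature space — can be illustrated with a simplified sketch. The code below is an assumption-laden stand-in, not the authors' exact algorithm: it uses greedy farthest-point selection under an RBF kernel, exploiting the fact that feature-space distances can be computed from kernel values alone, ||φ(x)−φ(y)||² = k(x,x) − 2k(x,y) + k(y,y). Points far from the current selection in feature space tend to lie on the hull boundary. The function names and the γ parameter are illustrative only.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise RBF kernel values k(x, y) = exp(-gamma * ||x - y||^2).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def select_boundary_samples(X, n_select, gamma=0.5):
    """Greedy farthest-point selection in the kernel-induced feature space.

    A simplified stand-in for KSCH-style selection: it seeds with the sample
    farthest from the class mean in feature space, then repeatedly adds the
    sample whose minimum feature-space distance to the chosen set is largest.
    Run it once per class; the union of the chosen samples then trains the SVM.
    """
    K = rbf_kernel(X, X, gamma)            # Gram matrix k(x_i, x_j)
    diag = np.diag(K)                      # k(x_i, x_i) (all ones for RBF)
    # ||phi(x_i) - mean||^2 = k(i,i) - (2/n) sum_j k(i,j) + const
    start = int(np.argmax(diag - 2.0 * K.mean(axis=1)))
    chosen = [start]
    while len(chosen) < n_select:
        # Squared feature-space distance from every point to each chosen point.
        d2 = diag[:, None] - 2.0 * K[:, chosen] + diag[chosen][None, :]
        min_d2 = d2.min(axis=1)            # distance to the chosen set
        min_d2[chosen] = -np.inf           # never re-pick a chosen sample
        chosen.append(int(np.argmax(min_d2)))
    return np.array(chosen)
```

Applying this per class and training an SVM only on the returned indices mirrors the paper's workflow: the subclass convex hull spanned by the selected samples grows toward the full class convex hull as `n_select` increases.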