Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods.
Text Categorization with Support Vector Machines: Learning with Many Relevant Features. ECML '98: Proceedings of the 10th European Conference on Machine Learning.
Duality and Geometry in SVM Classifiers. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
Training Support Vector Machines: An Application to Face Detection. CVPR '97: Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition.
A Novel SVM Geometric Algorithm Based on Reduced Convex Hulls. ICPR '06: Proceedings of the 18th International Conference on Pattern Recognition, Volume 2.
A Geometric Approach to Support Vector Machine (SVM) Classification. IEEE Transactions on Neural Networks.
A Geometric Nearest Point Algorithm for the Efficient Solution of the SVM Classification Task. IEEE Transactions on Neural Networks.
Geometric methods provide an intuitive and theoretically solid viewpoint for solving many optimization problems in pattern recognition and machine learning. Support vector machine (SVM) classification is a typical such optimization task, and it has achieved excellent generalization performance in a wide variety of applications. This paper introduces the notion of the "scaled convex hull" (SCH), through which nonseparable SVM classification problems can be approximately transformed into separable ones: by a suitable choice of the reduction factor, the initially overlapping SCHs (one generated by the training patterns of each class) shrink until they become separable, and the maximal-margin classifier between them can then be trained as an approximation of the standard nonseparable SVM. As a practical application of the SCH framework, the popular Gilbert's algorithm is generalized to solve general (linear and nonlinear, separable and nonseparable) SVM classification problems approximately, both accurately and efficiently. Experiments show that the proposed method can outperform two state-of-the-art methods, an improved sequential minimal optimization (SMO) and Gilbert's algorithm based on the reduced convex hull (RCH), in both the number of kernel evaluations and the execution time.
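The two ideas in the abstract can be sketched concretely: each class's scaled convex hull is obtained by pulling every training point toward the class centroid by a reduction factor, and a Gilbert-style iteration then finds the (approximate) nearest point between the two shrunken hulls, whose direction gives the maximal-margin normal. The following is a minimal linear-kernel sketch under those assumptions; the function names, the fixed reduction factor of 0.1, and the synthetic Gaussian data are illustrative choices, not the authors' implementation.

```python
import numpy as np

def sch_vertices(X, lam):
    """Vertices of the scaled convex hull: each training point is
    pulled toward the class centroid by the reduction factor lam in (0, 1]."""
    m = X.mean(axis=0)
    return lam * X + (1.0 - lam) * m

def gilbert_nearest(P, Q, max_iter=1000, tol=1e-8):
    """Gilbert-style iteration for the minimum-norm point of the
    Minkowski difference conv(P) - conv(Q) (illustrative sketch).
    The returned vector is the normal of the separating hyperplane."""
    w = P[0] - Q[0]                      # any feasible starting point
    for _ in range(max_iter):
        # support point of the difference set in direction -w
        z = P[np.argmin(P @ w)] - Q[np.argmax(Q @ w)]
        # optimality check: w is (nearly) the closest point to the origin
        if w @ w - w @ z < tol:
            break
        # move w to the closest point to the origin on the segment [w, z]
        d = z - w
        t = np.clip(-(w @ d) / (d @ d), 0.0, 1.0)
        w = w + t * d
    return w

# Two overlapping Gaussian classes become separable once each SCH
# is shrunk toward its centroid (reduction factor 0.1).
rng = np.random.default_rng(0)
Xp = rng.normal(loc=[1.0, 1.0], size=(100, 2))
Xn = rng.normal(loc=[-1.0, -1.0], size=(100, 2))
w = gilbert_nearest(sch_vertices(Xp, 0.1), sch_vertices(Xn, 0.1))
```

Note that scaling is an affine map toward the centroid, so the SCH keeps the class centroid fixed while its diameter shrinks linearly with the factor; the nonlinear case replaces all inner products above with kernel evaluations, which is why kernel-evaluation count is the natural cost measure in the experiments.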