Duality and Geometry in SVM Classifiers
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
A tutorial on support vector regression
Statistics and Computing
A generalized S-K algorithm for learning v-SVM classifiers
Pattern Recognition Letters
Dual unification of bi-class support vector machine formulations
Pattern Recognition
A general soft method for learning SVM classifiers with L1-norm penalty
Pattern Recognition
On the Equivalence of the SMO and MDM Algorithms for SVM Training
ECML PKDD '08 Proceedings of the 2008 European Conference on Machine Learning and Knowledge Discovery in Databases - Part I
Cycle-breaking acceleration of SVM training
Neurocomputing
Rapid and brief communication: Unified dual for bi-class SVM approaches
Pattern Recognition
A common framework for the convergence of the GSK, MDM and SMO algorithms
ICANN'10 Proceedings of the 20th international conference on Artificial neural networks: Part II
Simple solvers for large quadratic programming tasks
PR'05 Proceedings of the 27th DAGM conference on Pattern Recognition
A fast iterative nearest point algorithm for support vector machine classifier design
IEEE Transactions on Neural Networks
A geometric approach to Support Vector Machine (SVM) classification
IEEE Transactions on Neural Networks
A Geometric Nearest Point Algorithm for the Efficient Solution of the SVM Classification Task
IEEE Transactions on Neural Networks
The nearest point problem (NPP), i.e., finding the closest points between two disjoint convex hulls, has two classical solutions, the Gilbert-Schlesinger-Kozinec (GSK) and Mitchell-Dem'yanov-Malozemov (MDM) algorithms. When the convex hulls do intersect, NPP has to be stated in terms of reduced convex hulls (RCHs), made up of convex pattern combinations whose coefficients are bounded by a μ
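The MDM iteration mentioned above can be sketched for the classical single-hull form of NPP (finding the point of a convex hull closest to the origin, to which the two-hull problem reduces via the Minkowski difference). This is a minimal illustrative sketch, not the paper's exact algorithm; the function name, tolerance, and stopping rule are assumptions.

```python
def mdm_nearest_point(points, tol=1e-8, max_iter=1000):
    """Illustrative MDM sketch: nearest point of conv(points) to the origin.

    points: list of equal-length coordinate tuples (hypothetical interface).
    Returns the current hull point w and its convex coefficients alpha.
    """
    n, d = len(points), len(points[0])
    alpha = [1.0 / n] * n  # start at the centroid of the hull
    w = [sum(a * p[j] for a, p in zip(alpha, points)) for j in range(d)]
    for _ in range(max_iter):
        dots = [sum(w[j] * p[j] for j in range(d)) for p in points]
        # i_max: worst supporting point (alpha > 0), i_min: best candidate
        i_max = max((i for i in range(n) if alpha[i] > 0), key=lambda i: dots[i])
        i_min = min(range(n), key=lambda i: dots[i])
        if dots[i_max] - dots[i_min] < tol:  # duality gap closed: optimal
            break
        # Move weight from i_max to i_min along d = x_min - x_max,
        # with the line-search step clipped to keep alpha feasible.
        delta_dir = [points[i_min][j] - points[i_max][j] for j in range(d)]
        denom = sum(c * c for c in delta_dir)
        if denom == 0:
            break
        step = -sum(w[j] * delta_dir[j] for j in range(d)) / denom
        step = max(0.0, min(step, alpha[i_max]))
        alpha[i_max] -= step
        alpha[i_min] += step
        w = [w[j] + step * delta_dir[j] for j in range(d)]
    return w, alpha
```

For example, for the hull of (1,1), (2,1), (1,2), which does not contain the origin, the iteration converges to the vertex (1,1), the hull point nearest the origin.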