In this article, we discuss issues concerning formulations of support vector machines (SVM) from an optimization point of view. First, SVMs map training data into a higher- (possibly infinite-) dimensional space. Currently, primal and dual formulations of the SVM are derived in finite-dimensional spaces and assumed to extend readily to the infinite-dimensional case. We rigorously discuss the primal-dual relation in infinite-dimensional spaces. Second, SVM formulations contain penalty terms, which differ from unconstrained penalty functions in optimization. Traditionally, unconstrained penalty functions approximate a constrained problem as the penalty parameter increases. We are interested in similar properties for SVM formulations. For two of the most popular SVM formulations, we show that one enjoys the properties of exact penalty functions, while the other behaves only like traditional penalty functions, which converge to the constrained solution only as the penalty parameter goes to infinity.
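The exact-versus-traditional penalty distinction can be illustrated on a one-dimensional toy problem (not taken from the article; the problem, the function names, and the closed-form minimizers below are assumptions of this sketch). A linear (hinge-like) penalty pins the minimizer at the constrained solution for every sufficiently large finite C, while a squared penalty only approaches it as C grows without bound:

```python
# Toy constrained problem: minimize x^2 subject to x >= 1 (solution x* = 1).
# Two unconstrained penalty formulations, loosely mirroring the two SVM
# loss choices discussed in the abstract (hypothetical illustration):
#   linear penalty:   x^2 + C * max(0, 1 - x)
#   squared penalty:  x^2 + C * max(0, 1 - x)^2

def argmin_linear(C):
    # For x < 1 the derivative is 2x - C, so the interior minimizer is x = C/2.
    # Once C >= 2, the unconstrained minimum sits exactly at x = 1:
    # an *exact* penalty for a finite penalty parameter.
    return min(C / 2.0, 1.0)

def argmin_squared(C):
    # For x < 1 the derivative is 2x - 2C(1 - x), giving x = C/(1 + C) < 1:
    # the constraint is satisfied only in the limit C -> infinity,
    # the behavior of a *traditional* penalty function.
    return C / (1.0 + C)

for C in [1.0, 2.0, 10.0, 1000.0]:
    print(f"C={C:7.1f}  linear x*={argmin_linear(C):.6f}  "
          f"squared x*={argmin_squared(C):.6f}")
```

For C = 10, the linear penalty already returns x* = 1 exactly, whereas the squared penalty returns x* = 10/11 and never reaches 1 for any finite C.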