The Minimization of Semicontinuous Functions: Mollifier Subgradients. SIAM Journal on Control and Optimization.
Machine Learning.
Convergence of the simulated annealing algorithm for continuous global optimization. Journal of Optimization Theory and Applications.
Global random optimization by simultaneous perturbation stochastic approximation. Proceedings of the 33rd Conference on Winter Simulation.
Simulated Annealing: A Proof of Convergence. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Support Vector and Kernel Methods. Intelligent Data Analysis.
Nonsmooth Optimization Techniques for Semisupervised Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Oblique Support Vector Machines. Informatica.
Optimization and Knowledge-Based Technologies. Informatica.
In this paper we consider semi-supervised binary classification by Support Vector Machines (SVM), formulated as an unconstrained, non-smooth optimization problem in which part of the available data is unlabelled. Because the resulting objective function is non-convex and non-differentiable, and therefore difficult to minimize, we apply non-smooth optimization techniques. We explore and compare the properties of Simulated Annealing and of Simultaneous Perturbation Stochastic Approximation (SPSA) algorithms (SPSA with the Lipschitz Perturbation Operator, SPSA with the Uniform Perturbation Operator, and Standard Finite Difference Approximation) for semi-supervised SVM classification. Numerical results are reported for several standard test problems drawn from the binary classification literature, and the performance of the resulting classifiers is evaluated by Receiver Operating Characteristic (ROC) analysis.
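To illustrate the kind of method the abstract describes, the following is a minimal sketch of basic SPSA applied to a toy non-smooth, non-convex objective standing in for the semi-supervised SVM loss. This is not the paper's algorithm (the Lipschitz and Uniform perturbation variants are not reproduced here); the gain-sequence constants, the Bernoulli perturbations, and the toy objective `f` are all illustrative assumptions.

```python
import numpy as np

def spsa_minimize(f, x0, n_iter=2000, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    """Minimize a (possibly non-smooth) function f with basic SPSA.

    At each iteration the gradient is approximated from only two
    evaluations of f, taken along a random +/-1 (Bernoulli)
    perturbation of all coordinates simultaneously.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(1, n_iter + 1):
        ak = a / k**alpha                 # decaying step size
        ck = c / k**gamma                 # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        # Two-sided estimate: ghat_i = (f(x+ck*delta) - f(x-ck*delta)) / (2*ck*delta_i)
        ghat = (f(x + ck * delta) - f(x - ck * delta)) / (2.0 * ck * delta)
        x = x - ak * ghat
    return x

# Toy non-smooth, non-convex objective (illustrative assumption, not the
# S3VM loss from the paper); its minimizer is at w = (1, -2).
f = lambda w: abs(w[0] - 1) + abs(w[1] + 2) + 0.1 * np.sin(5 * w[0])
w = spsa_minimize(f, x0=[0.0, 0.0])
```

The appeal of SPSA in this setting is that each step costs only two function evaluations regardless of dimension, and no gradient of the (non-differentiable) objective is ever required.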