Gradient dynamical systems with discontinuous right-hand sides are designed, using Persidskii-type nonsmooth Lyapunov functions, to work as support vector machines (SVMs) for the discrimination of nonseparable classes. The gradient systems are obtained by applying an exact penalty method to the constrained quadratic optimization problems that formulate two well-known SVMs. Global convergence of the trajectories of the gradient dynamical systems to the solution of the corresponding constrained problems is shown to hold independently of the penalty parameters and of the SVM parameters. The proposed gradient systems can be implemented as simple analog circuits or integrated with standard ODE software; to permit efficient integration methods with adaptive stepsize selection, the discontinuous terms are smoothed in a neighborhood of the discontinuity surface by means of the boundary-layer technique. The scalability of the proposed gradient systems is also demonstrated by an implementation on parallel computers, which yields smaller processing times than traditional SVM packages.
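As a minimal sketch of the idea (not the authors' implementation), the following Python fragment applies an exact penalty to the standard C-SVM dual problem, smooths the resulting discontinuous sign terms with a saturation (boundary-layer) function of width eps, and integrates the gradient flow with an adaptive Runge-Kutta method. The data set, kernel choice, and all parameter values (C, mu, eps, the integration horizon) are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-class data (illustrative assumption), linear kernel
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.6, (5, 2)), rng.normal(1.0, 0.6, (5, 2))])
y = np.array([-1.0] * 5 + [1.0] * 5)
C, mu, eps = 1.0, 10.0, 1e-2        # box bound, penalty weight, boundary-layer width

Q = (y[:, None] * y[None, :]) * (X @ X.T)   # Q_ij = y_i y_j <x_i, x_j>

def sat(s):
    """Boundary-layer smoothing of sign(s): linear inside |s| < eps."""
    return np.clip(s / eps, -1.0, 1.0)

def rhs(t, a):
    grad_f = Q @ a - 1.0                             # gradient of the dual objective
    pen = (mu * sat(y @ a) * y                       # penalty term for y^T a = 0
           - mu * np.clip(-a / eps, 0.0, 1.0)        # penalty term for a_i >= 0
           + mu * np.clip((a - C) / eps, 0.0, 1.0))  # penalty term for a_i <= C
    return -(grad_f + pen)                           # gradient descent flow

sol = solve_ivp(rhs, (0.0, 20.0), np.zeros(len(y)), method='RK45', rtol=1e-6)
alpha = sol.y[:, -1]                                 # approximate dual solution
```

Provided the penalty weight mu dominates the magnitudes of the relevant Lagrange multipliers, the trajectory settles in an O(eps) neighborhood of the feasible set, so the terminal state satisfies the equality and box constraints up to the boundary-layer width.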