A standard quadratic optimization problem (StQP) consists of maximizing a (possibly indefinite) quadratic form over the standard simplex. Likewise, in a multi-StQP we maximize a (possibly indefinite) quadratic form over the Cartesian product of several standard simplices (of possibly different dimensions). Among many other applications, multi-StQPs occur in machine learning problems. We establish several convergent, monotone interior point methods, which differ from the ones usually employed in cone programming. Further, we prove an exact cone programming reformulation for establishing rigorous yet affordable bounds and for finding improving directions.
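To make the problem class concrete, the following sketch maximizes a quadratic form over the standard simplex via discrete replicator dynamics, a classical monotone iteration for StQPs (for a symmetric nonnegative matrix, each step cannot decrease the objective). This is only an illustrative baseline under those assumptions, not the interior point methods developed in the paper; the function names and the example matrix are hypothetical.

```python
import numpy as np

def replicator_step(A, x):
    """One discrete replicator step on the standard simplex.

    For a symmetric entrywise-nonnegative A, this update is monotone:
    the StQP objective x^T A x does not decrease (Baum-Eagon inequality).
    """
    Ax = A @ x
    return x * Ax / (x @ Ax)

def solve_stqp(A, n_iter=200, seed=0):
    """Approximately maximize x^T A x over {x >= 0, sum(x) = 1}."""
    rng = np.random.default_rng(seed)
    x = rng.random(A.shape[0])
    x /= x.sum()               # project the start point onto the simplex
    for _ in range(n_iter):
        x = replicator_step(A, x)
    return x

# Hypothetical 3x3 instance with a dominant 2x2 block:
A = np.array([[1.0, 2.0, 0.5],
              [2.0, 1.0, 0.5],
              [0.5, 0.5, 1.0]])
x = solve_stqp(A)
```

A multi-StQP would instead keep one such variable block per simplex and update the blocks cyclically (Gauss-Seidel fashion), each block step using the gradient of the joint quadratic form restricted to that block.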