A new polynomial-time algorithm for linear programming. Combinatorica.
On the worst-case arithmetic complexity of approximating zeros of polynomials. Journal of Complexity.
On affine scaling algorithms for nonconvex quadratic programming. Mathematical Programming: Series A and B.
Some perturbation theory for linear programming. Mathematical Programming: Series A and B.
Randomized algorithms.
Linear programming, complexity theory and elementary functional analysis. Mathematical Programming: Series A and B.
Condition-Based Complexity of Convex Optimization in Conic Linear Form via the Ellipsoid Algorithm. SIAM Journal on Optimization.
Solving convex programs by random walks. Journal of the ACM (JACM).
A simple polynomial-time rescaling algorithm for solving linear programs. Mathematical Programming: Series A and B.
On the Second-Order Feasibility Cone: Primal-Dual Representation and Efficient Projection. SIAM Journal on Optimization.
Statistical algorithms and a lower bound for detecting planted cliques. Proceedings of the forty-fifth annual ACM symposium on Theory of computing.
The classical perceptron algorithm is an elementary row-action/relaxation algorithm for solving a homogeneous system of linear inequalities Ax > 0. A natural condition measure associated with this algorithm is the Euclidean width τ of the cone of feasible solutions, and the iteration complexity of the perceptron algorithm is bounded by 1/τ² [see Rosenblatt, F. 1962. Principles of Neurodynamics. Spartan Books, Washington, DC]. Dunagan and Vempala [Dunagan, J., S. Vempala. 2007. A simple polynomial-time rescaling algorithm for solving linear programs. Math. Programming 114(1) 101–114] developed a rescaled version of the perceptron algorithm with an improved complexity of O(n ln(1/τ)) iterations (with high probability), which is theoretically efficient in τ and, in particular, runs in polynomial time in the bit-length model. We explore extensions of these perceptron methods to the general homogeneous conic system Ax ∈ int K, where K is a regular convex cone. We provide a conic extension of the rescaled perceptron algorithm based on the notion of a deep-separation oracle for a cone, which essentially computes a certificate of strong separation. We show that the rescaled perceptron algorithm is theoretically efficient whenever an efficient deep-separation oracle is available for the feasible region. In particular, when K is a cross-product of basic cones that are either half-spaces or second-order cones, such an oracle is available, and hence the rescaled perceptron algorithm is theoretically efficient. When the basic cones of K include semidefinite cones, a probabilistic deep-separation oracle for K can be constructed, which also yields a theoretically efficient version of the rescaled perceptron algorithm.
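The classical perceptron iteration described above is simple enough to sketch directly: normalize the rows of A, and whenever the current iterate x violates a constraint (aᵢᵀx ≤ 0), add that row to x. This is a minimal illustrative sketch, not the paper's rescaled variant; the function name, starting point, and iteration cap are choices made here for illustration.

```python
import numpy as np

def perceptron(A, max_iters=10_000):
    """Classical perceptron for the homogeneous system Ax > 0.

    Rows of A are normalized to unit length; each iteration adds a
    violated row to the current iterate (a row-action update). The
    number of updates is bounded by 1/tau^2, where tau is the
    Euclidean width of the feasible cone.
    """
    A = A / np.linalg.norm(A, axis=1, keepdims=True)  # normalize rows
    x = np.zeros(A.shape[1])
    for _ in range(max_iters):
        violated = np.flatnonzero(A @ x <= 0)
        if violated.size == 0:
            return x  # strictly feasible: Ax > 0 componentwise
        x = x + A[violated[0]]  # relax toward the violated half-space
    return None  # iteration budget exhausted (tau may be tiny or zero)
```

As the 1/τ² bound suggests, the update count blows up as the feasible cone narrows, which is precisely what motivates the rescaling idea of Dunagan and Vempala.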