An Efficient Rescaled Perceptron Algorithm for Conic Systems

  • Authors:
  • Alexandre Belloni; Robert M. Freund; Santosh Vempala

  • Affiliations:
  • Fuqua School of Business, Duke University, Durham, North Carolina 27708; Sloan School of Management, Massachusetts Institute of Technology, Cambridge, Massachusetts 02142; College of Computing, Georgia Tech, Atlanta, Georgia 30332

  • Venue:
  • Mathematics of Operations Research
  • Year:
  • 2009

Abstract

The classical perceptron algorithm is an elementary row-action/relaxation algorithm for solving a homogeneous linear inequality system Ax > 0. A natural condition measure associated with this algorithm is the Euclidean width τ of the cone of feasible solutions, and the iteration complexity of the perceptron algorithm is bounded by 1/τ² [see Rosenblatt, F. 1962. Principles of Neurodynamics. Spartan Books, Washington, DC]. Dunagan and Vempala [Dunagan, J., S. Vempala. 2007. A simple polynomial-time rescaling algorithm for solving linear programs. Math. Programming 114(1) 101--114] have developed a rescaled version of the perceptron algorithm with an improved complexity of O(n ln(1/τ)) iterations (with high probability), which is theoretically efficient in τ and, in particular, is polynomial time in the bit-length model. We explore extensions of the concepts of these perceptron methods to the general homogeneous conic system Ax ∈ int K, where K is a regular convex cone. We provide a conic extension of the rescaled perceptron algorithm based on the notion of a deep-separation oracle of a cone, which essentially computes a certificate of strong separation. We show that the rescaled perceptron algorithm is theoretically efficient if an efficient deep-separation oracle is available for the feasible region. Furthermore, when K is the cross-product of basic cones that are either half-spaces or second-order cones, a deep-separation oracle is available and, hence, the rescaled perceptron algorithm is theoretically efficient. When the basic cones of K include semidefinite cones, a probabilistic deep-separation oracle for K can be constructed that also yields a theoretically efficient version of the rescaled perceptron algorithm.
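
For context on the classical method the abstract builds on, the following is a minimal sketch of the classical perceptron (row-action/relaxation) iteration for a homogeneous system Ax > 0; it is not the authors' rescaled conic algorithm. The function name, the iteration budget, and the use of NumPy are illustrative assumptions; the 1/τ² bound on the number of updates is the one cited above.

```python
import numpy as np

def classical_perceptron(A, max_updates=100_000):
    """Relaxation method for the homogeneous system A x > 0 (illustrative sketch).

    While some row a_i satisfies a_i^T x <= 0, add its unit direction to x.
    If the feasibility cone {x : A x > 0} has Euclidean width tau, the number
    of updates is at most 1/tau^2 (the classical bound cited above).
    """
    A_unit = A / np.linalg.norm(A, axis=1, keepdims=True)   # normalize the rows
    x = np.zeros(A.shape[1])
    for _ in range(max_updates):
        violated = np.flatnonzero(A_unit @ x <= 0)          # rows not yet strictly satisfied
        if violated.size == 0:
            return x                                        # A x > 0 holds strictly
        x += A_unit[violated[0]]                            # row-action / relaxation step
    return None                                             # budget exhausted (tau may be tiny or zero)
```

For example, a small feasible instance such as A = np.array([[1.0, 1.0], [1.0, -0.2]]) yields a strictly feasible x after a few updates; when τ is small the update count grows like 1/τ², which is the behavior the rescaled variant discussed in the paper is designed to avoid.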