A new polynomial-time algorithm for linear programming. Combinatorica.
Learning linear threshold functions in the presence of classification noise. COLT '94: Proceedings of the Seventh Annual Conference on Computational Learning Theory.
Some perturbation theory for linear programming. Mathematical Programming, Series A and B.
Linear programming, complexity theory and elementary functional analysis. Mathematical Programming, Series A and B.
Large Margin Classification Using the Perceptron Algorithm. Machine Learning: The Eleventh Annual Conference on Computational Learning Theory.
Condition-Based Complexity of Convex Optimization in Conic Linear Form via the Ellipsoid Algorithm. SIAM Journal on Optimization.
Solving convex programs by random walks. Journal of the ACM (JACM).
Perceptrons: An Introduction to Computational Geometry.
A simple polynomial-time rescaling algorithm for solving linear programs. Mathematical Programming, Series A and B.
The classical perceptron algorithm is an elementary algorithm for solving a homogeneous linear inequality system Ax > 0, with many important applications in learning theory (e.g., [11,8]). A natural condition measure associated with this algorithm is the Euclidean width τ of the cone of feasible solutions, and the iteration complexity of the perceptron algorithm is bounded by 1/τ². Dunagan and Vempala [5] have developed a re-scaled version of the perceptron algorithm with an improved complexity of O(n ln(1/τ)) iterations (with high probability), which is theoretically efficient in τ, and in particular is polynomial-time in the bit-length model. We explore extensions of the concepts of these perceptron methods to the general homogeneous conic system Ax ∈ int K, where K is a regular convex cone. We provide a conic extension of the re-scaled perceptron algorithm based on the notion of a deep-separation oracle of a cone, which essentially computes a certificate of strong separation. We give a general condition under which the re-scaled perceptron algorithm is theoretically efficient, i.e., polynomial-time; this includes the cases when K is the cross-product of half-spaces, second-order cones, and the positive semi-definite cone.
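The classical perceptron iteration mentioned above is simple to state: while some row of A has non-positive inner product with the current iterate, add that (normalized) row to the iterate. A minimal sketch of this basic method for Ax > 0 follows; the function name and iteration budget are illustrative, not from the paper, and this is the classical variant, not the re-scaled one.

```python
import numpy as np

def perceptron_homogeneous(A, max_iters=10000):
    """Classical perceptron for the homogeneous system A @ x > 0.

    Repeatedly adds a normalized violated row to the iterate. The
    iteration count needed is bounded by 1/tau^2, where tau is the
    Euclidean width of the feasibility cone {x : A x > 0}.
    """
    m, n = A.shape
    rows = A / np.linalg.norm(A, axis=1, keepdims=True)  # unit rows
    x = np.zeros(n)
    for _ in range(max_iters):
        violated = np.flatnonzero(rows @ x <= 0)  # constraints not strictly satisfied
        if violated.size == 0:
            return x  # strictly feasible: A @ x > 0 componentwise
        x = x + rows[violated[0]]  # perceptron update on one violated row
    return None  # budget exhausted; the cone may be narrow (small tau)
```

The 1/τ² bound makes the cost of this variant blow up as the feasible cone narrows; the re-scaled version of [5] avoids this by periodically transforming the system so that the cone widens, giving the O(n ln(1/τ)) iteration bound.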