IEEE Transactions on Knowledge and Data Engineering
We revisit compressed learning in the PAC learning framework. Specifically, we derive error bounds for learning halfspace concepts from compressed data. We propose a regularity assumption on a pair of a concept and a data distribution that substantially generalizes earlier assumptions. For a regular concept we define a robustness factor that characterizes the margin distribution, and we show that this factor tightly controls the generalization error of the learned classifier. Moreover, we extend the analysis to the more general linearly non-separable case. Empirical results on both synthetic and real-world data validate our analysis.