Compressed learning with regular concept

  • Authors:
  • Jiawei Lv; Jianwen Zhang; Fei Wang; Zheng Wang; Changshui Zhang

  • Affiliations:
  • State Key Laboratory on Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Department of Automation, Tsinghua University, Beijing, China (Jiawei Lv, Jianwen Zhang, Zheng Wang, Changshui Zhang); Department of Statistical Science, Cornell University (Fei Wang)

  • Venue:
  • ALT'10: Proceedings of the 21st International Conference on Algorithmic Learning Theory
  • Year:
  • 2010

Abstract

We revisit compressed learning in the PAC learning framework and derive error bounds for learning halfspace concepts from compressed data. We propose a regularity assumption on the pair of concept and data distribution that substantially generalizes previous assumptions. For a regular concept we define a robust factor that characterizes the margin distribution, and we show that this factor tightly controls the generalization error of the learned classifier. We further extend the analysis to the more general, linearly non-separable case. Empirical results on both synthetic and real-world data validate the analysis.
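
For intuition, here is a minimal sketch of the compressed-learning pipeline the abstract refers to: compress the data with a random Gaussian projection, then learn a halfspace directly in the low-dimensional compressed domain. The dimensions, the synthetic data, and the logistic-regression learner standing in for a generic halfspace learner are all illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic linearly separable data in the original high-dimensional space
# (hypothetical parameters chosen only for illustration).
n, d, m = 2000, 500, 50                 # samples, ambient dim, compressed dim
w_true = rng.standard_normal(d)         # the unknown halfspace concept
X = rng.standard_normal((n, d))
y = np.sign(X @ w_true)

# Compressed learning: project with a random Gaussian measurement matrix
# (Johnson-Lindenstrauss style), then train the classifier on the
# m-dimensional compressed data instead of the original d-dimensional data.
A = rng.standard_normal((m, d)) / np.sqrt(m)
X_c = X @ A.T

Xtr, Xte, ytr, yte = train_test_split(X_c, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print(f"accuracy on compressed data (d={d} -> m={m}): {clf.score(Xte, yte):.3f}")
```

In this setting, the kind of bound the paper studies says that the accuracy lost by learning in the compressed domain is governed by how the margins of the original data are distributed: the better the margin distribution (the larger the robust factor), the less the random projection degrades generalization.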