The minimum L-complexity algorithm and its applications to learning non-parametric rules

  • Authors:
  • Kenji Yamanishi

  • Affiliations:
  • NEC Research Institute, Inc., 4 Independence Way, Princeton, NJ

  • Venue:
  • COLT '94: Proceedings of the Seventh Annual Conference on Computational Learning Theory
  • Year:
  • 1994


Abstract

This paper proposes the minimum L-complexity algorithm (MLC), which can be thought of as an extension of the minimum description length (MDL) principle-based algorithm to the case where general real-valued functions are used as hypotheses and general loss functions are used as distortion measures. MLC is also closely related to Barron's complexity regularization algorithm and Vapnik's structural risk minimization. We demonstrate the effectiveness of MLC in terms of sample complexity within the decision-theoretic PAC learning model. Specifically, using MLC, we develop a unifying method for deriving upper bounds on target-dependent (non-uniform) sample complexity for both parametric and non-parametric settings. We further introduce a method for evaluating average-case sample complexity, where the average is taken with respect to a prior probability over the parametric target class. These target-dependent and average-case sample complexity bounds offer a new view of sample complexity analysis, whereas most previous work has focused on worst-case sample complexity. As applications of MLC, we consider the issue of learning non-parametric rules in terms of 1) stochastic rules with finite partitioning, 2) finite Hermite series, and 3) finite Fourier series. We use MLC to improve the previously known best sample complexity results for these problems.
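
The core idea the abstract describes is a penalized empirical-loss criterion: among candidate real-valued hypotheses, choose the one that best trades off fit under a general loss function against a description-length-style complexity term. The sketch below illustrates that selection rule only in schematic form; the function `mlc_select`, the `complexities` code lengths, and the `lam` trade-off weight are illustrative assumptions, not the paper's exact criterion.

```python
def mlc_select(hypotheses, complexities, data, loss, lam=1.0):
    """Hypothetical MDL/MLC-style selection rule (a sketch, not the
    paper's exact criterion): pick the hypothesis minimizing empirical
    loss plus a description-length penalty.

    hypotheses   -- callables f(x) returning a real-valued prediction
    complexities -- assumed code length assigned to each hypothesis
    data         -- list of (x, y) examples
    loss         -- general distortion measure loss(y, prediction)
    lam          -- assumed trade-off weight between fit and complexity
    """
    m = len(data)
    best_f, best_score = None, float("inf")
    for f, c in zip(hypotheses, complexities):
        empirical = sum(loss(y, f(x)) for x, y in data) / m
        score = empirical + lam * c / m  # penalized empirical risk
        if score < best_score:
            best_f, best_score = f, score
    return best_f, best_score


# Example: real-valued hypotheses under squared loss. Using a general
# loss rather than log-loss is what distinguishes this setting from
# classical MDL model selection.
if __name__ == "__main__":
    import math

    data = [(x / 10.0, math.sin(x / 10.0)) for x in range(20)]
    hypotheses = [lambda x: 0.0, lambda x: x, lambda x: x - x**3 / 6]
    complexities = [1.0, 2.0, 4.0]  # assumed: richer rules cost more bits
    squared = lambda y, p: (y - p) ** 2
    f_hat, score = mlc_select(hypotheses, complexities, data, squared)
```

The complexity term divided by the sample size `m` mirrors the usual MDL behavior: as more data arrive, the penalty's influence shrinks and richer hypotheses can be selected, which is the mechanism behind the target-dependent (non-uniform) sample complexity bounds the abstract mentions.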