Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion

  • Authors:
  • Shun Gokita; Masashi Sugiyama; Keisuke Sakurai

  • Venue:
  • IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
  • Year:
  • 2007

Abstract

In order to obtain better learning results in supervised learning, it is important to choose model parameters appropriately. Model selection is usually carried out by preparing a finite set of model candidates, estimating the generalization error for each candidate, and choosing the best one among them. Increasing the number of candidates in this procedure may improve the optimization quality, but it also increases the computational cost. In this paper, we focus on a generalization error estimator called the regularized subspace information criterion and derive an analytic form of the optimal model parameter over a set of infinitely many model candidates. This allows us to maximize the optimization quality while keeping the computational cost moderate.
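To make the finite-candidate procedure described in the abstract concrete, below is a minimal sketch in a standard ridge-regression setting. The regularized subspace information criterion itself is not reproduced here (its formula is not given in this abstract); the closed-form leave-one-out error of ridge regression is used purely as a stand-in generalization-error estimator, and all function names and the synthetic data are illustrative assumptions. The paper's contribution is to replace this grid search with an analytically derived optimum over infinitely many candidate parameters.

```python
# Illustrative sketch only: leave-one-out error stands in for a
# generalization-error estimator such as RSIC, which is not specified here.
import numpy as np


def ridge_hat_matrix(X, lam):
    """Hat matrix H(lam) = X (X^T X + lam I)^{-1} X^T of ridge regression."""
    d = X.shape[1]
    return X @ np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)


def loo_error(X, y, lam):
    """Closed-form leave-one-out squared error for ridge regression."""
    H = ridge_hat_matrix(X, lam)
    residuals = (y - H @ y) / (1.0 - np.diag(H))
    return float(np.mean(residuals ** 2))


def select_ridge_parameter(X, y, candidates):
    """Finite-grid model selection: estimate the error for each candidate
    ridge parameter and return the one with the smallest estimate."""
    scores = [loo_error(X, y, lam) for lam in candidates]
    return candidates[int(np.argmin(scores))], scores


# Usage example with synthetic data (hypothetical, for illustration).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
y = X @ rng.standard_normal(10) + 0.3 * rng.standard_normal(50)
grid = np.logspace(-4, 2, 25)  # the finite set of model candidates
best_lam, _ = select_ridge_parameter(X, y, grid)
print(f"selected ridge parameter: {best_lam:.4g}")
```

A finer grid improves the chance of landing near the true optimum but multiplies the number of estimator evaluations, which is exactly the trade-off the analytic optimization in the paper is meant to remove.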