Orthogonal Shrinkage Methods for Nonparametric Regression under Gaussian Noise

  • Authors:
  • Katsuyuki Hagiwara

  • Affiliations:
  • Faculty of Education, Mie University, Tsu 514-8507, Japan

  • Venue:
  • Neural Information Processing
  • Year:
  • 2007

Abstract

In this article, we propose shrinkage methods for regression problems, formulated as training a machine under a regularized cost function. The machine considered here is a linear combination of fixed basis functions, in which the number of basis functions, and hence the number of adjustable weights, equals the number of training data; this setting can be viewed as nonparametric regression in statistics. In the regularized cost function, the error term is the sum of squared errors and the regularization term is a quadratic form of the weight vector. Assuming i.i.d. Gaussian noise, we propose three thresholding methods for the orthogonal components obtained by eigendecomposition of the Gram matrix of the basis-function output vectors. The final weight values, obtained by a linear transformation of the thresholded orthogonal components, are shrinkage estimators. The proposed methods are simple and automatic: it suffices to fix the regularization parameter to a small constant. Simple numerical experiments showed that, compared with the leave-one-out cross-validation method, the proposed methods have substantially lower computational cost, and the generalization ability of the trained machines is comparable when the number of training data is relatively large.
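
To make the pipeline described in the abstract concrete, the following Python/NumPy sketch implements one plausible reading of it: fit in the eigenbasis of the Gram matrix, threshold the orthogonal components, and map back to the weights. The function name `orthogonal_shrinkage_fit`, the default values of `lam` and `tau`, the plain ridge penalty standing in for the paper's quadratic regularizer, and the hard-thresholding rule are all assumptions for illustration; the paper itself proposes three specific thresholding methods whose forms are not given in the abstract.

```python
import numpy as np

def orthogonal_shrinkage_fit(Phi, y, lam=1e-6, tau=0.1):
    """Hypothetical sketch of the orthogonal-shrinkage pipeline.

    Phi : (n, n) matrix; column j stacks the outputs of the j-th fixed
          basis function at the n training inputs, so the number of
          basis functions equals the number of training data.
    lam : regularization parameter, fixed to a small constant as the
          abstract suggests suffices.
    tau : threshold level (assumed; the paper's three thresholding
          rules are replaced here by a simple hard threshold).
    """
    # Gram matrix of the basis-function output vectors.
    G = Phi.T @ Phi
    # Eigendecomposition yields the orthogonal directions.
    evals, U = np.linalg.eigh(G)
    # Regularized least-squares solution expressed in the eigenbasis:
    # component j is u_j' Phi' y / (eigenvalue_j + lam).
    c = (U.T @ (Phi.T @ y)) / (evals + lam)
    # Threshold the orthogonal components (hard thresholding here).
    c_thr = np.where(np.abs(c) > tau, c, 0.0)
    # Linear transformation back to the weights: a shrinkage estimator.
    return U @ c_thr

# Toy usage: Gaussian kernel basis centered at the training inputs,
# so there are as many basis functions as data points.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
Phi = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 0.05 ** 2))
w = orthogonal_shrinkage_fit(Phi, y)
y_hat = Phi @ w  # fitted values at the training inputs
```

Because the eigendecomposition is computed once and each candidate component is kept or discarded independently, a sketch like this costs a single O(n^3) factorization, which is consistent with the abstract's claim of much lower cost than leave-one-out cross-validation over a grid of regularization parameters.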