Universal learning using free multivariate splines

  • Authors:
  • Yunwen Lei; Lixin Ding; Weili Wu

  • Venue:
  • Neurocomputing
  • Year:
  • 2013

Quantified Score

Hi-index 0.01

Abstract

This paper discusses the problem of universal learning using free multivariate splines of order 1. Universal means that the learning algorithm does not rely on any a priori assumption about the regularity of the target function. We characterize the complexity of the space of free multivariate splines via the notion of Rademacher complexity, based on which a penalized empirical risk is constructed as an estimate of the expected risk of the candidate model. Our Rademacher complexity bounds are tight up to a logarithmic factor. It is shown that the prediction rule minimizing the penalized empirical risk achieves a favorable balance between the approximation and estimation errors. By resorting to powerful techniques from approximation theory to bound the approximation error, we also derive bounds on the generalization error in terms of the sample size, for a large class of loss functions.
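
The display below is only a sketch of the penalized empirical risk minimization scheme the abstract describes, not a formula taken from the paper; the notation S_m for the spline classes, l for the loss, and the empirical Rademacher complexity as the penalty are assumptions made here for illustration:

$$\hat{f} \;=\; \arg\min_{m}\,\min_{f \in \mathcal{S}_m}\Big\{ \tfrac{1}{n}\sum_{i=1}^{n} \ell\big(f(X_i), Y_i\big) \;+\; \mathrm{pen}_n(m) \Big\}, \qquad \mathrm{pen}_n(m) \;\asymp\; \hat{\mathcal{R}}_n\big(\ell \circ \mathcal{S}_m\big),$$

so that, under such assumptions, the excess risk of the selected rule is controlled, up to constants and logarithmic factors, by the best trade-off over models:

$$\mathbb{E}\,R(\hat{f}) - \inf_{f} R(f) \;\lesssim\; \min_{m}\Big\{ \underbrace{\inf_{f\in\mathcal{S}_m} R(f) - \inf_{f} R(f)}_{\text{approximation error}} \;+\; \underbrace{\mathrm{pen}_n(m)}_{\text{estimation error}} \Big\}.$$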