Moving least-square method in learning theory

  • Authors:
  • Hong-Yan Wang, Dao-Hong Xiang, Ding-Xuan Zhou

  • Affiliations:
  • School of Statistics and Mathematics, Zhejiang Gongshang University, Hangzhou, Zhejiang 310018, China; Department of Mathematics, Chinese University of Hong Kong, Shatin, N. T., Hong Kong, China; Department of Mathematics, City University of Hong Kong, Tat Chee Avenue, Kowloon, Hong Kong, China

  • Venue:
  • Journal of Approximation Theory
  • Year:
  • 2010

Abstract

The moving least-square (MLS) method is an approximation technique used in data interpolation, numerical analysis, and statistics. In this paper we consider the MLS method in learning theory for the regression problem. Essential differences between MLS and other common learning algorithms are pointed out: the lack of a natural uniform bound for the estimators, and the pointwise way in which the estimator is defined. The sample error is estimated in terms of the weight function and the finite-dimensional hypothesis space. The approximation error is handled in two special cases, for which convergence rates are provided for the total L^2 error measuring the global approximation on the whole domain.
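For orientation, the MLS regression estimator is defined pointwise. In a standard formulation (generic notation; the paper's exact normalization and choice of hypothesis space may differ), given a sample z = {(x_i, y_i)}_{i=1}^m, a finite-dimensional hypothesis space H, a weight function w, and a scale parameter sigma > 0:

```latex
\[
  f_z(x) = f_{z,x}(x), \qquad
  f_{z,x} = \arg\min_{f \in H} \frac{1}{m} \sum_{i=1}^{m}
    w\!\Big(\frac{x - x_i}{\sigma}\Big)\,\big(f(x_i) - y_i\big)^2 .
\]
```

Because a separate weighted least-squares problem is solved at each query point x and its minimizer is unconstrained in H, the resulting estimator carries no built-in uniform bound, which is the difficulty the abstract highlights. A minimal runnable sketch of this pointwise fit, assuming a Gaussian weight and a univariate polynomial hypothesis space (both illustrative choices, not taken from the paper):

```python
import numpy as np

def mls_predict(x_query, x_sample, y_sample, sigma=0.3, degree=1):
    """Pointwise MLS estimate at x_query: weighted least squares over
    polynomials of the given degree, with Gaussian weights centered at
    x_query. Generic sketch; the weight function and hypothesis space
    are free parameters of the analysis in the paper."""
    # Weights w((x - x_i)/sigma) with a Gaussian profile (illustrative choice).
    w = np.exp(-((x_sample - x_query) / sigma) ** 2)
    # Polynomial basis; columns are x^degree, ..., x, 1.
    basis = np.vander(x_sample, degree + 1)
    # Solve min_f sum_i w_i (f(x_i) - y_i)^2 via sqrt-weighted least squares.
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(basis * sw[:, None], y_sample * sw, rcond=None)
    # Evaluate the locally fitted polynomial at the query point only.
    return np.polyval(coef, x_query)

# Usage: noisy samples of sin(2*pi*x); the estimate at x = 0.5 should be
# near sin(pi) = 0 up to noise.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)
print(mls_predict(0.5, x, y))
```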