Online Gradient Descent Learning Algorithms

  • Authors:
  • Yiming Ying; Massimiliano Pontil

  • Affiliations:
  • Department of Computer Science, University College London, Gower Street, London, WC1E 6BT, UK (both authors)

  • Venue:
  • Foundations of Computational Mathematics
  • Year:
  • 2008

Abstract

This paper considers the least-squares online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without an explicit regularization term. We present a novel capacity-independent approach to deriving error bounds and convergence results for this algorithm. The essential element in our analysis is the interplay between the generalization error and a weighted cumulative error, which we define in the paper. We show that, although the algorithm does not involve an explicit RKHS regularization term, choosing the step sizes appropriately yields error rates competitive with those in the literature.
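The abstract does not spell out the update rule itself. A minimal sketch of unregularized least-squares online gradient descent in an RKHS, assuming a Gaussian kernel and a polynomially decaying step size eta_t = eta0 * t^(-theta) (the kernel, schedule, and all names below are illustrative assumptions, not the paper's specification), might look like this:

```python
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    """Gaussian (RBF) kernel; any Mercer kernel defining the RKHS would do."""
    return np.exp(-np.linalg.norm(x - z) ** 2 / (2 * sigma ** 2))

def online_gradient_descent(stream, kernel=gaussian_kernel, theta=0.5, eta0=1.0):
    """Unregularized online gradient descent for least squares in an RKHS.

    The hypothesis f_t is stored as a kernel expansion over the examples seen
    so far.  Each step applies
        f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K_{x_t},
    which appends one new coefficient per example.  The decaying step sizes
    eta_t = eta0 * t^(-theta) are illustrative; their choice is what governs
    the error rates discussed in the paper.
    """
    xs, coeffs = [], []
    for t, (x_t, y_t) in enumerate(stream, start=1):
        # Evaluate the current hypothesis f_t(x_t) via its kernel expansion.
        f_xt = sum(c * kernel(x, x_t) for c, x in zip(coeffs, xs))
        eta_t = eta0 * t ** (-theta)
        # The RKHS gradient of the pointwise squared loss is
        # (f_t(x_t) - y_t) * K_{x_t}; no regularization term is added.
        xs.append(x_t)
        coeffs.append(-eta_t * (f_xt - y_t))

    def predict(x):
        """Final hypothesis as a kernel expansion over the seen inputs."""
        return sum(c * kernel(xi, x) for c, xi in zip(coeffs, xs))

    return predict

# Toy usage: learn a noisy sine function from a stream of 200 examples.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stream = [(x, np.sin(3 * x[0]) + 0.1 * rng.normal())
              for x in rng.uniform(-1, 1, size=(200, 2))]
    f = online_gradient_descent(iter(stream))
    print(f(np.array([0.3, -0.2])))
```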