Bound the learning rates with generalized gradients

  • Authors:
  • Sheng Baohuai; Xiang Daohong

  • Affiliations:
  • Department of Mathematics, Shaoxing College of Arts and Sciences, Shaoxing, Zhejiang, P.R. China; Department of Mathematics, Zhejiang Normal University, Jinhua, Zhejiang, P.R. China

  • Venue:
  • WSEAS Transactions on Signal Processing
  • Year:
  • 2012

Abstract

This paper considers the error bounds for the coefficient regularized regression schemes associated with Lipschitz loss. Our main goal is to study the convergence rates for this algorithm with non-smooth analysis. We give an explicit expression of the solution with generalized gradients of the loss which induces a capacity independent bound for the sample error. A kind of approximation error is provided with possibility theory.