Noise conditions for prespecified convergence rates of stochastic approximation algorithms

  • Authors:
  • E. K. P. Chong; I-Jeng Wang; S. R. Kulkarni

  • Affiliations:
  • School of Electrical Engineering, Purdue University, West Lafayette, IN

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

We develop deterministic necessary and sufficient conditions on individual noise sequences of a stochastic approximation algorithm for the error of the iterates to converge at a given rate. Specifically, suppose $\{\rho_n\}$ is a given positive sequence converging monotonically to zero. Consider a stochastic approximation algorithm $x_{n+1} = x_n - a_n(A_n x_n - b_n) + a_n e_n$, where $\{x_n\}$ is the iterate sequence, $\{a_n\}$ is the step-size sequence, $\{e_n\}$ is the noise sequence, and $x^*$ is the desired zero of the function $f(x) = Ax - b$. Then, under appropriate assumptions, we show that $x_n - x^* = o(\rho_n)$ if and only if the sequence $\{e_n\}$ satisfies one of five equivalent conditions. These conditions are based on well-known formulas for noise sequences: Kushner and Clark's (1978) condition, Chen's (see Proc. IFAC World Congr., p. 375-80, 1996) condition, Kulkarni and Horn's (see IEEE Trans. Automat. Contr., vol. 41, p. 419-24, 1996) condition, a decomposition condition, and a weighted averaging condition. Our necessary and sufficient condition on $\{e_n\}$ to achieve a convergence rate of $\{\rho_n\}$ is essentially that the sequence $\{e_n/\rho_n\}$ satisfies any one of the five well-known conditions above. We provide examples to illustrate our result. In particular, we easily recover the familiar result that if $a_n = a/n$ and $\{e_n\}$ is a martingale difference process with bounded variance, then $x_n - x^* = o(n^{-1/2}(\log n)^\beta)$ for any $\beta > 1/2$.
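
The following is a minimal numerical sketch (not from the paper) of the recursion in the abstract, specialized to the scalar case with constant $A_n \equiv A$ and $b_n \equiv b$, step size $a_n = a/n$, and i.i.d. Gaussian noise as the martingale difference sequence. The particular values of A, b, a, and beta are illustrative assumptions; the printed ratio $|x_n - x^*|/\rho_n$ should drift toward zero if $x_n - x^* = o(\rho_n)$ with $\rho_n = n^{-1/2}(\log n)^\beta$, $\beta > 1/2$.

```python
# Sketch of x_{n+1} = x_n - a_n (A x_n - b) + a_n e_n with a_n = a/n.
# A, b, a, beta are hypothetical values chosen for illustration only.
import numpy as np

rng = np.random.default_rng(0)

A, b = 1.0, 2.0          # f(x) = A x - b, so the zero is x* = b / A
x_star = b / A
a, beta = 1.0, 0.75      # step size a_n = a/n; any beta > 1/2 qualifies

x = 0.0                  # arbitrary initial iterate
N = 10**6
checkpoints = {10**k for k in range(2, 7)}

for n in range(1, N + 1):
    a_n = a / n
    e_n = rng.normal()   # martingale difference noise, bounded variance
    x = x - a_n * (A * x - b) + a_n * e_n
    if n in checkpoints:
        rho_n = n ** -0.5 * np.log(n) ** beta
        # If x_n - x* = o(rho_n), this ratio should tend to 0.
        print(f"n={n:>8d}  |x_n - x*| / rho_n = {abs(x - x_star) / rho_n:.4f}")
```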