Asymptotically convergent modified recursive least-squares with data-dependent updating and forgetting factor for systems with bounded noise

  • Authors:
  • Soura Dasgupta; Yih-Fang Huang

  • Affiliations:
  • Univ. of Iowa, Iowa City; Univ. of Notre Dame, Notre Dame, IN

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 1987

Abstract

Continual updating of estimates, as required by most recursive estimation schemes, often involves redundant use of information and may result in system instabilities in the presence of bounded output disturbances. An algorithm that eliminates these difficulties is investigated. Based on a set-theoretic assumption, the algorithm yields modified least-squares estimates with a forgetting factor. It updates the estimates selectively, depending on whether the observed data contain sufficient new information. The information evaluation required at each step involves very simple computations. In addition, the parameter estimates are shown to converge asymptotically, at an exponential rate, to a region around the true parameter.
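
To illustrate the general idea of data-dependent updating combined with a forgetting factor, the following is a minimal sketch of a selective-update recursive least-squares routine. It is not the authors' exact set-membership algorithm; the class name, the simple error-bound test used to decide whether the data are informative, and all parameter names are assumptions made for illustration only.

```python
import numpy as np

class SelectiveRLS:
    """Hypothetical sketch: RLS with a forgetting factor that skips updates
    when the new data appear uninformative (prediction error within the
    assumed noise bound). Not the paper's exact update rule."""

    def __init__(self, n_params, lam=0.98, gamma=0.1, p0=1e3):
        self.theta = np.zeros(n_params)      # parameter estimate
        self.P = p0 * np.eye(n_params)       # inverse-correlation matrix
        self.lam = lam                       # forgetting factor (0 < lam <= 1)
        self.gamma = gamma                   # assumed bound on the output noise

    def update(self, phi, y):
        """phi: regressor vector, y: observed output. Returns True if the
        estimate was updated, False if the data were judged uninformative."""
        e = y - phi @ self.theta             # a priori prediction error
        # Data-dependent test (illustrative): if the error is already
        # explainable by the bounded noise, skip the update entirely.
        if abs(e) <= self.gamma:
            return False
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)   # gain vector
        self.theta = self.theta + k * e
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return True
```

In such a scheme, skipping uninformative updates avoids the redundant use of information mentioned in the abstract, while the forgetting factor keeps the estimator responsive; the paper's actual criterion for "sufficient information" is derived from its set-theoretic (bounded-noise) formulation rather than the simple threshold shown here.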