Robust recursive TLS (total least squares) method using regularized UDU decomposition for FNN (feedforward neural network) training

  • Authors: JunSeok Lim; Nakjin Choi; KoengMo Sung

  • Affiliations: Department of Electronics Engineering, Sejong University, Seoul, Korea (Lim); School of Electrical Engineering and Computer Science, Seoul National University (Choi, Sung)

  • Venue: ISNN'05 Proceedings of the Second International Conference on Advances in Neural Networks, Volume Part I
  • Year: 2005

Abstract

We present a robust recursive total least squares (RRTLS) algorithm for training multilayer feed-forward neural networks. Recursive least squares (RLS) has been applied successfully to this task; however, when the input data contain additive noise, the RLS estimates can be biased. In theory, this bias can be avoided with the recursive total least squares (RTLS) algorithm based on the power method. The power method, however, relies on a rank-1 update and is therefore prone to ill-conditioning. In this paper we propose a robust RTLS algorithm using regularized UDU factorization. This method outperforms RLS-based training over a wide range of SNRs.
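To illustrate the errors-in-variables setting the abstract describes, the following is a minimal sketch, not the authors' algorithm: it tracks the eigenvector of the augmented input-output correlation matrix belonging to its smallest eigenvalue via a rank-1 update and a regularized inverse-iteration step (a simple stand-in for the paper's regularized UDU factorization). All names, the forgetting factor `lam`, and the regularizer `delta` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch only (NOT the paper's exact method):
# recursive TLS for a single linear layer with noise on input AND output.
n, lam, delta = 3, 0.995, 1e-3           # input size, forgetting factor, regularizer (assumed values)
w_true = np.array([0.5, -1.0, 2.0])      # synthetic ground-truth weights

R = np.eye(n + 1)                        # augmented correlation matrix estimate
v = np.ones(n + 1) / np.sqrt(n + 1)      # running estimate of the minor eigenvector

for _ in range(2000):
    x = rng.standard_normal(n)
    d = w_true @ x
    # additive noise on both input and output -> errors-in-variables model
    z = np.concatenate([x + 0.05 * rng.standard_normal(n),
                        [d + 0.05 * rng.standard_normal()]])
    R = lam * R + np.outer(z, z)         # rank-1 update of the correlation matrix
    # one regularized inverse-power-iteration step toward the smallest eigenvector
    v = np.linalg.solve(R + delta * np.eye(n + 1), v)
    v /= np.linalg.norm(v)

w_tls = -v[:-1] / v[-1]                  # TLS weight estimate from the minor eigenvector
print(w_tls)                             # close to w_true despite noisy inputs
```

Because the minor eigenvector of the augmented correlation matrix is insensitive to equal noise on inputs and outputs, `w_tls` stays near `w_true`, whereas an ordinary RLS fit on the same noisy inputs would be biased toward smaller weights.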