Generalized RLS approach to the training of neural networks

  • Authors:
  • Yong Xu; Kwok-Wo Wong; Chi-Sing Leung

  • Affiliations:
  • City University of Hong Kong, China

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2006

Abstract

Recursive least squares (RLS) is an efficient approach to neural network training. However, the classical RLS algorithm has no explicit decay term in its energy function, which leads to unsatisfactory generalization ability in the trained networks. In this paper, we propose a generalized RLS (GRLS) model that includes a general decay term in the energy function for the training of feedforward neural networks. In particular, four different weight decay functions, namely, the quadratic weight decay, the constant weight decay, and the newly proposed multimodal and quartic weight decays, are discussed. By using the GRLS approach, not only is the generalization ability of the trained networks significantly improved, but also more unnecessary weights are pruned, yielding a compact network. Furthermore, the computational complexity of the GRLS remains the same as that of the standard RLS algorithm. The advantages and tradeoffs of the different decay functions are analyzed and then demonstrated with examples. Simulation results show that our approach meets the design goals: improving the generalization ability of the trained network while obtaining a compact network.
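The abstract does not give the GRLS update equations, but the flavor of the idea can be sketched. The Python sketch below is a hypothetical illustration, not the paper's exact derivation: it applies standard RLS to a linear-in-weights model and then adds an extra decay-gradient correction scaled by the inverse-correlation matrix P. The function names grls_train and decay_grad, the placement of the decay step, and all parameter values are assumptions. Note that the per-step cost stays O(n^2), consistent with the abstract's claim that GRLS matches the complexity of standard RLS.

```python
import numpy as np

def grls_train(X, d, decay_grad, alpha=1e-3, lam=0.99, delta=1e2):
    """RLS training of a linear-in-weights model with a general weight decay.

    X          : (T, n) input vectors (e.g., hidden-layer outputs)
    d          : (T,) target values
    decay_grad : gradient of the decay function E_d(w); for quadratic
                 decay E_d(w) = ||w||^2 / 2 this is simply w
                 (hypothetical interface, assumed for illustration)
    """
    T, n = X.shape
    w = np.zeros(n)
    P = np.eye(n) / delta              # initial inverse-correlation estimate
    for t in range(T):
        x = X[t]
        Px = P @ x
        k = Px / (lam + x @ Px)        # gain vector
        e = d[t] - w @ x               # a priori prediction error
        w = w + k * e                  # standard RLS weight update
        P = (P - np.outer(k, Px)) / lam
        # Assumed decay correction: a P-scaled gradient step on the
        # decay term of the energy function (not the paper's exact form).
        w = w - alpha * (P @ decay_grad(w))
    return w

if __name__ == "__main__":
    # Toy regression with a sparse true weight vector.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    true_w = np.array([1.0, -2.0, 0.0, 0.0, 0.5])
    d = X @ true_w + 0.05 * rng.normal(size=500)
    w = grls_train(X, d, decay_grad=lambda w: w)  # quadratic decay
    print(np.round(w, 3))
```

Swapping the decay_grad argument changes the decay behavior without altering the recursion itself, e.g. lambda w: np.sign(w) for a constant decay, which pushes small weights toward zero more aggressively and so favors pruning.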