An improved algorithm for learning long-term dependency problems in adaptive processing of data structures

  • Authors:
  • Siu-Yeung Cho;Zheru Chi;Wan-Chi Siu;Ah Chung Tsoi

  • Affiliations:
  • Dept. of Electron. & Inf. Eng., Hong Kong Polytech. Univ., Kowloon, China

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2003


Abstract

Many researchers have explored the use of neural-network representations for the adaptive processing of data structures. One of the most popular learning formulations for data structure processing is backpropagation through structure (BPTS). The BPTS algorithm has been successfully applied to a number of learning tasks that involve structural patterns, such as logo and natural scene classification. The main limitations of the BPTS algorithm are its slow convergence and the long-term dependency problem in the adaptive processing of data structures. In this paper, an improved algorithm is proposed to address these problems. The idea of this algorithm is to optimize the free learning parameters of the neural network in the node representation using least-squares-based optimization methods in a layer-by-layer fashion. Not only is fast convergence achieved, but the long-term dependency problem is also overcome, since the vanishing of gradient information is avoided when our approach is applied to very deep tree structures.
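
A minimal sketch of the general idea described in the abstract, not the authors' exact formulation: a recursive network computes a fixed-size state for every tree node from its children (the bottom-up BPTS-style forward pass), and the output-layer weights are then fitted by a closed-form linear least-squares solve instead of gradient descent, so no gradient has to be propagated through deep tree structures for that layer. The names (Node, encode, fit_output_layer), the single-hidden-layer setup, and the dimensions are illustrative assumptions.

# Layer-by-layer least-squares sketch for a recursive (folding) network on trees.
# All names and dimensions below are assumptions for illustration only.
import numpy as np

state_dim, attr_dim, max_children, out_dim = 8, 4, 2, 3
rng = np.random.default_rng(0)
# Fixed (or separately trained) node-encoding weights for the hidden layer.
W_in = rng.normal(scale=0.3, size=(state_dim, attr_dim + max_children * state_dim))

class Node:
    def __init__(self, attributes, children=()):
        self.attributes = np.asarray(attributes, dtype=float)
        self.children = list(children)

def encode(node):
    """Recursively compute a node state from its children's states (bottom-up pass)."""
    child_states = [encode(c) for c in node.children]
    # Pad missing children with zero states so every node sees a fixed-size input.
    child_states += [np.zeros(state_dim)] * (max_children - len(child_states))
    x = np.concatenate([node.attributes, *child_states])
    return np.tanh(W_in @ x)

def fit_output_layer(roots, targets):
    """Fit output weights in closed form: minimize ||H W^T - Y||^2 by least squares."""
    H = np.stack([encode(r) for r in roots])        # one root state per tree
    Y = np.asarray(targets, dtype=float)            # one target row per tree
    W, *_ = np.linalg.lstsq(H, Y, rcond=None)       # closed-form solve, no backprop through the tree
    return W.T                                      # shape (out_dim, state_dim)

# Usage: two tiny trees with 4-dimensional node attributes and one-hot class targets.
leaf = Node(rng.normal(size=attr_dim))
tree1 = Node(rng.normal(size=attr_dim), [leaf, Node(rng.normal(size=attr_dim))])
tree2 = Node(rng.normal(size=attr_dim), [Node(rng.normal(size=attr_dim))])
W_out = fit_output_layer([tree1, tree2], [[1, 0, 0], [0, 1, 0]])
print(W_out @ encode(tree1))   # class scores for the first tree

Because the output weights are obtained from a single linear solve on the collected root states, the update for that layer does not depend on gradients flowing back through many levels of the tree, which is the intuition behind avoiding the vanishing-gradient aspect of the long-term dependency problem.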