Guaranteed two-pass convergence for supervised and inferential learning

  • Authors:
  • M. J. Healy; T. P. Caudell

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Univ. of New Mexico, Albuquerque, NM

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1998


Abstract

We present a theoretical analysis of a version of the LAPART adaptive inferencing neural network. Our main result is a proof that the new architecture, called LAPART 2, converges in two passes through a fixed training set of inputs. We also prove that it does not suffer from template proliferation. For comparison, Georgiopoulos et al. (1994) proved an upper bound of n-1 on the number of passes required for convergence of the ARTMAP architecture, where n is the size of the binary pattern input space. If the ARTMAP result is regarded as an n-pass, or finite-pass, convergence result, ours is then a two-pass, or fixed-pass, convergence result. Our results have added significance in that they apply to set-valued mappings, as opposed to the usual supervised learning model of affixing labels to classes.
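
The distinction the abstract draws — a fixed two-pass guarantee versus a data-dependent finite-pass bound — can be made concrete with a toy sketch of the pass-counting convergence criterion. The following Python snippet is purely illustrative and is not the LAPART 2 architecture: it uses a hypothetical memorizing learner whose "templates" are stored binary patterns, and counts full passes over a fixed training set until a pass produces no changes (the stable pass that certifies convergence).

```python
# Hypothetical sketch of pass-based convergence, NOT the LAPART 2 algorithm.
# A toy learner memorizes binary patterns as templates; training repeats over
# a fixed set until one full pass changes nothing, and we count the passes.

def train_one_pass(templates: set, training_set: list) -> bool:
    """Present every input once; return True if any template changed."""
    changed = False
    for pattern in training_set:
        if pattern not in templates:   # toy learning rule: memorize new patterns
            templates.add(pattern)
            changed = True
    return changed

def passes_to_convergence(training_set: list) -> int:
    """Count full passes until a pass produces no template changes."""
    templates: set = set()
    passes = 0
    while True:
        passes += 1
        if not train_one_pass(templates, training_set):
            break                      # stable pass: the learner has converged
    return passes

if __name__ == "__main__":
    data = [(1, 0, 1), (0, 1, 1), (1, 0, 1)]   # fixed binary training set
    print(passes_to_convergence(data))          # toy learner stabilizes by pass 2
```

For this toy memorizer the count is always two for nonempty data: the first pass absorbs every pattern and the second verifies stability. The abstract's claim is of the same fixed-pass form for LAPART 2, whereas the cited ARTMAP bound grows with the size n of the binary input space.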