A Fast Simplified Fuzzy ARTMAP Network
Neural Processing Letters
We present a theoretical analysis of a version of the LAPART adaptive inferencing neural network. Our main result is a proof that the new architecture, called LAPART 2, converges in two passes through a fixed training set of inputs. We also prove that it does not suffer from template proliferation. For comparison, Georgiopoulos et al. (1994) proved an upper bound of n-1 on the number of passes required for convergence of the ARTMAP architecture, where n is the size of the binary pattern input space. If the ARTMAP result is regarded as an n-pass, or finite-pass, convergence result, ours is then a two-pass, or fixed-pass, convergence result. Our results have added significance in that they apply to set-valued mappings, as opposed to the usual supervised learning model of affixing labels to classes.
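To make the notions of templates and repeated-pass convergence concrete, the following is a minimal, illustrative sketch of a single fuzzy ART module with fast learning and complement coding, the kind of building block used in ARTMAP- and LAPART-style networks. It is not the LAPART 2 architecture from the paper; the class name, parameter values, and simplified search procedure are assumptions made for illustration. Each stored template is a weight vector; an input either resonates with an existing template (shrinking it via elementwise minimum) or recruits a new one, which is how template proliferation can arise when vigilance is too strict.

```python
import numpy as np

class FuzzyART:
    """Illustrative fuzzy ART module: fast learning, complement coding.

    Not the paper's LAPART 2; a generic sketch of template-based
    category learning for intuition only.
    """

    def __init__(self, rho=0.5, alpha=0.001):
        self.rho = rho        # vigilance parameter in [0, 1]
        self.alpha = alpha    # choice parameter (small positive constant)
        self.templates = []   # learned category templates (weight vectors)

    def train(self, x):
        """Present one input; return the index of the resonating template."""
        i = np.concatenate([x, 1.0 - x])  # complement coding: |i| is constant
        # Rank existing templates by the choice function
        # T_j = |i ^ w_j| / (alpha + |w_j|), where ^ is elementwise min.
        order = sorted(
            range(len(self.templates)),
            key=lambda j: -(np.minimum(i, self.templates[j]).sum()
                            / (self.alpha + self.templates[j].sum())))
        for j in order:
            w = self.templates[j]
            # Vigilance test: |i ^ w| / |i| >= rho.
            if np.minimum(i, w).sum() / i.sum() >= self.rho:
                # Fast learning: template shrinks to i ^ w (monotone).
                self.templates[j] = np.minimum(i, w)
                return j
        # No template passed vigilance: recruit a new one.
        self.templates.append(i.copy())
        return len(self.templates) - 1
```

Because each template can only shrink under the elementwise minimum, repeated passes over a fixed training set drive the templates toward a stable configuration; convergence proofs for ARTMAP-family networks, including the two-pass result above, bound how many such passes are needed.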