A Fast Simplified Fuzzy ARTMAP Network

  • Authors:
  • Mohammad-Taghi Vakil-Baghmisheh
  • Nikola Pavešić

  • Affiliations:
  • Laboratory of Artificial Perception, Systems and Cybernetics, Faculty of Electrical Engineering, University of Ljubljana, Slovenia. e-mail: vakil@luz.fe.uni-lj.si
  • Laboratory of Artificial Perception, Systems and Cybernetics, Faculty of Electrical Engineering, University of Ljubljana, Slovenia. e-mail: nikola.pavesic@fe.uni-lj.si

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2003

Abstract

We present an algorithmic variant of the simplified fuzzy ARTMAP (SFAM) network, whose structure resembles that of feed-forward networks. Its differences from Kasuba's model are discussed, and their performances are compared on two benchmarks. We show that our algorithm is much faster than Kasuba's algorithm, and that the difference in speed grows enormously as the number of training samples increases.

The performances of the SFAM and the MLP (multilayer perceptron) are compared on three problems: the two benchmarks and the Farsi optical character recognition (OCR) problem. For training the MLP, two variants of the backpropagation algorithm are used: the BPLRF algorithm (backpropagation with plummeting learning rate factor) for the benchmarks, and the BST algorithm (backpropagation with selective training) for the Farsi OCR problem.

The results obtained on all three case studies with the MLP and the SFAM, each embedded in its customized system, show that the SFAM's convergence in fast-training mode is faster than that of the MLP, while the online operation of the MLP is faster than that of the SFAM. On the benchmark problems the MLP has a much better recognition rate than the SFAM. On the Farsi OCR problem, the recognition error of the SFAM is higher than that of the MLP on ill-engineered datasets, but equal on well-engineered ones. The flexible configuration of the SFAM, i.e. its capability to grow the network in order to learn new patterns, as well as its simple parameter adjustment, remain unchallenged by the MLP.
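To make the abstract's references to fast-training mode and the network-growing behaviour concrete, the sketch below outlines the standard SFAM fast-learning cycle (complement coding, category choice, vigilance test with match tracking, and commitment of new category nodes). It is a minimal illustration of the generic SFAM scheme, not the authors' specific variant or Kasuba's implementation; the class name, parameter defaults, and helper methods are assumptions made for this example.

```python
import numpy as np

class SFAMSketch:
    """Minimal simplified fuzzy ARTMAP in fast-learning mode (beta = 1).

    Illustrative sketch only: parameter names and defaults are assumptions,
    not the algorithmic variant proposed in the paper.
    """

    def __init__(self, alpha=0.001, rho_baseline=0.5, eps=1e-4):
        self.alpha = alpha                 # choice parameter
        self.rho_baseline = rho_baseline   # baseline vigilance
        self.eps = eps                     # match-tracking increment
        self.weights = []                  # one prototype vector per category node
        self.labels = []                   # class label attached to each node

    def _complement_code(self, a):
        # Input features are assumed normalized to [0, 1].
        a = np.clip(np.asarray(a, dtype=float), 0.0, 1.0)
        return np.concatenate([a, 1.0 - a])

    def train_one(self, a, label):
        I = self._complement_code(a)
        rho = self.rho_baseline
        # Rank existing categories by the choice function T_j = |I ^ w_j| / (alpha + |w_j|).
        order = sorted(
            range(len(self.weights)),
            key=lambda j: -np.minimum(I, self.weights[j]).sum()
                          / (self.alpha + self.weights[j].sum()))
        for j in order:
            match = np.minimum(I, self.weights[j]).sum() / I.sum()
            if match < rho:
                continue                       # fails the vigilance test
            if self.labels[j] == label:
                # Fast learning: prototype moves to I AND w_j in one step.
                self.weights[j] = np.minimum(I, self.weights[j])
                return j
            # Wrong class: match tracking raises vigilance and the search continues.
            rho = match + self.eps
        # No suitable category: grow the network by committing a new node.
        self.weights.append(I.copy())
        self.labels.append(label)
        return len(self.weights) - 1

    def predict(self, a):
        I = self._complement_code(a)
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum())
                  for w in self.weights]
        return self.labels[int(np.argmax(scores))]


if __name__ == "__main__":
    # Toy usage: two classes in a 2-D feature space.
    net = SFAMSketch()
    for x, y in [([0.1, 0.9], 0), ([0.8, 0.2], 1), ([0.15, 0.85], 0)]:
        net.train_one(x, y)
    print(net.predict([0.12, 0.88]))   # expected: 0
```

The new-node branch at the end of train_one is what the abstract calls the SFAM's flexible configuration: the network grows only when no existing category can accommodate a new pattern, and the single fast-learning update is what makes training cheap compared with iterative gradient descent in the MLP.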