Optimal Synthesis Method for Binary Neural Network Using NETLA

  • Authors:
  • Sang-Kyu Sung;Jong-Won Jung;Joon-Tark Lee;Woo-Jin Choi;Seok-Jun Ji

  • Venue:
  • AFSS '02 Proceedings of the 2002 AFSS International Conference on Fuzzy Systems. Calcutta: Advances in Soft Computing
  • Year:
  • 2002

Abstract

This paper describes an optimal synthesis method for binary neural networks applied to pattern recognition. Our objective is to minimize the number of connections and the number of hidden-layer neurons by using a Newly Expanded and Truncated Learning Algorithm (NETLA) for multilayer neural networks. The synthesis method in NETLA uses the Expanded Sum of Product (ESP) form of Boolean expressions and is based on the multilayer perceptron. It can optimize a given binary neural network in binary space without the iterative learning required by the conventional Error Back Propagation (EBP) algorithm. Furthermore, NETLA reduces the number of required hidden-layer neurons and connections, and can therefore speed up training for pattern recognition problems. The superiority of NETLA over other learning algorithms is demonstrated by an application to the approximation of a circular region.
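The abstract's core idea of mapping a sum-of-products Boolean expression directly onto a two-layer binary network, with no iterative learning, can be illustrated by a minimal sketch. This is not the authors' NETLA algorithm (which also truncates and optimizes the expression); it only shows the general SOP-to-threshold-network construction: each hidden neuron realizes one product (AND) term, and the output neuron realizes their OR. The function names `make_term_neuron` and `synthesize_sop` are hypothetical.

```python
# Hypothetical sketch (not the authors' NETLA): synthesize a two-layer
# binary threshold network directly from a sum-of-products expression,
# with no iterative weight training.

def make_term_neuron(pos, neg):
    """Threshold neuron firing iff all `pos` inputs are 1 and all `neg` are 0.

    Positive literals get weight +1, negated literals weight -1; the
    threshold equals the number of positive literals.
    """
    def neuron(x):
        s = sum(x[i] for i in pos) - sum(x[i] for i in neg)
        return 1 if s >= len(pos) else 0
    return neuron

def synthesize_sop(terms):
    """Build a two-layer binary network from an SOP expression.

    `terms` is a list of (positive_literal_indices, negated_literal_indices),
    one entry per product term; each term becomes one hidden neuron.
    """
    hidden = [make_term_neuron(p, n) for p, n in terms]
    def net(x):
        # The output neuron is an OR: it fires if any hidden AND-term fires.
        return 1 if sum(h(x) for h in hidden) >= 1 else 0
    return net

# Example: f(x0, x1) = x0·x1' + x0'·x1 (XOR written in SOP form)
xor = synthesize_sop([([0], [1]), ([1], [0])])
print([xor((a, b)) for a in (0, 1) for b in (0, 1)])  # [0, 1, 1, 0]
```

The number of hidden neurons here equals the number of product terms, which is why minimizing the Boolean expression (as NETLA's expand-and-truncate step does) directly reduces both neuron and connection counts.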