Switching and Finite Automata Theory: Computer Science Series
The geometrical learning of binary neural networks
IEEE Transactions on Neural Networks
This paper describes an optimal synthesis method for binary neural networks for pattern recognition. Our objective is to minimize the number of connections and the number of hidden-layer neurons by using a Newly Expanded and Truncated Learning Algorithm (NETLA) for multilayered neural networks. The synthesis method in NETLA uses the Expanded Sum of Products (ESP) of Boolean expressions and is based on the multilayer perceptron. It can optimize a given binary neural network in binary space without the iterative learning required by the conventional Error Back Propagation (EBP) algorithm. Furthermore, NETLA reduces the number of required hidden-layer neurons and the number of connections, so the algorithm speeds up training for pattern recognition problems. The superiority of NETLA over other learning algorithms is demonstrated by an application to the problem of approximating a circular region.
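To make the sum-of-products structure concrete, the sketch below (an illustration, not the paper's NETLA) shows a two-layer network of hard-threshold binary neurons realizing a Boolean sum of products: each hidden neuron fires on one product term, and the output neuron ORs the hidden outputs. The function and weight names here are hypothetical, chosen only for this example.

```python
# Illustrative sketch, not the paper's algorithm: a two-layer binary
# threshold network realizing a Boolean sum of products, the kind of
# structure a synthesis method like NETLA produces.

def threshold_neuron(inputs, weights, threshold):
    """Hard-limit neuron: outputs 1 iff the weighted sum >= threshold."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

def sop_network(x, product_terms):
    """product_terms: list of (weights, threshold) pairs, one hidden
    neuron per product term. The output neuron ORs the hidden outputs
    (unit weights, threshold 1)."""
    hidden = [threshold_neuron(x, w, t) for w, t in product_terms]
    return threshold_neuron(hidden, [1] * len(hidden), 1)

# Example: XOR = x1*~x2 + ~x1*x2, two product-term neurons plus one OR.
xor_terms = [([1, -1], 1),   # fires iff x1 = 1 and x2 = 0
             ([-1, 1], 1)]   # fires iff x1 = 0 and x2 = 1

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, sop_network(x, xor_terms))
```

Each additional product term in the ESP costs one hidden neuron plus its connections, which is why minimizing the expression directly minimizes the network size.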