A New Backpropagation Learning Algorithm for Layered Neural Networks with Nondifferentiable Units

  • Authors:
  • Takahumi Oohori; Hidenori Naganuma; Kazuhisa Watanabe

  • Affiliations:
  • Takahumi Oohori: Department of Information Design, Hokkaido Institute of Technology, Sapporo 006-8585, Japan (oohori@hit.ac.jp)
  • Hidenori Naganuma: Division of Electrical Engineering, Graduate School of Engineering, Hokkaido Institute of Technology, Sapporo 006-8585, Japan (q04305@hit.ac.jp)
  • Kazuhisa Watanabe: Department of Information Network Engineering, Hokkaido Institute of Technology, Sapporo 006-8585, Japan (nabek@hit.ac.jp)

  • Venue:
  • Neural Computation
  • Year:
  • 2007

Abstract

We propose a digital version of the backpropagation algorithm (DBP) for three-layered neural networks with nondifferentiable binary units. The approach feeds teacher signals to both the middle and output layers, whereas a simple perceptron receives them only at the output layer. These additional teacher signals enable the DBP to update the coupling weights not only between the middle and output layers but also between the input and middle layers. A neural network trained with the DBP is fast and easy to implement in hardware. Simulation results on several linearly nonseparable problems, such as XOR, show that the DBP compares favorably with conventional approaches, and further simulations indicate that it maintains high performance in large-scale networks.
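
The abstract does not specify how the middle-layer teacher signals are generated, so the following is only a minimal sketch of the general idea it describes: a three-layer network of binary threshold units in which both layers of coupling weights are updated by perceptron-style rules, one driven by the given output teacher signal and one by a generated middle-layer teacher signal. The function name `train_dbp_sketch`, the bit-flip heuristic for choosing middle-layer targets, and all parameter values are hypothetical stand-ins, not the paper's actual procedure.

```python
import numpy as np

def step(x):
    # Binary threshold (nondifferentiable) unit: outputs 0 or 1.
    return (x > 0).astype(float)

def train_dbp_sketch(X, T, n_hidden=4, eta=0.1, epochs=1000, seed=0):
    """Sketch of a DBP-style learner for a three-layer network of binary units.

    X: binary inputs, shape (n_samples, n_in)
    T: output teacher signals, shape (n_samples, n_out)
    NOTE: the rule below for generating middle-layer teacher signals is a
    hypothetical stand-in, not the procedure from the paper.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 1, (X.shape[1] + 1, n_hidden))  # input -> middle (+bias)
    W2 = rng.normal(0, 1, (n_hidden + 1, T.shape[1]))  # middle -> output (+bias)

    def forward(x):
        h = step(np.append(x, 1.0) @ W1)   # middle-layer binary outputs
        y = step(np.append(h, 1.0) @ W2)   # output-layer binary outputs
        return h, y

    for _ in range(epochs):
        for x, t in zip(X, T):
            h, y = forward(x)
            if np.array_equal(y, t):
                continue
            # Hypothetical middle-layer teacher signal: flip one middle-layer
            # bit at a time and keep a flip that yields the correct output.
            h_teach = h.copy()
            for j in range(n_hidden):
                h_try = h.copy()
                h_try[j] = 1.0 - h_try[j]
                if np.array_equal(step(np.append(h_try, 1.0) @ W2), t):
                    h_teach = h_try
                    break
            # Perceptron-style updates in both layers, each driven by its
            # own teacher signal (given at the output, generated in the middle).
            W2 += eta * np.outer(np.append(h_teach, 1.0), t - y)
            W1 += eta * np.outer(np.append(x, 1.0), h_teach - h)
    return W1, W2, forward

if __name__ == "__main__":
    # XOR, one of the linearly nonseparable problems mentioned in the abstract.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1, W2, forward = train_dbp_sketch(X, T)
    print([int(forward(x)[1][0]) for x in X])
```

The sketch illustrates the two-layer update structure implied by the abstract; because the middle-layer teacher rule here is a stand-in, convergence on XOR is not guaranteed the way the paper's simulations report for the actual DBP.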