Universal perceptron and DNA-like learning algorithm for binary neural networks: non-LSBF implementation

  • Authors and affiliations:
  • Fangyue Chen (School of Science, Hangzhou Dianzi University, Zhejiang, China)
  • Guanrong Chen (Department of Electronic Engineering, City University of Hong Kong, Hong Kong, China)
  • Qinbin He (Department of Mathematics, Taizhou University, Linhai, Zhejiang, China)
  • Guolong He (Department of Mathematics, Zhejiang Normal University, Jinhua, Zhejiang, China)
  • Xiubin Xu (Department of Mathematics, Zhejiang Normal University, Jinhua, Zhejiang, China)

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2009


Abstract

Implementing linearly nonseparable Boolean functions (non-LSBF) has been an important yet challenging task, due to the extremely high complexity of this class of functions and the exponentially increasing proportion of non-LSBF in the entire set of Boolean functions as the number of input variables increases. In this paper, an algorithm named the DNA-like learning and decomposing algorithm (DNA-like LDA) is proposed, which is capable of effectively implementing non-LSBF. The novel algorithm first trains the DNA-like offset sequence and decomposes a non-LSBF into logic XOR operations on a sequence of LSBF; it then determines the weight-threshold values of a multilayer perceptron (MLP) that performs both the decomposition into LSBF and the mapping from the hidden neurons to the output neuron. The algorithm is validated on two typical examples: the problem of approximating the circular region and the well-known n-bit parity Boolean function (PBF).
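The paper's DNA-like LDA itself is not reproduced here, but the core idea the abstract describes, decomposing a linearly nonseparable function into an XOR of linearly separable (threshold) functions, can be illustrated on the n-bit parity example. The sketch below uses a standard textbook construction (not necessarily the decomposition the algorithm would learn): hidden threshold units h_k(x) = [x_1 + ... + x_n >= k], each of which is an LSBF, whose XOR equals the parity of the input. All function names are illustrative.

```python
from itertools import product

def threshold_unit(x, weights, theta):
    # A single perceptron computing an LSBF: 1 iff w.x >= theta.
    return int(sum(w * xi for w, xi in zip(weights, x)) >= theta)

def parity_via_xor_of_lsbf(x):
    n = len(x)
    # Hidden units h_k(x) = [x_1 + ... + x_n >= k], k = 1..n.
    # Each is linearly separable (all weights 1, threshold k).
    hidden = [threshold_unit(x, [1] * n, k) for k in range(1, n + 1)]
    # If exactly s inputs are 1, then exactly s hidden units fire,
    # so the XOR of the hidden outputs is s mod 2: the parity of x.
    out = 0
    for h in hidden:
        out ^= h
    return out

# Check against the direct parity definition for n = 4.
n = 4
ok = all(parity_via_xor_of_lsbf(x) == sum(x) % 2
         for x in product([0, 1], repeat=n))
print(ok)
```

This mirrors the MLP structure mentioned in the abstract: the hidden layer realizes the LSBF decomposition, and the hidden-to-output map realizes the XOR.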