Bernoulli Neural Network with Weights Directly Determined and with the Number of Hidden-Layer Neurons Automatically Determined

  • Authors:
  • Yunong Zhang; Gongqin Ruan

  • Affiliations:
  • School of Information Science and Technology, Sun Yat-sen University, Guangzhou, China 510275

  • Venue:
  • ISNN '09: Proceedings of the 6th International Symposium on Neural Networks (Advances in Neural Networks)
  • Year:
  • 2009

Abstract

Conventional back-propagation (BP) neural networks suffer from inherent weaknesses such as slow convergence and the existence of local minima. Based on polynomial interpolation and approximation theory, this paper constructs a special type of feedforward neural network whose hidden-layer neurons are activated by Bernoulli polynomials. In contrast to conventional BP and other gradient-based training algorithms, a weights-direct-determination (WDD) method is proposed for the Bernoulli neural network (BNN), which determines the network weights directly (in a single step) without a lengthy iterative BP-training procedure. Moreover, by analyzing the relationship between BNN performance and the number of hidden-layer neurons, a structure-automatic-determination (SAD) algorithm is further proposed, which obtains the optimal number of hidden-layer neurons in the sense of achieving the highest learning accuracy for a specific data problem or target function/system. Computer simulations further substantiate the efficacy of the Bernoulli neural network and its deterministic algorithms.
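
The paper itself gives the exact formulation; the following is only a minimal sketch of the two ideas in the abstract, not the authors' implementation. It assumes hidden neuron k computes the Bernoulli polynomial B_k(x) and the output layer is a linear combination of these activations, so the WDD step reduces to a single linear least-squares (pseudoinverse) solve, and the SAD step grows the hidden layer and keeps the size with the lowest training error. Names such as wdd_weights and sad_select_size are illustrative only.

```python
import numpy as np
from math import comb


def bernoulli_numbers(n):
    """Bernoulli numbers B_0..B_n via sum_{j<=m} C(m+1, j) B_j = 0 (B_1 = -1/2)."""
    B = np.zeros(n + 1)
    B[0] = 1.0
    for m in range(1, n + 1):
        B[m] = -sum(comb(m + 1, j) * B[j] for j in range(m)) / (m + 1)
    return B


def bernoulli_basis(x, n_hidden):
    """Activation matrix Phi with Phi[i, k] = B_k(x_i) for k = 0..n_hidden-1,
    using B_k(x) = sum_{j=0}^{k} C(k, j) B_j x^{k-j}."""
    B = bernoulli_numbers(n_hidden)
    Phi = np.empty((x.size, n_hidden))
    for k in range(n_hidden):
        Phi[:, k] = sum(comb(k, j) * B[j] * x ** (k - j) for j in range(k + 1))
    return Phi


def wdd_weights(x, y, n_hidden):
    """Weights-direct-determination sketch: one least-squares (pseudoinverse)
    solve for the output weights, with no iterative BP training."""
    Phi = bernoulli_basis(x, n_hidden)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w


def sad_select_size(x, y, max_hidden=20, tol=1e-12):
    """Structure-automatic-determination sketch: grow the hidden layer and
    keep the size that minimizes the training mean-squared error."""
    best_n, best_err, best_w = 1, np.inf, None
    for n in range(1, max_hidden + 1):
        w = wdd_weights(x, y, n)
        err = np.mean((bernoulli_basis(x, n) @ w - y) ** 2)
        if err < best_err - tol:
            best_n, best_err, best_w = n, err, w
    return best_n, best_w, best_err


# Usage: approximate a target function sampled on [0, 1]
x = np.linspace(0.0, 1.0, 200)
y = np.exp(x) * np.sin(2 * np.pi * x)
n, w, err = sad_select_size(x, y)
print(f"hidden neurons: {n}, training MSE: {err:.3e}")
```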