A Sigma-Pi-Sigma Neural Network (SPSNN)

  • Authors:
  • Chien-Kuo Li

  • Affiliations:
  • Department of Information Management, Shih Chien University, Taipei, Taiwan, ROC. E-mail: ckli@mail.usc.edu.tw

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2003


Abstract

This letter presents a sigma-pi-sigma neural network (SPSNN) structure. The SPSNN can learn to implement the static mappings that multilayer neural networks and radial basis function networks usually perform. The output of the SPSNN has the sum-of-product-of-sums form $\sum_{n=1}^{K} \prod_{i=1}^{n} \sum_{j=1}^{N_v} f_{nij}(x_j)$, where the $x_j$'s are inputs, $N_v$ is the number of inputs, $f_{nij}()$ is a function generated through network training, and $K$ is the number of pi-sigma networks (PSNs), the basic building block of the SPSNN. A linear memory array can be used to implement $f_{nij}()$. The function $f_{nij}(x_j)$ can be expressed as $\sum_{k=1}^{N_q + N_e - 1} w_{nijk} B_{ijk}(x_j)$, where $B_{ijk}()$ is a single-variable basis function, the $w_{nijk}$'s are weight values stored in memory, $N_q$ is the number of quantized elements for $x_j$, and $N_e$ is the number of basis functions in the neighborhood used for storing information about $x_j$. If all $B_{ijk}()$'s are Gaussian functions, the new neural network degenerates to a Gaussian function network. This paper focuses on the use of overlapped rectangular pulses as the basis functions. With such basis functions, $w_{nijk} B_{ijk}(x_j)$ equals either zero or $w_{nijk}$, and the computation of $f_{nij}(x_j)$ becomes a simple addition of retrieved $w_{nijk}$'s. The new neural network structure demonstrates excellent learning convergence characteristics and requires only a small memory space. It has merits over multilayer neural networks, radial basis function networks, and the CMAC.
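
To make the output form concrete, below is a minimal sketch in Python of an SPSNN forward pass with overlapped rectangular pulse basis functions. It assumes, based on the product index running to $n$ in the abstract, that the $n$-th PSN has degree $n$; the class name, the uniform quantization grid, and the CMAC-style placement of the $N_e$ overlapped pulses are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

class SPSNN:
    """Sketch of an SPSNN with overlapped rectangular pulse bases.

    Assumes PSN n (n = 1..K) has degree n and inputs lie in
    [x_min, x_max]; these details are illustrative assumptions.
    """

    def __init__(self, n_inputs, K, Nq, Ne, x_min=0.0, x_max=1.0, seed=0):
        self.Nv, self.K, self.Nq, self.Ne = n_inputs, K, Nq, Ne
        self.x_min, self.x_max = x_min, x_max
        rng = np.random.default_rng(seed)
        # Weight table w[n, i, j, k] for (PSN n, summing unit i, input j);
        # each input variable has Nq + Ne - 1 rectangular pulses, as in
        # the abstract's expansion of f_nij(x_j).
        self.w = rng.normal(0.0, 0.1, size=(K, K, n_inputs, Nq + Ne - 1))

    def _active_pulses(self, xj):
        # Quantize xj to one of Nq cells; with width-Ne pulses offset by
        # one cell each (CMAC-style assumption), exactly Ne pulses equal 1
        # over that cell, so w * B is either 0 or w.
        q = int((xj - self.x_min) / (self.x_max - self.x_min) * self.Nq)
        q = min(max(q, 0), self.Nq - 1)
        return list(range(q, q + self.Ne))

    def forward(self, x):
        # y = sum_{n=1}^{K} prod_{i=1}^{n} sum_{j=1}^{Nv} f_nij(x_j)
        active = [self._active_pulses(xj) for xj in x]
        y = 0.0
        for n in range(self.K):             # n-th PSN (degree n + 1)
            prod = 1.0
            for i in range(n + 1):          # product over summing units
                s = 0.0
                for j in range(self.Nv):    # sum over input variables
                    # f_nij(x_j): simple addition of the Ne retrieved weights
                    s += self.w[n, i, j, active[j]].sum()
                prod *= s
            y += prod
        return y

net = SPSNN(n_inputs=2, K=3, Nq=16, Ne=4)
print(net.forward(np.array([0.3, 0.7])))
```

With rectangular pulses, evaluating $f_{nij}(x_j)$ needs no multiplication: the quantization index selects the $N_e$ active pulses and their stored weights are simply added, which is what keeps both the per-evaluation computation and the memory footprint small.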