On the Computational Power of Max-Min Propagation Neural Networks

  • Authors:
  • Pablo A. Estévez; Yoichi Okabe

  • Affiliations:
  • Department of Electrical Engineering, Universidad de Chile, Casilla 412-3, Santiago 6513027, Chile, e-mail: pestevez@cec.uchile.cl; Graduate School of Engineering, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8656, Japan, e-mail: okabe@itc.u-tokyo.ac.jp

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2004

Abstract

We investigate the computational power of max-min propagation (MMP) neural networks, composed of neurons with maximum (Max) or minimum (Min) activation functions applied over weighted sums of the inputs. The main results presented are that a single-layer MMP network can represent exactly any pseudo-Boolean function F: {0,1}^n → [0,1], and that two-layer MMP neural networks are universal approximators. In addition, it is shown that several well-known fuzzy min-max (FMM) neural networks, such as Simpson's FMM, are representable by MMP neural networks.
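
The following is a minimal sketch of the MMP unit described in the abstract: a Max (or Min) activation applied over weighted sums of the inputs, stacked into a two-layer forward pass. The weights, shapes, and helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def mmp_unit(x, W, mode="max"):
    """One MMP unit: Max (or Min) over k weighted sums of the input.

    x    : input vector, shape (n,)
    W    : weight matrix, shape (k, n); each row defines one weighted sum
    mode : "max" or "min"
    """
    sums = W @ x                      # k weighted sums of the inputs
    return sums.max() if mode == "max" else sums.min()

# Illustrative two-layer MMP forward pass (hypothetical weights/shapes):
# a hidden layer of Min units followed by a single Max output unit.
rng = np.random.default_rng(0)
x = rng.random(4)                                    # example input in [0, 1]^4
W_hidden = [rng.random((3, 4)) for _ in range(5)]    # five Min units
hidden = np.array([mmp_unit(x, W, mode="min") for W in W_hidden])
W_out = rng.random((2, 5))                           # Max unit over hidden outputs
y = mmp_unit(hidden, W_out, mode="max")
print(y)
```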