Learning with Permutably Homogeneous Multiple-Valued Multiple-Threshold Perceptrons

  • Authors:
  • Alioune Ngom
  • Corina Reischer
  • Dan A. Simovici
  • Ivan Stojmenović

  • Affiliations:
  • Computer Science Department, Lakehead University, 955 Oliver Road, Thunder Bay, Ontario P7B 5E1, Canada. E-mail: angom@ice.lakeheadu.ca
  • Department of Mathematics and Computer Science, University of Quebec at Trois-Rivieres, Trois-Rivieres, Quebec G9A 5H7, Canada. E-mail: corina_reischer@uqtr.uquebec.ca
  • Department of Mathematics and Computer Science, University of Massachusetts at Boston, Boston, Massachusetts 02125, USA. E-mail: dsim@umb.edu
  • Department of Computer Science, School of Information Technology and Engineering, University of Ottawa, Ottawa, Ontario K1N 9B4, Canada.

  • Venue:
  • Neural Processing Letters
  • Year:
  • 2000

Abstract

The (n,k,s)-perceptrons partition the input space V ⊂ R^n into s+1 regions using s parallel hyperplanes. We examine their learning abilities. The previously studied homogeneous (n,k,k−1)-perceptron learning algorithm is generalized to the permutably homogeneous (n,k,s)-perceptron learning algorithm, which has a guaranteed convergence property. We also introduce a high-capacity learning method that learns any permutably homogeneously separable k-valued function given as input.
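
To make the model concrete, the following Python sketch illustrates one reading of an (n,k,s)-perceptron as described in the abstract: a single weight vector defines s parallel hyperplanes via s ordered thresholds, and each of the resulting s+1 regions is assigned one of k output values. The function name `nks_perceptron` and all numeric values are illustrative assumptions, not the authors' implementation or learning algorithm.

```python
import numpy as np

def nks_perceptron(x, w, thresholds, outputs):
    """Illustrative (n, k, s)-perceptron sketch (assumed formulation).

    x          -- input vector of length n
    w          -- weight vector of length n (shared by all s hyperplanes)
    thresholds -- s ordered threshold values t_1 < ... < t_s
    outputs    -- s + 1 output values drawn from {0, ..., k-1}, one per region
    """
    activation = np.dot(w, x)
    # searchsorted gives the index of the region (between consecutive thresholds)
    # that the activation falls into.
    region = np.searchsorted(thresholds, activation)
    return outputs[region]

# Hypothetical example with n = 2, k = 3, s = 2: two parallel hyperplanes
# (same weight vector, two thresholds) split the plane into three regions.
w = np.array([1.0, -1.0])
thresholds = np.array([-0.5, 0.5])
outputs = [0, 1, 2]  # one k-valued output per region
print(nks_perceptron(np.array([0.2, 0.1]), w, thresholds, outputs))  # activation 0.1 -> region 1 -> output 1
```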