Generalization in partially connected layered neural networks

  • Authors:
  • Kyung-Hoon Kwon;Kukjin Kang;Jong-Hoon Oh

  • Affiliations:
  • Korea Basic Science Center, Tae-Jon, Korea;Department of Physics, POSTECH, Pohang, Kyongbuk, Korea;Department of Physics, POSTECH, Pohang, Kyongbuk, Korea

  • Venue:
  • COLT '94 Proceedings of the seventh annual conference on Computational learning theory
  • Year:
  • 1994

Abstract

We study learning from examples in a partially connected single-layer perceptron and a two-layer network. Partially connected student networks learn from fully connected teacher networks, and we study generalization within the annealed approximation. We first consider a single-layer perceptron with binary weights. When the student is weakly diluted, there is a first-order phase transition from a poor-learning to a good-learning state, similar to that of the fully connected perceptron. Under strong dilution, the first-order phase transition disappears and the generalization error decreases continuously. We also study learning in a two-layer committee machine with binary weights. In contrast to perceptron learning, a first-order transition always exists irrespective of dilution. The permutation symmetry is broken at the transition point, and the generalization error is reduced to a non-zero minimum value.
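The following is a minimal illustrative sketch, not the paper's annealed calculation: it sets up a fully connected teacher perceptron with binary (±1) weights and a partially connected student that keeps only a fraction c of the N input connections. Even an idealized student that learns its available weights perfectly (here, by copying the teacher on the connected sites) retains a residual generalization error due to dilution. The script estimates that error by Monte Carlo and compares it with the standard perceptron relation ε = arccos(R)/π, where R is the teacher-student overlap (R = √c for this idealized student). All variable names and parameter values are hypothetical choices for illustration.

```python
import numpy as np

# Sketch: residual generalization error of a diluted binary-weight student
# that agrees with a fully connected binary-weight teacher on all of its
# retained connections.  Assumed setup, not the paper's method.

rng = np.random.default_rng(0)

N = 1000          # number of input units (illustrative value)
c = 0.5           # connectivity: fraction of connections the student keeps
n_test = 20000    # number of random test examples for the Monte Carlo estimate

teacher = rng.choice([-1.0, 1.0], size=N)      # fully connected teacher weights
mask = rng.random(N) < c                        # sites where the student is connected
student = np.where(mask, teacher, 0.0)          # best-case diluted student

x = rng.choice([-1.0, 1.0], size=(n_test, N))   # random binary test inputs
# Compare classifications; exact ties at zero are rare and counted as disagreement.
disagree = np.sign(x @ teacher) != np.sign(x @ student)
eps_mc = disagree.mean()

eps_theory = np.arccos(np.sqrt(c)) / np.pi      # arccos(R)/pi with R = sqrt(c)
print(f"Monte Carlo generalization error: {eps_mc:.3f}")
print(f"arccos(sqrt(c))/pi prediction:    {eps_theory:.3f}")
```

Running the sketch with c = 0.5 gives a generalization error near arccos(√0.5)/π = 0.25, illustrating how dilution alone bounds the achievable generalization even for a perfectly trained student; the paper's analysis concerns how learning from a finite number of examples approaches such limits.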