Generalization error of faulty MLPs with weight decay regularizer

  • Authors:
  • Chi Sing Leung; John Sum; Shue Kwan Mak

  • Affiliations:
  • Department of Electronic Engineering, City University of Hong Kong, Hong Kong; Institute of Technology Management, National Chung Hsing University, Taiwan; Department of Electronic Engineering, City University of Hong Kong, Hong Kong

  • Venue:
  • ICONIP'10: Proceedings of the 17th International Conference on Neural Information Processing: Models and Applications - Volume Part II
  • Year:
  • 2010

Abstract

Weight decay is a simple regularization method for improving the generalization ability of multilayered perceptrons (MLPs). In addition, the weight decay method can also improve the fault tolerance of MLPs. However, most existing generalization error results for the weight decay method focus on fault-free MLPs only. For faulty MLPs, using a test set to study the generalization ability is not practical because there is a huge number of possible faulty networks for a trained network. This paper develops a prediction error formula for predicting the performance of faulty MLPs. Our prediction error result allows us to select an appropriate model for MLPs under the open node fault situation.
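To make the setting concrete, the sketch below illustrates (but does not reproduce) the scenario the abstract describes: a single-hidden-layer network is trained with a weight decay (L2) penalty, and its error under open node faults, where hidden nodes are stuck at zero output, is then estimated by Monte Carlo sampling over random faulty networks. This brute-force sampling is exactly the approach the paper argues is impractical to do exhaustively and replaces with an analytical prediction error formula, which is not shown here. The toy data, the fixed random hidden weights, and all function names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise (illustrative assumption, not from the paper).
n_train, n_test, n_hidden = 100, 500, 40
x_train = rng.uniform(-np.pi, np.pi, (n_train, 1))
y_train = np.sin(x_train) + 0.1 * rng.standard_normal((n_train, 1))
x_test = rng.uniform(-np.pi, np.pi, (n_test, 1))
y_test = np.sin(x_test)

# Single-hidden-layer network with fixed random hidden weights; only the output
# weights are trained, so the weight decay penalty reduces to ridge regression.
W = rng.standard_normal((1, n_hidden))
b = rng.standard_normal(n_hidden)

def hidden(x):
    """Hidden-node activations."""
    return np.tanh(x @ W + b)

def train_output_weights(decay):
    """Weight-decay (L2) regularized least-squares fit of the output layer."""
    H = hidden(x_train)
    return np.linalg.solve(H.T @ H + decay * np.eye(n_hidden), H.T @ y_train)

def faulty_mse(beta, fault_rate, n_samples=200):
    """Monte Carlo estimate of the test MSE under open node faults:
    each hidden node is stuck at zero output with probability fault_rate."""
    H = hidden(x_test)
    errs = []
    for _ in range(n_samples):
        mask = rng.random(n_hidden) >= fault_rate   # surviving hidden nodes
        y_hat = (H * mask) @ beta
        errs.append(np.mean((y_hat - y_test) ** 2))
    return float(np.mean(errs))

# Compare weight decay strengths, fault-free versus a 10% open node fault rate.
for decay in [1e-4, 1e-2, 1.0]:
    beta = train_output_weights(decay)
    print(f"decay={decay:6g}  fault-free MSE={faulty_mse(beta, 0.0):.4f}  "
          f"faulty MSE={faulty_mse(beta, 0.1):.4f}")
```

In this kind of sketch, a stronger decay parameter typically trades some fault-free accuracy for lower error under node faults; selecting that trade-off without sampling every faulty network is the model selection problem the paper's prediction error formula addresses analytically.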