Investigating the Fault Tolerance of Neural Networks
Neural Computation
We propose a method for evaluating and comparing the fault tolerance of a wide variety of parallel distributed processing networks (more commonly known as artificial neural networks). Although these computing networks are biologically inspired and share many features of biological neural networks, they are not inherently tolerant of the loss of processing elements. We examine two classes of networks, multilayer perceptrons and Gaussian radial basis function networks, and show that there is a marked difference in their operational fault tolerance. Furthermore, we show that fault tolerance is influenced by the training algorithm used and even by the initial state of the network. Using an idea due to Sequin and Clay (1990), we show that training with intermittent, randomly selected faults can dramatically enhance the fault tolerance of radial basis function networks, while yielding only marginal improvement for multilayer perceptrons.
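The fault-injection training idea described above can be sketched in code. The following is a minimal illustration, not the authors' actual experimental setup: at each update a randomly selected hidden unit is "stuck at zero" (one simple fault model), and after training the network's tolerance is probed by removing each hidden unit in turn. The task (XOR), network size, learning rate, and fault model are all illustrative assumptions.

```python
import numpy as np

# Hedged sketch of fault-injection training on a tiny MLP.
# Assumptions (not from the paper): XOR task, 8 hidden units,
# stuck-at-zero single-unit faults, plain gradient descent.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

H = 8  # hidden-layer width (assumed)
W1 = rng.normal(0, 1, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1, (H, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(4000):
    # Intermittent fault: one random hidden unit is zeroed this step.
    mask = np.ones(H)
    mask[rng.integers(H)] = 0.0

    h = sigmoid(X @ W1 + b1) * mask          # faulted hidden activations
    out = sigmoid(h @ W2 + b2)               # network output

    # Backprop for squared error through the faulted forward pass.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h) * mask
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

def accuracy(mask):
    """Classification accuracy with the given hidden-unit mask applied."""
    h = sigmoid(X @ W1 + b1) * mask
    return float(((sigmoid(h @ W2 + b2) > 0.5) == (y > 0.5)).mean())

# Operational fault tolerance: accuracy with each single unit removed.
fault_accs = [accuracy(np.where(np.arange(H) == i, 0.0, 1.0))
              for i in range(H)]
print("worst single-fault accuracy:", min(fault_accs))
```

Comparing `min(fault_accs)` between a network trained this way and one trained without the fault mask gives a simple operational measure of the tolerance improvement the abstract reports.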