Sensitivity analysis of multilayer perceptron to input and weight perturbations

  • Authors:
  • Xiaoqin Zeng; D. S. Yeung

  • Affiliations:
  • Dept. of Computing, Hong Kong Polytechnic University

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2001


Abstract

An important issue in the design and implementation of a neural network is the sensitivity of its output to input and weight perturbations. In this paper, we discuss the sensitivity of the most popular and general class of feedforward neural networks, the multilayer perceptron (MLP). The sensitivity is defined as the mathematical expectation of the MLP's output errors due to input and weight perturbations, taken with respect to all input and weight values in a given continuous interval. The sensitivity of a single neuron is discussed first, and an approximate analytical expression is derived as a function of the absolute values of the input and weight perturbations. An algorithm is then given to compute the sensitivity of the entire MLP. As intuitively expected, the sensitivity increases with the input and weight perturbations, but the increase has an upper bound determined by the structural configuration of the MLP, namely the number of neurons per layer and the number of layers. There exists an optimal value for the number of neurons in a layer that yields the highest sensitivity. The effect of the number of layers is quite unexpected: the sensitivity may first decrease and then remain almost constant as the number of layers increases.
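
The paper derives this sensitivity analytically; as a rough illustration of the underlying definition only, the sketch below estimates the same expectation by brute-force Monte Carlo sampling for a small sigmoid MLP. The function names (`mlp_forward`, `estimate_sensitivity`), the layer sizes, the omission of biases, and the fixed-magnitude random-sign perturbation model are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, weights):
    """Forward pass of a bias-free sigmoid MLP.

    weights is a list of (fan_in, fan_out) matrices, one per layer.
    """
    a = x
    for W in weights:
        a = 1.0 / (1.0 + np.exp(-(a @ W)))  # logistic activation per layer
    return a

def estimate_sensitivity(weights, delta_x, delta_w,
                         n_samples=10_000, interval=(-1.0, 1.0)):
    """Monte Carlo estimate of E[|y_perturbed - y|].

    Inputs are drawn uniformly from `interval`; input and weight
    perturbations have fixed absolute values `delta_x` and `delta_w`
    with random signs (an assumed perturbation model for illustration).
    """
    lo, hi = interval
    fan_in = weights[0].shape[0]
    errs = []
    for _ in range(n_samples):
        x = rng.uniform(lo, hi, size=fan_in)
        dx = delta_x * rng.choice([-1.0, 1.0], size=fan_in)
        pert_weights = [W + delta_w * rng.choice([-1.0, 1.0], size=W.shape)
                        for W in weights]
        y = mlp_forward(x, weights)
        y_pert = mlp_forward(x + dx, pert_weights)
        errs.append(np.abs(y_pert - y).mean())
    return float(np.mean(errs))

# Example: a 4-8-8-1 MLP with weights drawn uniformly from [-1, 1].
layer_sizes = [4, 8, 8, 1]
weights = [rng.uniform(-1, 1, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
print(estimate_sensitivity(weights, delta_x=0.01, delta_w=0.01))
```

Sweeping `delta_x`, `delta_w`, the layer width, or the depth in such a simulation is one way to check the qualitative trends the abstract reports, e.g. the upper bound on the sensitivity and its behavior as the number of layers grows.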