Due to the existence of redundant features, a Radial Basis Function Neural Network (RBFNN) trained on a dataset is likely to be unnecessarily large. Sensitivity analysis can help reduce the feature set by deleting insensitive features. Treating the perturbation of the network output as a random variable, this paper defines a new sensitivity measure as the limit of the variance of the output perturbation as the input perturbation goes to zero. To simplify the sensitivity expression and its computation, we prove that the exchange of limit and variance is valid. A formula for computing the new sensitivity of individual features is derived. Numerical simulations show that the new sensitivity definition can remove irrelevant features effectively.
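As a concrete illustration of the idea, the sketch below estimates a variance-based sensitivity for each input feature of a trained Gaussian RBF network: only feature i is perturbed with small zero-mean noise of standard deviation eps, and Var(output perturbation) / eps^2 serves as a finite-noise approximation of the limit S_i = lim_{sigma -> 0} Var(Delta y) / sigma^2. This is a minimal sketch under assumed details; the Gaussian basis, the Monte Carlo estimator, and all function names are illustrative choices, not the paper's derived closed-form formula.

import numpy as np

def rbf_output(X, centers, widths, weights):
    # Gaussian RBF network: y(x) = sum_j w_j * exp(-||x - c_j||^2 / (2 s_j^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2)) @ weights

def feature_sensitivity(X, centers, widths, weights, eps=1e-3, n_draws=200, rng=None):
    # Illustrative estimator (an assumption, not the paper's formula):
    # perturb one feature at a time with N(0, eps^2) noise and take
    # Var(output perturbation) / eps^2 as its sensitivity; as eps -> 0 this
    # approximates a limit-of-variance sensitivity of the kind defined above.
    rng = np.random.default_rng() if rng is None else rng
    y0 = rbf_output(X, centers, widths, weights)
    sens = np.zeros(X.shape[1])
    for i in range(X.shape[1]):
        var_sum = 0.0
        for _ in range(n_draws):
            Xp = X.copy()
            Xp[:, i] += rng.normal(0.0, eps, size=X.shape[0])
            var_sum += np.var(rbf_output(Xp, centers, widths, weights) - y0)
        sens[i] = (var_sum / n_draws) / eps ** 2  # normalize by input variance
    return sens

Features whose estimated sensitivity lands at the bottom of the resulting ranking would then be candidates for removal, after which the network is retrained on the reduced feature set.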