Training multilayer perceptrons with the extended Kalman algorithm
Advances in neural information processing systems 1
Learning internal representations by error propagation
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1
Bayesian regularization and pruning using a Laplace prior
Neural Computation
Parameter identification for uncertain plants using H∞ methods
Automatica (Journal of IFAC)
Indefinite-quadratic estimation and control: a unified approach to H2 and H∞ theories
A pruning method for the recursive least squared algorithm
Neural Networks
Neural Networks: A Comprehensive Foundation
Investigating the role of saliency analysis with a neural network rainfall-runoff model
Computers & Geosciences - Special issue on GeoComp 99 - GeoComputation and the Geosciences
Second Order Derivatives for Network Pruning: Optimal Brain Surgeon
Advances in Neural Information Processing Systems 5, [NIPS Conference]
Artificial neural networks in wave predictions at the west coast of Portugal
Computers & Geosciences
On the Kalman filtering method in neural network training and pruning
IEEE Transactions on Neural Networks
An efficient training and pruning method based on the H∞ filtering algorithm is proposed for feedforward neural networks (FNNs). A weight importance measure is derived that links the prediction error sensitivity obtained from H∞ filtering training with a weight-salience-based pruning technique. The results of extensive experimentation indicate that the proposed method prunes the network during training without loss of generalization capacity, and also provides a robust global optimization training algorithm for arbitrary network structures.
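The combination the abstract describes can be illustrated on a toy problem: a recursive H∞-style parameter update (which differs from the ordinary Kalman/RLS update by a risk-sensitivity term weighted by γ⁻²), followed by an Optimal-Brain-Surgeon-style saliency computed from the recursion's covariance matrix as an approximate inverse Hessian. This is a minimal sketch under simplifying assumptions (a linear single-output model rather than a full FNN; the function names and the exact update form are illustrative, not the paper's formulation):

```python
import numpy as np

def hinf_train(X, d, gamma=2.0, p0=10.0):
    """Recursive H-infinity-style estimation of linear weights w for
    measurements d_k = h_k . w + noise. Simplified illustrative sketch."""
    n = X.shape[1]
    w = np.zeros(n)
    P = p0 * np.eye(n)
    for h, y in zip(X, d):
        # Information update: the (1 - gamma^-2) factor is the
        # risk-sensitive term that distinguishes the H-infinity
        # recursion from a plain Kalman/RLS update.
        P = np.linalg.inv(np.linalg.inv(P) + (1.0 - gamma**-2) * np.outer(h, h))
        # Gain-weighted correction toward the current prediction error.
        w = w + P @ h * (y - h @ w)
    return w, P

def saliency(w, P):
    """OBS-style weight saliency, using P as an approximate inverse
    Hessian: s_i = w_i^2 / (2 * P_ii). Small saliency => prune first."""
    return w**2 / (2.0 * np.diag(P))

# Toy data: the third input is irrelevant (true weight 0), so its
# saliency should come out smallest and it would be pruned first.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
d = X @ np.array([1.5, -0.7, 0.0]) + 0.01 * rng.normal(size=200)
w, P = hinf_train(X, d)
s = saliency(w, P)
```

The design point the sketch captures is that no separate Hessian computation is needed for pruning: the covariance matrix maintained by the filter during training doubles as the curvature estimate used to rank weight importance.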