IEEE Transactions on Neural Networks
This paper proposes a constructing-and-pruning (CP) approach to optimise the structure of a feedforward neural network (FNN) with a single hidden layer. The number of hidden nodes, or neurons, is determined by their contribution ratios, which are calculated using a Fourier decomposition of the variance of the FNN's output. Hidden nodes with sufficiently small contribution ratios are eliminated, while new nodes are added when the FNN cannot satisfy certain design objectives. This procedure is similar to the growing and pruning processes observed in biological neural networks. The performance of the proposed method is evaluated on a number of examples: real-life data classification, dynamic system identification, and the modelling of key variables in a wastewater-treatment system. Experimental results show that the proposed method effectively optimises the network structure and performs better than some existing algorithms.
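The grow-and-prune loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the contribution ratio here is a simple per-node share of output variance, standing in for the paper's Fourier decomposition of the output variance, and the data, thresholds, and node counts are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task (assumed example, not from the paper).
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])

def init_nodes(n, d):
    """Random hidden-node input weights and biases."""
    return rng.normal(size=(n, d)), rng.normal(size=n)

def hidden_out(X, W, b):
    """Hidden-layer activations, shape (samples, nodes)."""
    return np.tanh(X @ W.T + b)

def fit_output(H, y):
    """Least-squares output weights for a fixed hidden layer."""
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return beta

W, b = init_nodes(4, X.shape[1])
target_rmse, max_nodes, prune_ratio = 0.05, 30, 0.01

for _ in range(50):
    H = hidden_out(X, W, b)
    beta = fit_output(H, y)
    rmse = np.sqrt(np.mean((H @ beta - y) ** 2))
    # Contribution ratio: each node's share of the output variance
    # (a crude variance proxy for the paper's Fourier-based ratio).
    contrib = np.var(H * beta, axis=0)
    ratio = contrib / contrib.sum()
    keep = ratio > prune_ratio
    if keep.sum() < len(ratio):
        # Prune: remove nodes whose contribution ratio is too small.
        W, b = W[keep], b[keep]
    elif rmse > target_rmse and len(b) < max_nodes:
        # Grow: design objective unmet, so add one random hidden node.
        Wn, bn = init_nodes(1, X.shape[1])
        W, b = np.vstack([W, Wn]), np.append(b, bn)
    else:
        break

print(len(b), rmse)
```

Growing and pruning are interleaved in one loop, so the hidden-layer size can both shrink and expand until the error objective is met or the node budget is exhausted.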