In this paper, a novel pruning algorithm, named the novel pruning feed-forward neural network (NP-FNN), is proposed for self-organizing a feed-forward neural network based on sensitivity analysis. The number of hidden neurons is determined by the sensitivity of the network output to the hidden nodes: the relevance of each hidden node is assessed through a Fourier decomposition of the output variance, which yields a contribution ratio for that node. The connection weights of hidden nodes with small contribution ratios are set to zero, so the computational cost of the training process is reduced significantly. The pruning algorithm thus minimizes the complexity of the final feed-forward neural network. Finally, computer simulation results demonstrate the effectiveness of the proposed algorithm.
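The pruning idea described above can be sketched in a few lines. The following is a minimal illustration, not the paper's actual method: it uses a simple per-node share of output variance as the contribution ratio (a stand-in for the Fourier decomposition of the variance, which is not reproduced here), and all network sizes, data, and the pruning threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer network (hypothetical, for illustration only).
X = rng.normal(size=(200, 4))          # input samples
W1 = rng.normal(size=(4, 10))          # input -> hidden weights
b1 = rng.normal(size=10)               # hidden biases
W2 = rng.normal(size=(10, 1))          # hidden -> output weights

H = np.tanh(X @ W1 + b1)               # hidden-node activations

# Simplified contribution ratio: each hidden node's share of the variance
# of its weighted activation w_j * h_j (a stand-in for the paper's
# variance-based sensitivity decomposition).
contrib = np.var(H * W2.T, axis=0)
ratio = contrib / contrib.sum()

# Zero the outgoing weights of hidden nodes whose ratio is small,
# removing them from the network.
threshold = 0.02                       # hypothetical cutoff
W2_pruned = np.where(ratio[:, None] < threshold, 0.0, W2)

print("contribution ratios:", np.round(ratio, 3))
print("nodes pruned:", int((ratio < threshold).sum()))
```

After pruning, the remaining network can be retrained on the same data; nodes whose outgoing weights were zeroed no longer contribute to the output, which is what reduces the training cost.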