Optimizing the structure of a neural network is an essential step in discovering knowledge from data. This paper presents a new approach that identifies insignificant input and hidden neurons in order to determine the optimal structure of a feedforward neural network. The proposed pruning algorithm, called neural network pruning by significance (N2PS), is based on a new significance measure computed from the sigmoidal activation value of a node and the weights of all its outgoing connections. Every node whose significance value falls below a threshold is considered insignificant and eliminated. The advantages of this approach are illustrated on six real datasets: iris, breast-cancer, hepatitis, diabetes, ionosphere, and wave. The results show that the proposed algorithm prunes a significant number of neurons from the neural network models without sacrificing performance.
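The significance-based pruning rule described above can be sketched in code. The abstract does not give the exact formula combining a node's activation with its outgoing weights, so the sketch below assumes significance is the mean absolute sigmoidal activation of a hidden node multiplied by the total absolute outgoing weight; the function names and the threshold handling are illustrative, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    """Sigmoidal activation used to compute node outputs."""
    return 1.0 / (1.0 + np.exp(-x))

def node_significance(activations, w_out):
    """Assumed significance measure: mean absolute sigmoidal activation
    of each hidden node scaled by the sum of absolute outgoing weights."""
    mean_act = np.abs(activations).mean(axis=0)   # shape: (n_hidden,)
    out_weight = np.abs(w_out).sum(axis=1)        # shape: (n_hidden,)
    return mean_act * out_weight

def prune_insignificant(w_in, w_out, X, threshold):
    """Remove hidden nodes whose significance falls below the threshold,
    returning the reduced weight matrices and the keep mask."""
    h = sigmoid(X @ w_in)                         # hidden activations, (n_samples, n_hidden)
    s = node_significance(h, w_out)
    keep = s >= threshold
    return w_in[:, keep], w_out[keep, :], keep
```

In this sketch a hidden node with negligible outgoing weights receives a near-zero significance score regardless of how strongly it activates, so it is eliminated together with both its incoming and outgoing weight columns.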