The question of how many hidden layers and how many hidden nodes to use arises in any classification task on remotely sensed data with neural networks, and to date no exact solution exists. This paper presents a method that sheds some light on the question: a genetic algorithm searches for a near-optimal network topology, guided by a novel fitness function that simultaneously rewards accuracy and compactness. The proposed method is compared thoroughly against many methods currently in use, including several heuristics and pruning algorithms. The results are encouraging, suggesting it is time to shift our focus from suboptimal practices to efficient search methods for tuning the parameters of neural networks.
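The idea of a fitness function that rewards both accuracy and compactness can be sketched as follows. This is a minimal illustration, not the paper's actual method: the fitness form, the penalty weight `alpha`, and the `evaluate` stand-in (which replaces training a real network on remotely sensed data) are all assumptions made for the sake of the example.

```python
import random

def fitness(accuracy, n_hidden, max_hidden, alpha=0.1):
    # Hypothetical combined fitness (not the paper's exact formula):
    # reward classification accuracy, penalize network size.
    return accuracy - alpha * (n_hidden / max_hidden)

def evaluate(topology):
    # Stand-in for training a network with this topology and measuring
    # test accuracy; here a purely illustrative bump peaking at 12 nodes.
    n = sum(topology)
    return max(0.0, 1.0 - abs(n - 12) / 30.0)

def run_ga(pop_size=20, generations=30, max_nodes=32, seed=0):
    rng = random.Random(seed)
    # Each individual encodes one hidden layer's node count.
    pop = [[rng.randint(1, max_nodes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop,
                        key=lambda t: fitness(evaluate(t), sum(t), max_nodes),
                        reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = [(a[0] + b[0]) // 2]           # crossover: average sizes
            if rng.random() < 0.3:                 # mutation: perturb count
                child[0] = min(max_nodes,
                               max(1, child[0] + rng.choice([-2, -1, 1, 2])))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda t: fitness(evaluate(t), sum(t), max_nodes))

if __name__ == "__main__":
    print(run_ga())
```

Because the penalty term grows with node count, two topologies of equal accuracy are ranked by size, which is what drives the search toward compact solutions.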