This article presents a new approach for automatically determining the optimal number of hidden nodes and their connectivity in a three-layer feed-forward neural network (FFNN), grounded in both theory and practice. The system, MINES, combines a neural network (NN), back-propagation (BP), a genetic algorithm (GA), mutual information (MI), and weight clustering. BP reduces the training error while MI guides BP along an effective search path. The GA rewires the incoming synaptic connections of the hidden nodes using MI as the fitness measure. Assigning MI as the fitness of individuals induces competition among the hidden nodes to extract more information from the error space. Weight clustering then merges hidden nodes with similar incoming weights. Experimental results are presented, and future directions are discussed.
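The abstract outlines how the five components interact but gives no pseudocode, so the following is a minimal sketch of one MINES-style generation. It is not the authors' implementation: the function names, the histogram-based MI estimator, the boolean connection mask, the mutation rate, and the greedy weight-merging rule are all illustrative assumptions.

```python
# Hypothetical sketch of one MINES-style generation (not the authors' code).
# Assumptions: sigmoid hidden units, MI estimated by 2-D histogram binning,
# GA rewiring expressed as a 0/1 connection mask on incoming weights, and
# greedy merging of hidden nodes whose incoming weight vectors are similar.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mutual_information(x, y, bins=8):
    """Histogram estimate of MI between two 1-D signals (in nats)."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def mines_generation(X, y, W1, mask, W2, lr=0.1, mut_rate=0.05, merge_tol=0.1):
    # --- BP step: one gradient update to reduce the training error. ---
    H = sigmoid(X @ (W1 * mask))           # hidden activations
    out = H @ W2                           # linear output layer
    err = out - y[:, None]                 # residual: the "error space"
    W2 -= lr * H.T @ err / len(X)
    dH = (err @ W2.T) * H * (1 - H)
    W1 -= lr * (X.T @ dH / len(X)) * mask
    # --- MI fitness: how much each hidden node knows about the error. ---
    fitness = np.array([mutual_information(H[:, j], err[:, 0])
                        for j in range(H.shape[1])])
    # --- GA step: mutate incoming connections of the least-fit nodes. ---
    worst = np.argsort(fitness)[: max(1, len(fitness) // 4)]
    flip = rng.random(mask[:, worst].shape) < mut_rate
    mask[:, worst] = np.where(flip, 1 - mask[:, worst], mask[:, worst])
    # --- Clustering step: merge hidden nodes with similar incoming weights. ---
    keep = []
    for j in range(W1.shape[1]):
        if all(np.linalg.norm(W1[:, j] - W1[:, k]) > merge_tol for k in keep):
            keep.append(j)
    return W1[:, keep], mask[:, keep], W2[keep, :], fitness[keep]
```

In a full run, `mines_generation` would be iterated until the MI fitness values and training error stabilize; the hidden nodes that survive the merging step then define the learned hidden-layer size and connectivity.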