A New Training Method for Large Self Organizing Maps
Neural Processing Letters
An active learning algorithm is devised for training Self-Organizing Feature Maps on large data sets. Active learning algorithms recognize that not all exemplars are created equal; the concepts of exemplar age and difficulty are therefore used to filter the original data set so that each training epoch is conducted over only a small subset of it. The ensuing Hierarchical Dynamic Subset Selection algorithm introduces definitions of exemplar difficulty suitable to an unsupervised learning context, and correspondingly appropriate Self-Organizing Map (SOM) stopping criteria. The algorithm is benchmarked on several real-world data sets with training set exemplar counts in the region of 30,000 to 500,000. Cluster accuracy is demonstrated to be at least as good as that of the original SOM algorithm while requiring a fraction of the computational overhead.
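The abstract leaves the precise definitions of exemplar age and difficulty to the paper itself. The sketch below is a minimal, illustrative take on the general idea of age- and difficulty-driven subset selection for SOM training: it assumes difficulty is the exemplar's current quantization error (distance to its best-matching unit), uses a Gathercole-Ross style selection weight d^P + a^Q, and omits the hierarchical structure and the SOM stopping criteria the paper introduces. The function name, parameters, and update schedule are all hypothetical, not the authors' method.

```python
import numpy as np

def som_dss(X, grid_shape=(10, 10), epochs=20, subset_size=1000,
            P=1.0, Q=1.0, lr0=0.5, sigma0=3.0, seed=0):
    """Sketch: SOM trained with dynamic subset selection.

    Assumptions (not from the paper): difficulty = current
    quantization error; age = epochs since last selection;
    selection weight = difficulty**P + age**Q.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    rows, cols = grid_shape
    m = rows * cols
    # Neuron grid coordinates, used by the Gaussian neighbourhood.
    coords = np.array([(r, c) for r in range(rows)
                       for c in range(cols)], dtype=float)
    W = rng.normal(size=(m, d)) * X.std() + X.mean()

    difficulty = np.ones(n)   # start uniform; updated from BMU distances
    age = np.ones(n)          # epochs since an exemplar was last selected

    for epoch in range(epochs):
        # Harder and longer-unseen exemplars are more likely to be picked.
        w = difficulty ** P + age ** Q
        idx = rng.choice(n, size=min(subset_size, n),
                         replace=False, p=w / w.sum())
        age += 1
        age[idx] = 1

        # Linearly decaying learning rate and neighbourhood width.
        lr = lr0 * (1 - epoch / epochs)
        sigma = max(sigma0 * (1 - epoch / epochs), 0.5)

        for i in rng.permutation(idx):
            x = X[i]
            dist2 = ((W - x) ** 2).sum(axis=1)
            bmu = int(dist2.argmin())
            difficulty[i] = np.sqrt(dist2[bmu])   # quantization error
            # Standard SOM update with a Gaussian neighbourhood.
            g = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1)
                       / (2 * sigma ** 2))
            W += lr * g[:, None] * (x - W)
    return W
```

A call such as `som_dss(X)` on an `(n, d)` data matrix runs the loop end to end. The saving the abstract reports comes from the inner loop touching only `subset_size` exemplars per epoch rather than all `n`, so per-epoch cost is independent of the full training set size.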