We show that large ensembles of (neural network) models, obtained e.g. by bootstrapping or by sampling from (Bayesian) probability distributions, can be effectively summarized by a relatively small number of representative models. In some cases this summary may even yield better function estimates. We present a method that finds representative models by clustering the models based on their outputs on a data set. We apply the method to an ensemble of neural network models obtained by bootstrapping on the Boston housing data, and use the results to discuss bootstrapping in terms of bias and variance. A second application is the prediction of newspaper sales, where we learn a series of parallel tasks. The results indicate that it is not necessary to store all samples in the ensembles: a small number of representative models generally matches, or even surpasses, the performance of the full ensemble. The resulting clustered representation of the ensemble is much better suited for qualitative analysis, and is shown to yield new insights into the data.
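The summarization idea can be sketched as follows: train a bootstrapped ensemble, represent each member by its vector of outputs on a data set, cluster those vectors, and keep one representative member per cluster. This is a minimal illustration only, not the paper's implementation: it uses synthetic data and linear least-squares models as stand-ins for neural networks, and a plain k-means loop for the clustering; the number of clusters and all names are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (a stand-in for e.g. the Boston housing set).
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# Bootstrap an ensemble of linear models (stand-ins for neural nets).
n_models, n_clusters = 50, 5
Xb = np.column_stack([X, np.ones(len(X))])   # add a bias column
outputs = []
for _ in range(n_models):
    idx = rng.integers(0, len(X), len(X))    # bootstrap resample
    w, *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
    outputs.append(Xb @ w)                   # this model's outputs on the data
outputs = np.array(outputs)                  # shape (n_models, n_points)

# Cluster the models by their output vectors (plain k-means).
centers = outputs[rng.choice(n_models, n_clusters, replace=False)].copy()
for _ in range(20):
    dist = ((outputs[:, None, :] - centers[None]) ** 2).sum(-1)
    labels = dist.argmin(1)
    for k in range(n_clusters):
        if (labels == k).any():
            centers[k] = outputs[labels == k].mean(0)

# Representative model per cluster: the member closest to its centroid.
reps = [np.flatnonzero(labels == k)[
            ((outputs[labels == k] - centers[k]) ** 2).sum(1).argmin()]
        for k in range(n_clusters) if (labels == k).any()]

# Compare mean squared error: full ensemble mean vs. representatives only.
mse_full = ((outputs.mean(0) - y) ** 2).mean()
mse_reps = ((outputs[reps].mean(0) - y) ** 2).mean()
print(len(reps), mse_full, mse_reps)
```

On this toy problem the handful of representatives typically tracks the full 50-member ensemble closely, which mirrors the abstract's claim that a small clustered summary can match the full ensemble while being far easier to inspect.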