Recently, restricted Boltzmann machines (RBMs) have attracted considerable interest in the machine learning community because of their strong ability to extract features: given some training data, a single RBM or a stack of several RBMs can be used to extract informative features. Meanwhile, ensemble learning remains an active research area owing to its potential to substantially improve the prediction accuracy of a single classifier. However, RBMs have not yet been studied in combination with ensemble learning. In this study, we present several methods for integrating RBMs with bagging to generate diverse and accurate individual classifiers. Taking a classification tree as the base learning algorithm, a thorough experimental study on 31 real-world data sets yields some promising conclusions. When the features extracted by RBMs are used in ensemble learning, the best strategy is to perform model combination separately on the original feature set and on the set extracted by a single RBM; prediction performance deteriorates when the features detected by a stack of 2 RBMs are also considered. Moreover, the features detected by RBMs yield good classification only when they are used together with the original features.
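The pipeline described above can be sketched in a few lines. This is a minimal illustration, not the authors' exact procedure: it assumes scikit-learn's `BernoulliRBM` as the feature extractor and a bagged ensemble of decision trees as the classifier, trained on the original features concatenated with the RBM features (the combination the abstract reports as beneficial). All data, component counts, and hyperparameters here are placeholders.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data; BernoulliRBM expects inputs scaled to [0, 1].
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Extract features with a single RBM (stacking a second RBM on H_tr would
# give the 2-RBM variant the study found to perform worse).
rbm = BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=20,
                   random_state=0)
H_tr = rbm.fit_transform(X_tr)
H_te = rbm.transform(X_te)

# Use the RBM features together with the original ones, then bag trees.
Z_tr = np.hstack([X_tr, H_tr])
Z_te = np.hstack([X_te, H_te])
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                        random_state=0)
bag.fit(Z_tr, y_tr)
acc = bag.score(Z_te, y_te)
```

Keeping the original features alongside the RBM features matters here: per the abstract, the RBM features alone do not classify well, so the concatenation (or a separate ensemble per feature set, combined afterwards) is the configuration of interest.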