Computational limitations of small-depth circuits
Multilayer feedforward networks are universal approximators
Neural Networks
Practical Issues in Temporal Difference Learning
Machine Learning
Training products of experts by minimizing contrastive divergence
Neural Computation
Circuit Complexity before the Dawn of the New Millennium
Proceedings of the 16th Conference on Foundations of Software Technology and Theoretical Computer Science
A fast learning algorithm for deep belief nets
Neural Computation
Classification using discriminative restricted Boltzmann machines
Proceedings of the 25th international conference on Machine learning
On the quantitative analysis of deep belief networks
Proceedings of the 25th international conference on Machine learning
Learning Deep Architectures for AI
Foundations and Trends® in Machine Learning
Deep belief networks are compact universal approximators
Neural Computation
Unsupervised Layer-Wise Model Selection in Deep Neural Networks
Proceedings of the 2010 conference on ECAI 2010: 19th European Conference on Artificial Intelligence
An implicitization challenge for binary factor analysis
Journal of Symbolic Computation
ICANN'10 Proceedings of the 20th international conference on Artificial neural networks: Part III
Deep adaptive networks for image classification
ICIMCS '10 Proceedings of the Second International Conference on Internet Multimedia Computing and Service
Discriminative deep belief networks for visual data classification
Pattern Recognition
On the expressive power of deep architectures
ALT'11 Proceedings of the 22nd international conference on Algorithmic learning theory
In All Likelihood, Deep Belief Is Not Enough
The Journal of Machine Learning Research
Learning two-layer contractive encodings
ICANN'12 Proceedings of the 22nd international conference on Artificial Neural Networks and Machine Learning - Volume Part I
Training restricted Boltzmann machines: An introduction
Pattern Recognition
Feature learning and deep architectures: new directions for music informatics
Journal of Intelligent Information Systems
The Shape Boltzmann Machine: A Strong Model of Object Shape
International Journal of Computer Vision
Deep belief networks (DBNs) are generative neural network models with many layers of hidden explanatory factors, introduced by Hinton, Osindero, and Teh (2006) together with a greedy layer-wise unsupervised learning algorithm. The building block of a DBN is a probabilistic model called a restricted Boltzmann machine (RBM), which represents one layer of the model. RBMs are interesting because inference in them is easy and because they have been used successfully as building blocks for training deeper models. We first prove that adding hidden units yields strictly improved modeling power, and a second theorem shows that RBMs are universal approximators of discrete distributions. We then study whether DBNs with more layers are strictly more powerful in terms of representational power; this analysis suggests a new and less greedy criterion for training RBMs within DBNs.
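The claim that "inference is easy" in an RBM follows from its bipartite structure: given the visible units, the hidden units are conditionally independent (and vice versa), so each conditional is a single sigmoid. The sketch below illustrates this, together with one step of contrastive divergence (CD-1) as used in greedy layer-wise training. All names and sizes here are illustrative assumptions, not taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal binary RBM sketch with energy E(v, h) = -v·W·h - b·v - c·h.
# Sizes n_v, n_h are arbitrary illustrative choices.
n_v, n_h = 6, 4
W = 0.1 * rng.standard_normal((n_v, n_h))
b = np.zeros(n_v)  # visible biases
c = np.zeros(n_h)  # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_h_given_v(v):
    # "Easy inference": hidden units are conditionally independent
    # given v, so P(h_j = 1 | v) is just a sigmoid of a linear term.
    p = sigmoid(v @ W + c)
    return p, (rng.random(p.shape) < p).astype(float)

def sample_v_given_h(h):
    p = sigmoid(h @ W.T + b)
    return p, (rng.random(p.shape) < p).astype(float)

def cd1_update(v0, lr=0.1):
    """One CD-1 step: a single Gibbs chain step from the data,
    then a gradient-like update on W, b, c (modified in place)."""
    global W, b, c
    p_h0, h0 = sample_h_given_v(v0)   # positive phase
    _, v1 = sample_v_given_h(h0)      # one reconstruction step
    p_h1, _ = sample_h_given_v(v1)    # negative phase
    W += lr * (np.outer(v0, p_h0) - np.outer(v1, p_h1))
    b += lr * (v0 - v1)
    c += lr * (p_h0 - p_h1)

# Toy usage: repeatedly fit a single binary visible vector.
v = rng.integers(0, 2, n_v).astype(float)
for _ in range(100):
    cd1_update(v)
```

In a DBN, each trained RBM's hidden activations become the "data" for the next RBM up, which is exactly the greedy layer-wise scheme the abstract refers to.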