Multilayer feedforward networks are universal approximators. Neural Networks.
Training products of experts by minimizing contrastive divergence. Neural Computation.
A fast learning algorithm for deep belief nets. Neural Computation.
An empirical evaluation of deep architectures on problems with many factors of variation. Proceedings of the 24th International Conference on Machine Learning.
Deep belief networks are compact universal approximators. Neural Computation.
MMG: a learning game platform for understanding and predicting human recall memory. PKAW '10: Proceedings of the 11th International Conference on Knowledge Management and Acquisition for Smart Systems and Services.
Two distributed-state models for generating high-dimensional time series. The Journal of Machine Learning Research.
Calibrating artificial neural networks by global optimization. Expert Systems with Applications: An International Journal.
In all likelihood, deep belief is not enough. The Journal of Machine Learning Research.
In this note, we show that exponentially deep belief networks can approximate any distribution over binary vectors to arbitrary accuracy, even when the width of each layer is limited to the dimensionality of the data. We further show that such networks can be learned greedily, layer by layer, in a way that is conceptually simple yet impractical.
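The greedy, layer-by-layer procedure mentioned above builds a deep belief network by training one restricted Boltzmann machine (RBM) at a time, each on the hidden activations of the layer below. As a rough illustration only (not the construction proved in the note), the following is a minimal sketch of training a single binary RBM with one step of contrastive divergence (CD-1), with the hidden width set equal to the data dimensionality; all names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm_cd1(data, n_hidden, epochs=50, lr=0.1, seed=0):
    """Train a binary RBM with one step of contrastive divergence (CD-1).

    Illustrative sketch: greedy DBN training stacks such RBMs,
    feeding each layer's hidden probabilities to the next layer.
    """
    rng = np.random.default_rng(seed)
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b = np.zeros(n_visible)   # visible biases
    c = np.zeros(n_hidden)    # hidden biases
    for _ in range(epochs):
        v0 = data
        ph0 = sigmoid(v0 @ W + c)                 # P(h = 1 | v0)
        h0 = (rng.random(ph0.shape) < ph0) * 1.0  # sample hidden units
        pv1 = sigmoid(h0 @ W.T + b)               # one-step reconstruction
        ph1 = sigmoid(pv1 @ W + c)
        # CD-1 update: positive phase minus negative phase
        W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(data)
        b += lr * (v0 - pv1).mean(axis=0)
        c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

To grow a deep network greedily, one would train this RBM on the data, then treat `sigmoid(data @ W + c)` as the input to the next RBM, and repeat; keeping `n_hidden == data.shape[1]` mirrors the width restriction discussed in the note.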