Deep Belief Networks (DBNs) are generative models that contain many layers of hidden variables. Efficient greedy algorithms for learning and approximate inference have allowed these models to be applied successfully in many application domains. The main building block of a DBN is a bipartite undirected graphical model called a restricted Boltzmann machine (RBM). Because of the partition function, model selection, complexity control, and exact maximum likelihood learning in RBMs are intractable. We show that Annealed Importance Sampling (AIS) can be used to efficiently estimate the partition function of an RBM, and we present a novel AIS scheme for comparing RBMs with different architectures. We further show how an AIS estimator, combined with approximate inference, can be used to estimate a lower bound on the log-probability that a DBN with multiple hidden layers assigns to test data. This is, to our knowledge, the first step towards obtaining quantitative results that would allow us to directly assess the performance of Deep Belief Networks as generative models of data.
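To make the AIS idea concrete, the sketch below estimates the partition function of a small binary RBM by annealing from the zero-weight base model (whose partition function is known in closed form) to the target RBM, accumulating importance weights along a sequence of inverse temperatures. This is a minimal illustration under assumed conventions (annealing only the parameters by a scalar beta, one Gibbs sweep per temperature), not the exact scheme of the paper; all function names are our own.

```python
import numpy as np

def log_unnorm_prob(v, W, b, c, beta):
    # log p*_beta(v) for a binary RBM with hidden units summed out:
    # beta * b'v + sum_j softplus(beta * (v W + c)_j)
    return beta * (v @ b) + np.sum(np.logaddexp(0.0, beta * (v @ W + c)), axis=-1)

def ais_log_z(W, b, c, num_chains=100, num_betas=1000, rng=None):
    """AIS estimate of log Z for a binary RBM, annealing from beta=0 to beta=1."""
    rng = np.random.default_rng(0) if rng is None else rng
    V, H = W.shape
    betas = np.linspace(0.0, 1.0, num_betas)
    # At beta=0 all parameters vanish, so the base model is uniform:
    # log Z_0 = (V + H) * log 2, and we can sample v exactly.
    log_z0 = (V + H) * np.log(2.0)
    v = (rng.random((num_chains, V)) < 0.5).astype(float)
    log_w = np.zeros(num_chains)
    for k in range(1, num_betas):
        b_prev, b_k = betas[k - 1], betas[k]
        # Importance-weight increment: ratio of unnormalized probabilities.
        log_w += (log_unnorm_prob(v, W, b, c, b_k)
                  - log_unnorm_prob(v, W, b, c, b_prev))
        # One Gibbs sweep (h then v) at the new inverse temperature.
        ph = 1.0 / (1.0 + np.exp(-b_k * (v @ W + c)))
        h = (rng.random(ph.shape) < ph).astype(float)
        pv = 1.0 / (1.0 + np.exp(-b_k * (h @ W.T + b)))
        v = (rng.random(pv.shape) < pv).astype(float)
    # log Z ~= log Z_0 + log-mean-exp of the importance weights.
    m = log_w.max()
    return log_z0 + m + np.log(np.mean(np.exp(log_w - m)))

def exact_log_z(W, b, c):
    """Brute-force log Z by enumerating visible states (tiny RBMs only)."""
    V = W.shape[0]
    vs = np.array([[(i >> j) & 1 for j in range(V)]
                   for i in range(2 ** V)], dtype=float)
    lp = log_unnorm_prob(vs, W, b, c, 1.0)
    m = lp.max()
    return m + np.log(np.sum(np.exp(lp - m)))
```

For an RBM small enough to enumerate, the AIS estimate can be checked against the exact value; with weak weights and a fine temperature ladder the two agree closely, while larger weights require more intermediate distributions to keep the estimator's variance down.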