A deep belief network (DBN) is a probabilistic generative model with multiple layers of hidden nodes and a layer of visible nodes, in which the parameterization between adjacent layers follows that of the harmonium, or restricted Boltzmann machine (RBM). In this paper we present the restricted deep belief network (RDBN) for multi-view learning, where each layer of hidden nodes is composed of view-specific and shared hidden nodes, so that individual and shared hidden spaces are learned from multiple views of data. View-specific hidden nodes connect only to the corresponding view-specific hidden nodes in the layer below (or to the visible nodes of their view), whereas shared hidden nodes follow unrestricted inter-layer connections as in a standard DBN. The RDBN is trained by layer-wise contrastive divergence learning. Numerical experiments on synthetic and real-world datasets demonstrate the useful behavior of the RDBN compared to the multi-wing harmonium (MWH), a two-layer undirected model.
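The restricted connectivity described above can be sketched as a masked RBM layer trained with one step of contrastive divergence (CD-1). This is a minimal illustrative sketch, not the authors' implementation: the view sizes, learning rate, and the idea of enforcing the restriction with a binary weight mask are assumptions for illustration. View-specific hidden units are masked so they connect only to their own view's visible units, while shared hidden units connect to all views.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: two views with 4 and 6 visible units,
# 3 view-specific hidden units per view, plus 2 shared hidden units.
v_sizes = [4, 6]          # visible units per view
h_specific = [3, 3]       # view-specific hidden units per view
h_shared = 2              # shared hidden units (unrestricted, as in a DBN)

n_v = sum(v_sizes)                    # 10 visible units in total
n_h = sum(h_specific) + h_shared      # 8 hidden units in total

# Binary connectivity mask: block-diagonal for view-specific hidden
# units, all-ones columns for the shared hidden units.
mask = np.zeros((n_v, n_h))
v_off = h_off = 0
for vs, hs in zip(v_sizes, h_specific):
    mask[v_off:v_off + vs, h_off:h_off + hs] = 1.0
    v_off += vs
    h_off += hs
mask[:, h_off:] = 1.0                 # shared block: no restriction

W = 0.01 * rng.standard_normal((n_v, n_h)) * mask
b_v = np.zeros(n_v)                   # visible biases
b_h = np.zeros(n_h)                   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, lr=0.1):
    """One CD-1 update on the masked (restricted) RBM layer."""
    global W, b_v, b_h
    ph0 = sigmoid(v0 @ W + b_h)                       # positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hiddens
    pv1 = sigmoid(h0 @ W.T + b_v)                     # reconstruction
    ph1 = sigmoid(pv1 @ W + b_h)                      # negative phase
    # Masking the gradient keeps restricted weights exactly zero.
    W += lr * (np.outer(v0, ph0) - np.outer(pv1, ph1)) * mask
    b_v += lr * (v0 - pv1)
    b_h += lr * (ph0 - ph1)

v0 = rng.integers(0, 2, n_v).astype(float)            # one binary sample
for _ in range(5):
    cd1_step(v0)

# View 2's visible units never gain weights to view 1's specific hiddens.
print(np.abs(W[v_sizes[0]:, :h_specific[0]]).max())   # prints 0.0
```

In a full RDBN these masked layers would be stacked and trained greedily, one layer at a time, exactly as the layer-wise contrastive divergence procedure for standard DBNs.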