Machine learning methods that can handle variable-size structured data, such as sequences and graphs, include Bayesian networks (BNs) and Recursive Neural Networks (RNNs). In both classes of models, the data is modeled using a set of observed and hidden variables associated with the nodes of a directed acyclic graph. In BNs, the conditional relationships between parent and child variables are probabilistic, whereas in RNNs they are deterministic and parameterized by neural networks. Here, we study the formal relationship between the two classes of models and show that, when the source-node variables are observed, RNNs can be viewed as limits, both in distribution and in probability, of BNs whose local conditional distributions have vanishing covariance matrices and converge to delta functions. Conditions for uniform convergence are also given, together with an analysis of the behavior and exactness of Belief Propagation (BP) in 'deterministic' BNs. Implications for the design of mixed architectures and the corresponding inference algorithms are briefly discussed.
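The limiting relationship described in the abstract can be illustrated with a small numerical sketch. This is an assumption-laden toy, not the paper's construction: it uses a single Gaussian BN node whose mean is a neural-network transition (here a `tanh` layer, chosen for illustration) and shows that, as the covariance vanishes, samples from the BN node concentrate on the deterministic RNN output.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: a 3x3 weight matrix and an observed source node.
W = rng.normal(size=(3, 3))
x = rng.normal(size=3)

def rnn_child(x):
    # RNN-style deterministic transition: child = tanh(W x).
    return np.tanh(W @ x)

def bn_child(x, sigma):
    # BN-style probabilistic transition: a Gaussian local conditional
    # distribution centered on the network output, covariance sigma^2 * I.
    return rng.normal(rnn_child(x), sigma)

# As sigma -> 0, the Gaussian converges to a delta function at tanh(W x),
# so BN samples converge (in probability) to the deterministic RNN value.
for sigma in (1.0, 0.1, 0.01, 0.001):
    samples = np.array([bn_child(x, sigma) for _ in range(1000)])
    err = np.linalg.norm(samples - rnn_child(x), axis=1).mean()
    print(f"sigma={sigma:<6} mean distance to RNN output: {err:.5f}")
```

The printed mean distances shrink roughly in proportion to sigma, which is the sense in which the deterministic RNN update is the vanishing-covariance limit of the probabilistic BN transition.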